
The Characteristic Function

[math] \newcommand{\R}{\mathbb{R}} \newcommand{\A}{\mathcal{A}} \newcommand{\B}{\mathcal{B}} \newcommand{\N}{\mathbb{N}} \newcommand{\C}{\mathbb{C}} \newcommand{\Rbar}{\overline{\mathbb{R}}} \newcommand{\Bbar}{\overline{\mathcal{B}}} \newcommand{\Q}{\mathbb{Q}} \newcommand{\E}{\mathbb{E}} \newcommand{\p}{\mathbb{P}} \newcommand{\one}{\mathds{1}} \newcommand{\0}{\mathcal{O}} \newcommand{\mat}{\textnormal{Mat}} \newcommand{\sign}{\textnormal{sign}} \newcommand{\CP}{\mathcal{P}} \newcommand{\CT}{\mathcal{T}} \newcommand{\CY}{\mathcal{Y}} \newcommand{\F}{\mathcal{F}} \newcommand{\mathds}{\mathbb}[/math]
Definition (Characteristic function)

Let [math](\Omega,\A,\p)[/math] be a probability space and let [math]X[/math] be a r.v. with values in [math]\R^d[/math], i.e. [math]X:(\Omega,\A,\p)\to\R^d[/math]. The characteristic function of [math]X[/math] is then the Fourier transform of its law [math]\p_X[/math],

[[math]] \Phi_X:\R^d\to\C,\xi\mapsto \Phi_X(\xi)=\E\left[e^{i\langle \xi,X\rangle}\right]=\int_{\R^d}e^{i\langle \xi, x\rangle}d\p_X(x) [[/math]]
where [math]\langle\xi,X\rangle=\sum_{k=1}^d\xi_kX_k[/math].

For [math]d=1[/math] and [math]\xi\in\R[/math], we get

[[math]] \Phi_X(\xi)=\E\left[e^{i\xi X}\right]=\int_\R e^{i\xi x}d\p_X(x). [[/math]]

[math]\Phi_X(\xi)[/math] is bounded and continuous on [math]\R^d[/math]. For boundedness, note that

[[math]] \vert\Phi_X(\xi)\vert=\left\vert\E\left[e^{i\langle\xi,X\rangle}\right]\right\vert\leq \E\left[\underbrace{\left\vert e^{i\langle\xi,X\rangle}\right\vert}_{=1}\right]\leq 1. [[/math]]
Moreover, [math]e^{i\langle\xi,x\rangle}[/math] is a continuous function of [math]\xi\in\R^d[/math] for every [math]x\in\R^d[/math], and the integrand is dominated by an integrable constant:

[[math]] \left\vert e^{i\langle\xi,X\rangle}\right\vert\leq 1,\qquad\E[1]=1 \lt \infty. [[/math]]
By dominated convergence, it follows that [math]\Phi_X(\xi)[/math] is a continuous function of [math]\xi[/math].
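As a quick illustration (a numerical sketch, not part of the notes), one can estimate [math]\Phi_X(\xi)[/math] by Monte Carlo for [math]X[/math] uniform on [math][-1,1][/math], whose characteristic function is the classical [math]\frac{\sin\xi}{\xi}[/math], and check the bound [math]\vert\Phi_X(\xi)\vert\leq 1[/math] along the way. The sample size and the grid of [math]\xi[/math] values below are arbitrary choices.

<syntaxhighlight lang="python">
# Monte Carlo sketch: estimate Phi_X(xi) = E[exp(i*xi*X)] for X ~ U[-1, 1]
# and compare with the closed form sin(xi)/xi.  Sample size is arbitrary.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200_000)      # samples of X ~ U[-1, 1]

for xi in [0.0, 0.5, 1.0, 3.0]:
    phi_hat = np.mean(np.exp(1j * xi * x))    # empirical E[e^{i xi X}]
    phi_exact = np.sinc(xi / np.pi)           # sin(xi)/xi, equal to 1 at xi = 0
    assert abs(phi_hat) <= 1 + 1e-12          # boundedness: |Phi_X(xi)| <= 1
    print(f"xi={xi:3.1f}  MC={phi_hat.real:+.4f}{phi_hat.imag:+.4f}j  exact={phi_exact:.4f}")
</syntaxhighlight>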

Theorem

The characteristic function uniquely determines the probability distribution: if two r.v.'s [math]X[/math] and [math]Y[/math] satisfy

[[math]] \Phi_X(\xi)=\Phi_Y(\xi), [[/math]]
for all [math]\xi\in\R^d[/math], then

[[math]] \p_X=\p_Y. [[/math]]


Proof

The proof is omitted here.
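To see the contrapositive at work, here is a small sketch (not from the notes): the standard normal law and a Laplace law scaled to unit variance share their first two moments, yet their characteristic functions, [math]e^{-\xi^2/2}[/math] (proved in the lemma below) and the standard [math]\frac{1}{1+b^2\xi^2}[/math] for a Laplace law with scale [math]b[/math], already differ at moderate [math]\xi[/math], so by the theorem the two laws cannot coincide.

<syntaxhighlight lang="python">
# Contrapositive of the uniqueness theorem: different characteristic
# functions force different laws.  N(0,1) and Laplace(0, b) with 2*b**2 = 1
# have the same mean and variance but distinct characteristic functions.
import numpy as np

b = 1 / np.sqrt(2)                        # Laplace scale giving unit variance
for xi in [0.5, 1.0, 2.0]:
    phi_normal = np.exp(-xi**2 / 2)       # char. function of N(0, 1)
    phi_laplace = 1 / (1 + (b * xi)**2)   # char. function of Laplace(0, b)
    print(f"xi={xi}  normal={phi_normal:.4f}  laplace={phi_laplace:.4f}")
</syntaxhighlight>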

Lemma

Let [math]X[/math] be a r.v. which is [math]\mathcal{N}(0,\sigma^2)[/math] distributed. Then

[[math]] \Phi_X(\xi)=\exp\left(-\frac{\sigma^2\xi^2}{2}\right),\xi\in\R. [[/math]]


Proof

According to the formula we get

[[math]] \Phi_X(\xi)=\int_{-\infty}^{\infty}e^{i\xi x}e^{-\frac{x^2}{2\sigma^2}}\frac{dx}{\sigma\sqrt{2\pi}}. [[/math]]
Assume for simplicity that [math]\sigma=1[/math]; the general case follows by the change of variables [math]y=\frac{x}{\sigma}[/math], which gives [math]\Phi_X(\xi)=\Phi_Y(\sigma\xi)[/math] for [math]Y\sim\mathcal{N}(0,1)[/math]. Writing [math]e^{i\xi x}=\cos(\xi x)+i\sin(\xi x)[/math], we have

[[math]] \Phi_X(\xi)=\int_{-\infty}^\infty\frac{1}{\sqrt{2\pi}} e^{-\frac{x^2}{2}}\cos(\xi x)dx+\underbrace{i\int_{-\infty}^\infty\frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}\sin(\xi x)dx}_{0,\text{by parity}}=\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}\cos(\xi x)dx. [[/math]]
Hence we have

[[math]] \frac{d\Phi_X}{d\xi}(\xi)=-\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}xe^{-\frac{x^2}{2}}\sin(\xi x)dx. [[/math]]
We have used the fact that [math]e^{-\frac{x^2}{2}}\cos(\xi x)[/math] is [math]C^\infty[/math] in both variables and that [math]\left\vert x\sin(\xi x)e^{-\frac{x^2}{2}}\right\vert\leq \vert x\vert e^{-\frac{x^2}{2}}[/math], which is integrable on [math]\R[/math]. Using integration by parts we get

[[math]] \frac{d\Phi_X}{d\xi}(\xi)=\underbrace{\left[\frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}\sin(\xi x)\right]_{-\infty}^\infty}_{=0}-\xi\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}} e^{-\frac{x^2}{2}}\cos(\xi x)dx. [[/math]]
So [math]\Phi_X[/math] solves the Cauchy problem

[[math]] \begin{cases}\frac{d\Phi_X}{d\xi}(\xi)=-\xi \Phi_X(\xi)\\ \Phi_X(0)=1\end{cases} [[/math]]
Solving this separable differential equation, for instance via [math]\frac{d}{d\xi}\log\Phi_X(\xi)=-\xi[/math], we get

[[math]] \Phi_X(\xi)=e^{-\frac{\xi^2}{2}}, [[/math]]
and for general [math]\sigma[/math], the change of variables above yields [math]\Phi_X(\xi)=e^{-\frac{\sigma^2\xi^2}{2}}[/math].
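As a sanity check on the lemma (again a numerical sketch, not part of the notes), one can compare a Monte Carlo estimate of [math]\E[e^{i\xi X}][/math] for [math]X\sim\mathcal{N}(0,\sigma^2)[/math] with the closed form [math]e^{-\frac{\sigma^2\xi^2}{2}}[/math]; the sample size and the [math](\sigma,\xi)[/math] grid below are arbitrary.

<syntaxhighlight lang="python">
# Numerical check of the lemma: Monte Carlo estimate of E[exp(i*xi*X)] for
# X = sigma * Z with Z ~ N(0, 1), against the closed form exp(-sigma^2*xi^2/2).
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(500_000)          # samples of Z ~ N(0, 1)

for sigma in [0.5, 1.0, 2.0]:
    for xi in [0.5, 1.0]:
        phi_hat = np.mean(np.exp(1j * xi * sigma * z))
        phi_exact = np.exp(-(sigma * xi) ** 2 / 2)
        print(f"sigma={sigma}  xi={xi}  MC={phi_hat.real:.4f}  exact={phi_exact:.4f}")
</syntaxhighlight>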

Proposition

Let [math]X=(X_1,...,X_d)[/math] be an [math]\R^d[/math]-valued r.v. such that [math]\E[\vert X\vert^2] \lt \infty[/math], where [math]\vert\cdot\vert[/math] denotes the Euclidean norm. Then, as [math]\vert\xi\vert\to0[/math],

[[math]] \Phi_X(\xi)=1+i\sum_{j=1}^d\E[X_j]\xi_j-\frac{1}{2}\sum_{j=1}^d\sum_{k=1}^d\xi_j\xi_k\E[X_jX_k]+o(\vert\xi\vert^2). [[/math]]


Proof

Note that we can write

[[math]] \frac{\partial\Phi_X(\xi)}{\partial\xi_j}=i\E\left[X_je^{i\langle\xi,X\rangle}\right]. [[/math]]
This follows by differentiation under the integral sign, using the domination [math]\left\vert X_je^{i\langle \xi,X\rangle}\right\vert\leq \vert X_j\vert[/math], which is integrable. Since, by the Cauchy-Schwarz inequality,

[[math]] \E[\vert X_jX_k\vert]\leq \E[\vert X_j\vert^2]^\frac{1}{2}\E[\vert X_k\vert^2]^{\frac{1}{2}} \lt \infty, [[/math]]
we may differentiate once more under the integral sign and obtain

[[math]] \frac{\partial^2\Phi_X(\xi)}{\partial\xi_j\partial \xi_k}=-\E\left[\underbrace{X_jX_ke^{i\langle\xi,X\rangle}}_{\vert\cdot\vert\leq \vert X_jX_k\vert\in L^1}\right]. [[/math]]
Taking [math]\xi=0[/math] we get [math]\frac{\partial\Phi_X}{\partial\xi_j}(0)=i\E[X_j][/math] and [math]\frac{\partial^2\Phi_X}{\partial\xi_j\partial\xi_k}(0)=-\E[X_jX_k][/math]. The equation in the proposition is thus the Taylor expansion of order 2 around 0 of the [math]C^2[/math]-function [math]\Phi_X(\xi)[/math].

From the proof we see that when [math]d=1[/math], we have

[[math]] \E[\vert X\vert] \lt \infty\Longrightarrow\E[X]=-i\Phi_X'(0)\quad\text{and}\quad\E[X^2] \lt \infty\Longrightarrow\E[X^2]=-\Phi_X''(0). [[/math]]
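These identities are easy to check numerically; here is a sketch (not from the notes) using the exponential law [math]\textnormal{Exp}(1)[/math], whose characteristic function [math]\frac{1}{1-i\xi}[/math] is standard, with the derivatives at [math]0[/math] approximated by central finite differences (the step size is an arbitrary choice).

<syntaxhighlight lang="python">
# Check E[X] = -i*Phi'(0) and E[X^2] = -Phi''(0) for X ~ Exp(1),
# whose characteristic function is Phi(xi) = 1 / (1 - i*xi).
import numpy as np

phi = lambda xi: 1 / (1 - 1j * xi)   # characteristic function of Exp(1)
h = 1e-4                             # finite-difference step (arbitrary)

d1 = (phi(h) - phi(-h)) / (2 * h)            # central difference ~ Phi'(0)
d2 = (phi(h) - 2 * phi(0) + phi(-h)) / h**2  # second difference ~ Phi''(0)

print("E[X]   ~", (-1j * d1).real)   # exact: E[X]   = 1
print("E[X^2] ~", (-d2).real)        # exact: E[X^2] = 2
</syntaxhighlight>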

General references

Moshayedi, Nima (2020). "Lectures on Probability Theory". arXiv:2010.16280 [math.PR].