Latest revision as of 22:55, 8 May 2024
Backward martingales and the law of large numbers
Let [math](\Omega,\F,\p)[/math] be a probability space. A backward filtration is a family [math](\F_n)_{n\in-\N}[/math] of [math]\sigma[/math]-algebras indexed by the nonpositive integers, which we will denote by [math](\F_n)_{n\leq 0}[/math], such that for all [math]n\leq m\leq 0[/math] we have
We will write
Let [math](\Omega,\F,(\F_n)_{n\leq 0},\p)[/math] be a backward filtered probability space. Let [math](X_n)_{n\leq 0}[/math] be a backward supermartingale. Assume that
Then [math](X_n)_{n\leq 0}[/math] is u.i. and converges a.s. and in [math]L^1[/math] to [math]X_{-\infty}[/math] as [math]n\to-\infty[/math]. Moreover, for all [math]n\leq 0[/math], we have
First we show a.s. convergence. To this end, fix an integer [math]k\geq 1[/math]. For [math]n\in\{0,1,...,k\}[/math], let [math]Y_n^k=X_{n-k}[/math] and [math]\mathcal{G}_n^k=\F_{n-k}[/math]. For [math]n \gt k[/math], we take [math]Y_n^k=X_0[/math] and [math]\mathcal{G}_n^k=\F_0[/math]. Then [math](Y_n^k)_{n\geq 0}[/math] is a supermartingale with respect to [math](\mathcal{G}_n^k)_{n\geq 0}[/math]. We now apply Doob's upcrossing inequality to the submartingale [math](-Y_n^k)_{n\geq 0}[/math] to obtain that for [math]a \lt b[/math]
With monotone convergence we get
Next, we observe that
Equation [math](1)[/math] is always satisfied for backward martingales. Indeed, for all [math]n\leq 0[/math] we get
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math]Z[/math] be a r.v. in [math]L^1(\Omega,\F,\p)[/math] and let [math](\mathcal{G}_n)_{n\geq 0}[/math] be a decreasing family of [math]\sigma[/math]-algebras. Then
For [math]n\geq 0[/math] define [math]X_{-n}:=\E[Z\mid \F_{-n}][/math], where [math]\F_{-n}=\mathcal{G}_n[/math]. Then [math](X_n)_{n\leq 0}[/math] is a backward martingale with respect to [math](\F_n)_{n\leq 0}[/math]. Hence Theorem 14.1 implies that [math](X_n)_{n\leq 0}[/math] converges a.s. and in [math]L^1[/math] as [math]n\to-\infty[/math]. Moreover,
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math](X_n)_{n\geq 1}[/math] be a sequence of independent r.v.'s with values in arbitrary measurable spaces. For [math]n\geq 1[/math], define the [math]\sigma[/math]-algebra
The tail [math]\sigma[/math]-algebra [math]\B_\infty[/math] is defined as
Then [math]\B_\infty[/math] is trivial in the sense that for all [math]B\in \B_\infty[/math] we have [math]\p[B]\in\{0,1\}[/math].
This proof can be found in the Stochastics I lecture notes.
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math]Z\in L^1(\Omega,\F,\p)[/math] and let [math]\mathcal{H}_1[/math] and [math]\mathcal{H}_2[/math] be two [math]\sigma[/math]-algebras contained in [math]\F[/math]. Assume that [math]\mathcal{H}_2[/math] is independent of [math]\sigma(Z)\lor \mathcal{H}_1[/math]. Then
Let [math]A\in \mathcal{H}_1\lor \mathcal{H}_2[/math] be of the form [math]A=B\cap C[/math], where [math]B\in \mathcal{H}_1[/math] and [math]C\in\mathcal{H}_2[/math]. Then
Now we note that
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math](\xi_n)_{n\geq 1}[/math] be a sequence of iid r.v.'s such that for all [math]n\geq 1[/math] we have [math]\E[\vert \xi_n\vert] \lt \infty[/math]. Moreover, let [math]S_0=0[/math] and [math]S_n=\sum_{j=1}^n\xi_j[/math]. Then
At first, we want to show that [math]\E[\xi_1\mid S_n]=\frac{S_n}{n}[/math]. Indeed, we know that there is a measurable map [math]g[/math] such that [math]\E[\xi_1\mid S_n]=g(S_n)[/math], and by symmetry of the iid sequence, [math]\E[\xi_j\mid S_n]=g(S_n)[/math] for all [math]j\in\{1,...,n\}[/math]; summing over [math]j[/math] gives [math]ng(S_n)=\E[S_n\mid S_n]=S_n[/math]. Now define [math]\mathcal{G}_n:=\sigma(S_n,\xi_{n+1},\xi_{n+2},...)[/math]. Then we have [math]\mathcal{G}_{n+1}\subset\mathcal{G}_n[/math] because [math]S_{n+1}=S_n+\xi_{n+1}[/math]. Hence, it follows that [math]\E[\xi_1\mid \mathcal{G}_n]=\E[\xi_1\mid S_n]=\frac{S_n}{n}[/math] converges a.s. and in [math]L^1[/math] to some r.v., but Kolmogorov's 0-1 law implies that this limit is a.s. constant. In particular, since [math]\E\left[\frac{S_n}{n}\right]=\E[\xi_1][/math] for all [math]n[/math], the [math]L^1[/math] convergence forces this constant limit to equal [math]\E[\xi_1][/math].
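The conclusion of this law of large numbers can be illustrated numerically. The following sketch (uniform variables on [-1, 3] are an illustrative choice, not taken from the text) checks that [math]\frac{S_n}{n}[/math] approaches [math]\E[\xi_1]=1[/math]:

```python
import random

random.seed(0)

# iid uniform variables on [-1, 3], so E[xi_1] = 1.  By the theorem,
# S_n / n converges a.s. (and in L^1) to E[xi_1].
n = 100_000
s = 0.0
for _ in range(n):
    s += random.uniform(-1.0, 3.0)
print(s / n)  # should be close to 1
```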
Martingales bounded in [math]L^2[/math] and random series
Let [math](\Omega,\F,(\F_n)_{n\geq 0},\p)[/math] be a filtered probability space. Let [math](M_n)_{n\geq 0}[/math] be a martingale in [math]L^2(\Omega,\F,(\F_n)_{n\geq 0},\p)[/math], i.e. [math]\E[M_n^2] \lt \infty[/math] for all [math]n\geq 0[/math]. We say that [math](M_n)_{n\geq 0}[/math] is bounded in [math]L^2[/math] if [math]\sup_{n\geq 0}\E[M_n^2] \lt \infty[/math]. For [math]n\leq \nu[/math], the martingale property [math]\E[M_\nu\mid \F_n]=M_n[/math] implies that [math]M_\nu-M_n[/math] is orthogonal to [math]L^2(\Omega,\F_n,\p)[/math]. Hence, for all [math]s\leq t\leq n\leq \nu[/math], [math]M_\nu-M_n[/math] is orthogonal to [math]M_t-M_s[/math].
Now write [math]M_n=M_0+\sum_{k=1}^n(M_k-M_{k-1})[/math]. [math]M_n[/math] is then a sum of orthogonal terms and therefore
Let [math](M_n)_{n\geq 0}[/math] be a martingale in [math]L^2(\Omega,\F,(\F_n)_{n\geq 0},\p)[/math]. Then [math](M_n)_{n\geq 0}[/math] is bounded in [math]L^2(\Omega,\F,(\F_n)_{n\geq 0},\p)[/math] if and only if [math]\sum_{k\geq 1}\E[(M_k-M_{k-1})^2] \lt \infty[/math] and in this case
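Since [math]M_n[/math] is a sum of orthogonal increments, [math]\E[M_n^2]=\E[M_0^2]+\sum_{k=1}^n\E[(M_k-M_{k-1})^2][/math]. A minimal numerical sanity check of this identity, for the special case of independent mean-zero Gaussian increments (an assumption made for the sketch):

```python
import random

random.seed(1)

# M_n = sum of independent mean-zero Gaussian increments (M_0 = 0), so
# the increments are orthogonal in L^2 and
#   E[M_n^2] = sum_k E[(M_k - M_{k-1})^2].
# Compare the empirical second moment of M_n with this prediction.
scales = [1.0, 0.5, 2.0, 0.25, 1.5]          # std of each increment
samples = 200_000
second_moment = 0.0
for _ in range(samples):
    m = sum(random.gauss(0.0, sd) for sd in scales)
    second_moment += m * m
second_moment /= samples

predicted = sum(sd * sd for sd in scales)    # = sum E[(M_k - M_{k-1})^2]
print(second_moment, predicted)  # the two values should nearly agree
```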
Suppose that [math](X_n)_{n\geq 1}[/math] is a sequence of independent r.v.'s such that for all [math]k\geq 1[/math], [math]\E[X_k]=0[/math] and [math]\sigma_k^2=Var(X_k) \lt \infty[/math]. Then
- [math]\sum_{k\geq1}\sigma^2_k \lt \infty[/math] implies that [math]\sum_{k\geq 1}X_k[/math] converges a.s.
- If there is a [math]C \gt 0[/math] such that for all [math]\omega\in\Omega[/math] and [math]k\geq1[/math], [math]\vert X_k(\omega)\vert\leq C[/math], then [math]\sum_{k\geq 1}X_k[/math] converges a.s. implies that [math]\sum_{k\geq 1}\sigma_k^2 \lt \infty[/math].
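Part (i) of the theorem can be simulated; the choice [math]X_k=\frac{\xi_k}{k}[/math] with Rademacher [math]\xi_k[/math] below is an illustrative example, not from the text:

```python
import random

random.seed(2)

# One sample path of sum_k xi_k / k with xi_k = +-1 (Rademacher), so
# sigma_k^2 = 1/k^2 and sum_k sigma_k^2 < infty.  Part (i) predicts
# a.s. convergence, so late partial sums should barely move.
partial = 0.0
checkpoint = 0.0
for k in range(1, 100_001):
    partial += random.choice((-1.0, 1.0)) / k
    if k == 10_000:
        checkpoint = partial
print(abs(partial - checkpoint))  # small tail contribution
```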
Consider [math]\F_n=\sigma(X_1,...,X_n)[/math] with [math]\F_0=\{\varnothing,\Omega\}[/math], [math]S_n=\sum_{j=1}^nX_j[/math] with [math]S_0=0[/math] and [math]A_n=\sum_{k=1}^n\sigma_k^2[/math] with [math]A_0=0[/math]. Moreover, set [math]M_n=S_n^2-A_n[/math]. Then [math](S_n)_{n\geq 0}[/math] is a martingale and
which implies that [math](M_n)_{n\geq 0}[/math] is a martingale. Let [math]T:=\inf\{n\in\N\mid \vert S_n\vert \gt \alpha\}[/math] for some constant [math]\alpha \gt 0[/math]. Then [math]T[/math] is a stopping time and [math](M_{n\land T})_{n\geq 1}[/math] is a martingale, hence
Example
Let [math](a_n)_{n\geq 1}[/math] be a sequence of real numbers and let [math](\xi_n)_{n\geq 1}[/math] be iid r.v.'s with [math]\p[\xi_n=\pm 1]=\frac{1}{2}[/math]. Then [math]\sum_{n\geq 1}a_n\xi_n[/math] converges a.s. if and only if [math]\sum_{n\geq 1}a_n^2 \lt \infty[/math]. Indeed, if the series converges a.s., then [math]\vert a_n\xi_n\vert=\vert a_n\vert\xrightarrow{n\to\infty}0[/math], so there exists a [math]C \gt 0[/math] such that [math]\vert a_n\vert \leq C[/math] for all [math]n\geq 1[/math], and the claim follows from the previous theorem. Now for a r.v. [math]X[/math] recall that we write [math]\Phi_X(t)=\E\left[e^{itX}\right][/math]. We also know
Moreover, define [math]R_n(x)=e^{ix}-\sum_{k=0}^n\frac{i^kx^k}{k!}[/math]. Therefore we get
Indeed, [math]\vert R_0(x)\vert=\vert e^{ix}-1\vert=\left\vert\int_0^x ie^{iy}dy\right\vert\leq \min(2,\vert x\vert).[/math] Moreover, we have [math]\vert R_n(x)\vert=\left\vert \int_0^x iR_{n-1}(y)dy\right\vert[/math]. Hence the claim follows by a simple induction on [math]n[/math]. If [math]X[/math] is such that [math]\E[X]=0[/math] and [math]\E[X^2]=\sigma^2 \lt \infty[/math], then, since [math]e^{itX}-\left( 1+itX-\frac{t^2X^2}{2}\right)=R_2(tX)[/math] by definition, we get
and [math]\vert\E[R_2(tX)]\vert\leq \E\left[\min\left(t^2X^2,\tfrac{\vert t\vert^3\vert X\vert^3}{6}\right)\right][/math]. With dominated convergence it follows that [math]\Phi_X(t)=1-\frac{t^2\sigma^2}{2}+o(t^2)[/math] as [math]t\to 0[/math].
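For a Rademacher variable the expansion can be checked directly, since in that case [math]\Phi_X(t)=\cos t[/math] and [math]\sigma^2=1[/math]; the error term divided by [math]t^2[/math] must vanish as [math]t\to 0[/math]:

```python
import math

# For X with P[X = +-1] = 1/2 we have E[X] = 0, sigma^2 = 1 and
# Phi_X(t) = cos(t).  The expansion Phi_X(t) = 1 - t^2/2 + o(t^2)
# means |cos(t) - (1 - t^2/2)| / t^2 -> 0 as t -> 0.
for t in (0.1, 0.01, 0.001):
    err = abs(math.cos(t) - (1.0 - t * t / 2.0)) / (t * t)
    print(t, err)  # err shrinks as t shrinks
```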
Let [math](\Omega,\F,(\F_n)_{n\geq 0},\p)[/math] be a filtered probability space. Let [math](X_n)_{n\geq 0}[/math] be a sequence of independent r.v.'s bounded by [math]k \gt 0[/math]. Then if [math]\sum_{n\geq 0}X_n[/math] converges a.s., [math]\sum_{n\geq 0}\E[X_n][/math] and [math]\sum_{n\geq 0}Var(X_n)[/math] both converge.
If [math]Z[/math] is a r.v. such that [math]\vert Z\vert \lt k[/math], [math]\E[Z]=0[/math] and [math]\sigma^2=Var(Z) \lt \infty[/math], then for [math]\vert t\vert\leq \frac{1}{k}[/math] we get
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math](X_n)_{n\geq 0}[/math] be a sequence of independent r.v.'s. Then [math]\sum_{n\geq 0}X_n[/math] converges a.s. if and only if the following properties hold for some [math]k \gt 0[/math] (and then for every [math]k \gt 0[/math]).
- [math]\sum_{n\geq 0}\p[\vert X_n\vert \gt k] \lt \infty[/math]
- [math]\sum_{n\geq 0}\E\left[X_n^{(k)}\right][/math] converges, where [math]X_n^{(k)}=X_n\one_{\{\vert X_n\vert\leq k\}}[/math]
- [math]\sum_{n\geq 0}Var\left(X_n^{(k)}\right) \lt \infty[/math]
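As a concrete instance of how the three series are checked, consider [math]X_n=\frac{\xi_n}{n}[/math] with Rademacher [math]\xi_n[/math] (a hypothetical example chosen here, not from the text):

```python
# The three series of Kolmogorov's theorem for X_n = xi_n / n with
# xi_n = +-1 and truncation level k = 1:
#   (i)   P[|X_n| > 1] = 0 for every n >= 1, so the first series is 0;
#   (ii)  E[X_n 1_{|X_n| <= 1}] = 0, so the second series converges;
#   (iii) Var(X_n 1_{|X_n| <= 1}) = 1/n^2, and sum_n 1/n^2 < infty.
# Hence sum_n xi_n / n converges a.s.  Below, the third series:
N = 100_000
series_three = sum(1.0 / (n * n) for n in range(1, N + 1))
print(series_three)  # close to pi^2 / 6
```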
Suppose that for some [math]k \gt 0[/math], [math](i)[/math], [math](ii)[/math] and [math](iii)[/math] hold. Then
Suppose [math](b_n)_{n\geq 1}[/math] is a sequence of strictly positive real numbers with [math]b_n\uparrow\infty[/math] as [math]n\to\infty[/math]. Let [math](v_n)_{n\geq 1}[/math] be a sequence of real numbers such that [math]v_n\xrightarrow{n\to\infty}v[/math]. Then
Note that
Now, given [math]\varepsilon \gt 0[/math], it suffices to choose [math]N[/math] such that [math]\vert v_k-v\vert \lt \varepsilon[/math] for all [math]k\geq N[/math].
Let [math](b_n)_{n\geq 1}[/math] be a sequence of strictly positive real numbers with [math]b_n\uparrow \infty[/math] as [math]n\to\infty[/math]. Let [math](x_n)_{n\geq 1}[/math] be a sequence of real numbers. Then if [math]\sum_{n\geq 1}\frac{x_n}{b_n}[/math] converges, we get that
Let [math]v_n=\sum_{k=1}^n\frac{x_k}{b_k}[/math] and [math]v=\lim_{n\to\infty}v_n[/math]. Then [math]v_n-v_{n-1}=\frac{x_n}{b_n}[/math]. Moreover, we note that
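Kronecker's lemma can be checked on a concrete sequence; the choice [math]x_n=(-1)^{n+1}[/math], [math]b_n=n[/math] below is an illustration of mine, not from the text:

```python
# Kronecker's lemma for x_n = (-1)^(n+1) and b_n = n: the series
# sum_n x_n / b_n is the alternating harmonic series (convergent, with
# sum log 2), so the lemma predicts (1/b_n) sum_{k<=n} x_k -> 0.
N = 100_000
series = sum((-1.0) ** (k + 1) / k for k in range(1, N + 1))
cesaro = sum((-1.0) ** (k + 1) for k in range(1, N + 1)) / N
print(series, cesaro)  # series close to log 2; cesaro close to 0
```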
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math](w_n)_{n\geq 1}[/math] be a sequence of independent r.v.'s such that [math]\E[w_n]=0[/math] for all [math]n\geq 1[/math] and [math]\sum_{n\geq 1}\frac{Var(w_n)}{n^2} \lt \infty[/math]. Then
Exercise.
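A simulation consistent with the proposition (the uniform distribution below is an arbitrary choice for illustration):

```python
import random

random.seed(3)

# w_n iid uniform(-1, 1): E[w_n] = 0 and Var(w_n) = 1/3, so
# sum_n Var(w_n) / n^2 < infty.  The proposition (proved via
# Kronecker's lemma applied to the series sum_n w_n / n) then gives
# (w_1 + ... + w_n) / n -> 0 almost surely.
n = 200_000
avg = sum(random.uniform(-1.0, 1.0) for _ in range(n)) / n
print(avg)  # close to 0 for large n
```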
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math](X_n)_{n\geq 1}[/math] be independent and non-negative r.v.'s such that [math]\E[X_n]=1[/math] for all [math]n\geq 1[/math]. Define [math]M_0=1[/math] and for [math]n\in\N[/math], let
- [math]\E[M_\infty]=1[/math].
- [math]M_n\xrightarrow{n\to\infty\atop L^1}M_\infty[/math].
- [math](M_n)_{n\geq 1}[/math] is u.i.
- [math]\prod_{n}a_n \gt 0[/math], where [math]0 \lt a_n=\E[X_n^{1/2}]\leq 1[/math].
- [math]\sum_{n}(1-a_n) \lt \infty[/math].
Moreover, if one (and then every one) of the above statements holds, then
Exercise.
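The role of the quantities [math]a_n=\E[X_n^{1/2}][/math] can be seen in a toy example (the two-point distribution below is an assumption made for this sketch, not taken from the text):

```python
import math

# Toy example: X_n takes the values 1 + c_n and 1 - c_n with
# probability 1/2 each (with 0 < c_n <= 1, so X_n >= 0 and E[X_n] = 1),
# giving a_n = E[X_n^(1/2)] = (sqrt(1 + c_n) + sqrt(1 - c_n)) / 2.
# For c_n = 1/n one has 1 - a_n ~ 1/(8 n^2), so sum (1 - a_n) < infty;
# for the constant choice c_n = 1/2 the sum diverges linearly.
def a(c):
    return (math.sqrt(1.0 + c) + math.sqrt(1.0 - c)) / 2.0

summable = sum(1.0 - a(1.0 / n) for n in range(1, 10_001))
divergent = sum(1.0 - a(0.5) for _ in range(10_000))
print(summable, divergent)  # a bounded sum versus a large one
```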
A martingale central limit theorem
Let [math](\Omega,\F,(\F_n)_{n\geq 0},\p)[/math] be a filtered probability space. Let [math](X_n)_{n\geq 1}[/math] be a sequence of real-valued adapted r.v.'s such that for all [math]n\geq 1[/math]
- [math]\E[X_n\mid \F_{n-1}]=0[/math].
- [math]\E[X_n^2\mid \F_{n-1}]=1[/math].
- [math]\E[\vert X_n\vert^3\mid \F_{n-1}]\leq k \lt \infty[/math].
Let [math]S_n=\sum_{j=1}^nX_j[/math]. Then
Define [math]\Phi_{n,j}(u)=\E\left[e^{iu\frac{X_j}{\sqrt{n}}}\mid \F_{j-1}\right][/math]. A Taylor expansion yields
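The theorem's conclusion can be simulated in the simplest case [math]X_j=\pm 1[/math], which satisfies conditions (i)-(iii) with [math]k=1[/math]:

```python
import random

random.seed(4)

# Simulate S_n / sqrt(n) for iid X_j = +-1.  The conditions hold with
# E[X_n | F_{n-1}] = 0, E[X_n^2 | F_{n-1}] = 1, E[|X_n|^3 | F_{n-1}] = 1,
# so S_n / sqrt(n) should be approximately N(0, 1) for large n.
n, samples = 400, 5_000
values = []
for _ in range(samples):
    s = sum(random.choice((-1.0, 1.0)) for _ in range(n))
    values.append(s / n ** 0.5)

mean = sum(values) / samples
var = sum(v * v for v in values) / samples
print(mean, var)  # should be close to 0 and 1
```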
General references
Moshayedi, Nima (2020). "Lectures on Probability Theory". arXiv:2010.16280 [math.PR].