<div class="d-none"><math>
\newcommand{\R}{\mathbb{R}}
\newcommand{\A}{\mathcal{A}}
\newcommand{\B}{\mathcal{B}}
\newcommand{\N}{\mathbb{N}}
\newcommand{\C}{\mathbb{C}}
\newcommand{\Rbar}{\overline{\mathbb{R}}}
\newcommand{\Bbar}{\overline{\mathcal{B}}}
\newcommand{\Q}{\mathbb{Q}}
\newcommand{\E}{\mathbb{E}}
\newcommand{\p}{\mathbb{P}}
\newcommand{\one}{\mathds{1}}
\newcommand{\0}{\mathcal{O}}
\newcommand{\mat}{\textnormal{Mat}}
\newcommand{\sign}{\textnormal{sign}}
\newcommand{\CP}{\mathcal{P}}
\newcommand{\CT}{\mathcal{T}}
\newcommand{\CY}{\mathcal{Y}}
\newcommand{\F}{\mathcal{F}}
\newcommand{\mathds}{\mathbb}</math></div>
{{proofcard|Proposition|prop-1|Let <math>(\Omega,\A,\p)</math> be a probability space. Let <math>(X_n)_{n\geq 1}</math> be a sequence of r.v.'s and assume that for all <math>\epsilon > 0</math> we have


<math display="block">
\sum_{n\geq 1}\p[\vert X_n-X\vert  > \epsilon] < \infty.
</math>
Then
<math display="block">
\lim_{n\to\infty\atop a.s.}X_n=X.
</math>
|Take <math>\epsilon_k=\frac{1}{k}</math> for <math>k\in\N</math> with <math>k\geq 1</math>. For each fixed <math>k</math>, the Borel-Cantelli lemma applied to the events <math>\left\{\vert X_n-X\vert > \frac{1}{k}\right\}</math> gives
<math display="block">
\p\left[\limsup_n\left\{\vert X_n-X\vert > \frac{1}{k}\right\}\right]=0,
</math>
which implies that <math>\p\left[\bigcup_{k\geq 1}\limsup_n\left\{\vert X_n-X\vert > \frac{1}{k}\right\}\right]=0</math> and hence
<math display="block">
\p\left[\underbrace{\bigcap_{k\geq 1}\liminf_n\left\{\vert X_n-X\vert\leq \frac{1}{k}\right\}}_{\Omega'}\right]=1.
</math>
For <math>\omega\in\Omega'</math> we get that for all <math>k\geq 1</math> there is <math>n_0(\omega,k)\in\N\setminus\{0\}</math> such that <math>\vert X_n(\omega)-X(\omega)\vert\leq \frac{1}{k}</math> for all <math>n\geq n_0(\omega,k)</math>, i.e. <math>\lim_{n\to\infty}X_n(\omega)=X(\omega)</math>. Since <math>\p[\Omega']=1</math>, the claim follows.}}
'''Example'''
Let <math>(\Omega,\A,\p)</math> be a probability space. Let <math>(X_n)_{n\geq 1}</math> be a sequence of r.v.'s such that <math>\p[X_n=0]=1-\frac{1}{1+n^2}</math> and <math>\p[X_n=1]=\frac{1}{1+n^2}</math>. Then for all <math>0 < \epsilon < 1</math> we get <math>\p[\vert X_n\vert > \epsilon]=\p[X_n=1]=\frac{1}{1+n^2}</math> (and <math>\p[\vert X_n\vert > \epsilon]=0</math> for <math>\epsilon\geq 1</math>), so it follows
<math display="block">
\sum_{n\geq 1}\p[\vert X_n\vert > \epsilon] < \infty,
</math>
which implies that <math>\lim_{n\to\infty\atop a.s.}X_n=0.</math>
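The summability condition in this example can be checked numerically. The following Python sketch (an illustration added here, not part of the notes; the truncation point <math>10^5</math> is an arbitrary choice) compares a partial sum of <math>\sum_{n\geq 1}\frac{1}{1+n^2}</math> with the known closed form <math>\frac{1}{2}(\pi\coth\pi-1)</math>:

```python
import math

# Tail probabilities P[|X_n| > eps] = 1/(1 + n^2) for 0 < eps < 1.
# Partial sum of the series; 100000 terms is an arbitrary truncation.
partial = sum(1.0 / (1 + n * n) for n in range(1, 100001))

# Known closed form: sum_{n>=1} 1/(1+n^2) = (pi*coth(pi) - 1)/2.
exact = (math.pi / math.tanh(math.pi) - 1.0) / 2.0

print(partial, exact)  # both are approximately 1.0767
```

Since the series converges, the proposition above applies and <math>X_n\to 0</math> a.s.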
{{proofcard|Proposition|prop-2|Let <math>(\Omega,\A,\p)</math> be a probability space. Let <math>(X_n)_{n\geq 1}</math> be a sequence of r.v.'s. Then
<math display="block">
\lim_{n\to\infty\atop a.s.}X_n=X\Longleftrightarrow \lim_{n\to\infty\atop \p}\sup_{m > n}\vert X_m-X\vert=0.
</math>
|Exercise.}}
'''Example'''
Let <math>(Y_n)_{n\geq 1}</math> be iid r.v.'s such that <math>\p[Y_n\leq x]=1-\frac{1}{1+x}</math> for <math>x\geq 0</math> and <math>n\geq 1</math>. Take <math>X_n=\frac{Y_n}{n}</math> and let <math>\epsilon > 0</math>. Then
<math display="block">
\p[\vert X_n\vert > \epsilon]=\p[\vert Y_n\vert > n\epsilon]=\frac{1}{1+n\epsilon}\xrightarrow{n\to\infty}0,
</math>
and thus <math>\lim_{n\to\infty\atop \p}X_n=0</math>. Moreover, since the <math>Y_m</math> are independent, we have
<math display="block">
\p\left[\sup_{m\geq n}\vert X_m\vert > \epsilon\right]=1-\p\left[\sup_{m\geq n}\vert X_m\vert\leq \epsilon\right]=1-\prod_{m=n}^\infty\left(1-\frac{1}{1+m\epsilon}\right),
</math>
but <math>\prod_{m=n}^\infty\left(1-\frac{1}{1+m\epsilon}\right)=0</math>, since <math>\sum_{m\geq n}\frac{1}{1+m\epsilon}=\infty</math>. Hence <math>\p[\sup_{m\geq n}\vert X_m\vert > \epsilon]\not\rightarrow 0</math> as <math>n\to\infty</math>, and therefore <math>(X_n)_{n\geq 1}</math> does not converge a.s. to <math>0</math>.
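A Monte Carlo sketch of this example (added for illustration; the inverse-CDF sampler and the parameter choices <math>n=1000</math>, <math>\epsilon=1</math> are assumptions made here, not part of the notes) confirms the tail probability <math>\frac{1}{1+n\epsilon}</math>:

```python
import random

random.seed(0)

def sample_Y():
    # Inverse-CDF sampling for F(x) = 1 - 1/(1+x), x >= 0:
    # solving u = F(y) gives y = 1/(1-u) - 1 with u ~ Unif[0,1).
    u = random.random()
    return 1.0 / (1.0 - u) - 1.0

# Estimate P[X_n > eps] for X_n = Y_n / n at n = 1000, eps = 1.
eps, n, trials = 1.0, 1000, 20000
hits = sum(sample_Y() / n > eps for _ in range(trials))
estimate = hits / trials
theory = 1.0 / (1.0 + n * eps)   # = 1/1001 from the display above
print(estimate, theory)
```

The per-<math>n</math> tail probabilities vanish, which is exactly convergence in probability; the failure of a.s. convergence is invisible at any single <math>n</math>, consistent with the computation of <math>\p[\sup_{m\geq n}\vert X_m\vert > \epsilon]</math> above.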
{{proofcard|Lemma|lem-1|Let <math>(\Omega,\A,\p)</math> be a probability space. Let <math>(X_n)_{n\geq 1}</math> be a sequence of r.v.'s. Then <math>\lim_{n\to\infty\atop \p}X_n=X</math> if and only if for every subsequence of <math>(X_n)_{n\geq 1}</math>, there exists a further subsequence which converges a.s. to <math>X</math>.
|If <math>\lim_{n\to\infty\atop\p}X_n=X</math>, then every subsequence also converges to <math>X</math> in probability, and we already know that from a sequence converging in probability we can extract a further subsequence which converges a.s. Conversely, if <math>(X_n)_{n\geq 1}</math> does not converge to <math>X</math> in probability, then there are an <math>\epsilon > 0</math>, a subsequence <math>(n_k)_{k\geq 1}</math> and a <math>\nu > 0</math> such that for all <math>k\geq 1</math> we get
<math display="block">
\p[\vert X_{n_k}-X\vert > \epsilon] > \nu
</math>
and therefore no subsequence of <math>(X_{n_k})_{k\geq 1}</math> can converge a.s. to <math>X</math>, since a.s. convergence would imply convergence in probability along that subsequence.}}
{{proofcard|Proposition|prop-3|Let <math>(\Omega,\A,\p)</math> be a probability space. Let <math>(X_n)_{n\geq 1}</math> be a sequence of r.v.'s and <math>g:\R\to\R</math> a continuous map. Moreover, assume that <math>\lim_{n\to\infty\atop\p}X_n=X</math>. Then
<math display="block">
\lim_{n\to\infty\atop \p}g(X_n)=g(X).
</math>
|Consider any subsequence <math>(g(X_{n_k}))_{k\geq 1}</math> of <math>(g(X_n))_{n\geq 1}</math>. Since <math>(X_{n_k})_{k\geq 1}</math> converges to <math>X</math> in probability, there exists a further subsequence <math>(X_{m_k})_{k\geq 1}</math> of <math>(X_{n_k})_{k\geq 1}</math> such that
<math display="block">
\lim_{k\to\infty\atop a.s.} X_{m_k}=X\quad\text{and}\quad \lim_{k\to\infty\atop a.s.}g(X_{m_k})=g(X)
</math>
because <math>g</math> is continuous. Now with the previous lemma we get that
<math display="block">
\lim_{n\to\infty\atop \p}g(X_n)=g(X).
</math>}}
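A numerical illustration of this continuous-mapping property (a sketch with assumed ingredients: reusing the heavy-tailed <math>Y</math> from the earlier example and picking the continuous map <math>g(x)=e^{-x}</math> are choices made here, not part of the notes):

```python
import math
import random

random.seed(2)

def tail_prob(n, eps=0.1, trials=20000):
    # Estimate P[|g(X_n) - g(0)| > eps] for X_n = Y/n, with Y sampled by
    # inverse CDF from F(x) = 1 - 1/(1+x) and the continuous g(x) = exp(-x).
    hits = 0
    for _ in range(trials):
        y = 1.0 / (1.0 - random.random()) - 1.0
        if abs(math.exp(-y / n) - 1.0) > eps:
            hits += 1
    return hits / trials

t_small, t_large = tail_prob(10), tail_prob(1000)
print(t_small, t_large)  # the tail probability shrinks as n grows
```

Even though the <math>Y</math>'s are heavy-tailed, <math>g(X_n)\to g(0)=1</math> in probability, as the proposition asserts.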
{{proofcard|Proposition|prop-4|Let <math>(\Omega,\A,\p)</math> be a probability space. Let <math>(X_n)_{n\geq 1}</math> and <math>(Y_n)_{n\geq 1}</math> be sequences of r.v.'s such that <math>\lim_{n\to\infty\atop \p}X_n=X</math> and <math>\lim_{n\to\infty\atop\p} Y_n=Y</math>. Then
<ul style{{=}}"list-style-type:lower-roman"><li><math>\lim_{n\to\infty\atop\p} X_n+Y_n=X+Y</math>
</li>
<li><math>\lim_{n\to\infty\atop \p}X_n\cdot Y_n=X\cdot Y</math>
</li>
</ul>
|We need to show both points.
<ul style{{=}}"list-style-type:lower-roman"><li>Let <math>\epsilon > 0</math>. Then <math>\vert X_n-X\vert\leq \frac{\epsilon}{2}</math> and <math>\vert Y_n-Y\vert\leq \frac{\epsilon}{2}</math> together imply <math>\vert (X_n+Y_n)-(X+Y)\vert\leq \epsilon</math>, and thus, by the union bound, we get
<math display="block">
\p[\vert X_n+Y_n-(X+Y)\vert > \epsilon]\leq \p\left[\vert X_n-X\vert > \frac{\epsilon}{2}\right]+\p\left[\vert Y_n-Y\vert > \frac{\epsilon}{2}\right],
</math>
where the right-hand side tends to <math>0</math> as <math>n\to\infty</math>.
</li>
<li>We apply the previous proposition to the continuous map <math>g(x)=x^2</math>. Writing
<math display="block">
2X_nY_n=(X_n+Y_n)^2-X_n^2-Y_n^2,
</math>
each term on the right converges in probability to <math>(X+Y)^2</math>, <math>X^2</math> and <math>Y^2</math> respectively, so by (i) we get <math>\lim_{n\to\infty\atop\p}X_nY_n=\frac{1}{2}\left((X+Y)^2-X^2-Y^2\right)=XY</math>.
</li>
</ul>}}
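Both parts of the proposition can be illustrated numerically. In the sketch below, the sequences <math>X_n=X+U_n/n</math> and <math>Y_n=Y+V_n/n</math> with bounded uniform noise are a hypothetical choice made here (they converge in probability to <math>X</math> and <math>Y</math>); the code estimates the probability that either conclusion fails by more than <math>\epsilon</math>:

```python
import random

random.seed(1)

def estimate_tail(n, eps=0.05, trials=5000):
    # Fraction of trials where the sum or the product deviates by more
    # than eps from its limit; X_n = X + U/n, Y_n = Y + V/n, U,V ~ Unif(-1,1).
    hits = 0
    for _ in range(trials):
        X, Y = 2.0, -3.0
        Xn = X + random.uniform(-1.0, 1.0) / n
        Yn = Y + random.uniform(-1.0, 1.0) / n
        if abs((Xn + Yn) - (X + Y)) > eps or abs(Xn * Yn - X * Y) > eps:
            hits += 1
    return hits / trials

p10, p1000 = estimate_tail(10), estimate_tail(1000)
print(p10, p1000)  # deviation probability drops to 0 as n grows
```

For large <math>n</math> the noise is uniformly small, so both <math>X_n+Y_n</math> and <math>X_nY_n</math> land within <math>\epsilon</math> of their limits in every trial.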
==General references==
{{cite arXiv|last=Moshayedi|first=Nima|year=2020|title=Lectures on Probability Theory|eprint=2010.16280|class=math.PR}}

Latest revision as of 00:53, 8 May 2024
