<div class="d-none"><math>
\newcommand{\R}{\mathbb{R}}
\newcommand{\A}{\mathcal{A}}
\newcommand{\B}{\mathcal{B}}
\newcommand{\N}{\mathbb{N}}
\newcommand{\C}{\mathbb{C}}
\newcommand{\Rbar}{\overline{\mathbb{R}}}
\newcommand{\Bbar}{\overline{\mathcal{B}}}
\newcommand{\Q}{\mathbb{Q}}
\newcommand{\E}{\mathbb{E}}
\newcommand{\p}{\mathbb{P}}
\newcommand{\one}{\mathds{1}}
\newcommand{\0}{\mathcal{O}}
\newcommand{\mat}{\textnormal{Mat}}
\newcommand{\sign}{\textnormal{sign}}
\newcommand{\CP}{\mathcal{P}}
\newcommand{\CT}{\mathcal{T}}
\newcommand{\CY}{\mathcal{Y}}
\newcommand{\F}{\mathcal{F}}
\newcommand{\mathds}{\mathbb}</math></div>
Let <math>(\Omega,\F,(\F_n)_{n\geq0},\p)</math> be a filtered probability space. We start with a useful remark. If <math>(X_n)_{n\geq 0}</math> is a submartingale, we get in particular that <math>X_n\in L^1(\Omega,\F,(\F_n)_{n\geq0},\p)</math> for all <math>n\geq 0</math>. Moreover, we know that we can write


<math display="block">
\E[X_n]=\E[X_n^+]-\E[X_n^-]
</math>
and hence
<math display="block">
\E[X_n^-]=\E[X_n^+]-\E[X_n].
</math>
The submartingale property implies that <math>\E[X_0]\leq  \E[X_n]</math> and thus
<math display="block">
\E[X_n^-]\leq  \E[X_n^+]-\E[X_0].
</math>
Therefore, if <math>\sup_{n\geq 0}\E[X_n^+] < \infty</math>, then <math>\E[X_n^-]\leq  \sup_{n\geq 0}\E[X_n^+]-\E[X_0] < \infty</math>. Since <math>\vert X_n\vert =X_n^++X_n^-</math>, it follows that <math>\sup_{n\geq 0}\E[X_n^+] < \infty</math> if and only if <math>\sup_{n\geq 0}\E[\vert X_n\vert] < \infty</math>.
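
As a quick numerical illustration of this remark, here is a minimal Monte Carlo sketch. The choice of submartingale (a random walk with positive drift started at <math>X_0=0</math>) and all parameters are purely illustrative and not taken from the text; it checks <math>\E[\vert X_m\vert]=\E[X_m^+]+\E[X_m^-]</math> and the bound <math>\E[X_m^-]\leq \sup_k\E[X_k^+]</math> (here <math>\E[X_0]=0</math>) on the sample averages.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative submartingale (not from the text): X_0 = 0 and
# X_m = sum of m i.i.d. steps uniform on [-1, 1.2], which have positive mean.
rng = np.random.default_rng(0)
samples, n = 100_000, 50
X = np.cumsum(rng.uniform(-1.0, 1.2, size=(samples, n)), axis=1)  # X_1, ..., X_n

pos = np.maximum(X, 0.0).mean(axis=0)   # Monte Carlo estimates of E[X_m^+]
neg = np.maximum(-X, 0.0).mean(axis=0)  # Monte Carlo estimates of E[X_m^-]
absm = np.abs(X).mean(axis=0)           # Monte Carlo estimates of E[|X_m|]

print(np.allclose(absm, pos + neg))     # E[|X_m|] = E[X_m^+] + E[X_m^-]: True
print(bool((neg <= pos.max()).all()))   # E[X_m^-] <= sup_k E[X_k^+] (E[X_0] = 0 here): True
</syntaxhighlight>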
{{proofcard|Lemma (Doob's upcrossing inequality)|lem1|Let <math>(\Omega,\F,(\F_n)_{n\geq0},\p)</math> be a filtered probability space. Let <math>X=(X_n)_{n\geq 0}</math> be a supermartingale and <math>a < b</math> two real numbers. Then for all <math>n\geq 0</math> we get
<math display="block">
(b-a)\E[N_n([a,b],X)]\leq  \E[(X_n-a)^-],
</math>
where <math>N_n([a,b],x)=\sup\{ k\geq 0\mid T_k(x)\leq  n\}</math> denotes the number of upcrossings of the interval <math>[a,b]</math> by the sequence <math>x=(x_n)_n</math> by time <math>n</math>. Here the stopping times <math>(S_k)_{k\geq 1}</math> and <math>(T_k)_{k\geq 0}</math> are defined recursively by <math>T_0(x)=0</math> and, for <math>k\geq 1</math>, <math>S_k(x)=\inf\{m\geq T_{k-1}(x)\mid x_m\leq  a\}</math> and <math>T_k(x)=\inf\{m\geq S_k(x)\mid x_m\geq b\}</math>, so that <math>T_k</math> is the time at which the <math>k</math>-th upcrossing of <math>[a,b]</math> is completed. Moreover, as <math>n\to\infty</math> we have
<math display="block">
N_n([a,b],x)\uparrow N([a,b],x)=\sup\{ k\geq 0\mid T_k(x) < \infty\},
</math>
i.e., the total number of upcrossings of the interval <math>[a,b]</math>.|}}
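
To make the definition concrete, here is a minimal sketch of how the times <math>S_k</math>, <math>T_k</math> and the count <math>N_n([a,b],x)</math> can be computed for a finite sequence by a single scan. The helper name and the test sequence are illustrative choices, not from the text; the scan mirrors the recursive definition just given.

<syntaxhighlight lang="python">
def upcrossing_times(x, a, b):
    """Scan the finite sequence x and return the lists of times S_k (first visit
    to (-inf, a] after the previous upcrossing) and T_k (first subsequent visit
    to [b, inf)).  N_n([a,b], x) is then simply len(T)."""
    S, T = [], []
    waiting_for_S = True                  # next event we look for is some x_m <= a
    for m, v in enumerate(x):
        if waiting_for_S and v <= a:
            S.append(m)                   # time S_{k+1}
            waiting_for_S = False
        elif not waiting_for_S and v >= b:
            T.append(m)                   # time T_{k+1}: one more completed upcrossing
            waiting_for_S = True
    return S, T

x = [2, 0, 3, 1, -1, 4, 2]                # an illustrative sequence
S, T = upcrossing_times(x, a=0, b=3)
print(S, T, len(T))                       # [1, 4] [2, 5] 2: two upcrossings of [0, 3]
</syntaxhighlight>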
{{proofcard|Lemma|lem4|A sequence of real numbers <math>x=(x_n)_n</math> converges in <math>\bar \R=\R\cup\{\pm\infty\}</math> if and only if <math>N([a,b],x) < \infty</math> for all rationals <math>a < b</math>.
|Suppose that <math>x</math> converges. Then if for some <math>a < b</math> we had that <math>N([a,b],x)=\infty</math>, that would imply that <math>\liminf_n x_n\leq  a < b\leq  \limsup_nx_n</math>, which is a contradiction. Next suppose that <math>x</math> does not converge. Then <math>\liminf_nx_n < \limsup_nx_n</math> and so taking <math>a < b</math> rationals between these two numbers gives that <math>N([a,b],x)=\infty</math>.}}
'''Proof of [[#lem1 |Lemma]].''' We omit the dependence on <math>X</math> from <math>T_k</math> and <math>S_k</math> and write <math>N=N_n([a,b],X)</math> to simplify notation. By the definition of the times <math>(T_k)_{k\geq 1}</math> and <math>(S_k)_{k\geq 1}</math>, it is clear that for all <math>k\geq 1</math> with <math>T_k < \infty</math>
<math display="block">
X_{T_k}-X_{S_k}\geq b-a.
</math>
We have
<math display="block">
\begin{align*}
\sum_{k=1}^n(X_{T_k\land n}-X_{S_k\land n})&=\sum_{k=1}^N(X_{T_k}-X_{S_k})+\sum_{k=N+1}^n(X_n-X_{S_k\land n})\one_{\{N < n\}}\\
&=\sum_{k=1}^N(X_{T_k}-X_{S_k})+(X_n-X_{S_{N+1}})\one_{\{S_{N+1}\leq  n\}},
\end{align*}
</math>
since, by the definition of <math>N</math>, the only term contributing to the second sum in the middle of the last equation chain is the one with <math>k=N+1</math>. Indeed, if <math>S_{N+2}\leq  n</math>, then, since <math>T_{N+1}\leq  S_{N+2}</math>, we would get <math>T_{N+1}\leq  n</math>, which would contradict the definition of <math>N</math>. Using induction on <math>k</math>, it is easy to see that the <math>T_k</math> and <math>S_k</math> are stopping times. Hence for all <math>n\geq 0</math>, we have that <math>S_k\land n\leq  T_k\land n</math> are bounded stopping times and thus, by the optional stopping theorem for supermartingales, <math>\E[X_{S_k\land n}]\geq \E[X_{T_k\land n}]</math> for all <math>k\geq 1</math>. Therefore, taking expectations in the equations above and using the inequality <math>X_{T_k}-X_{S_k}\geq b-a</math>, we get
<math display="block">
0\geq \E\left[\sum_{k=1}^n(X_{T_k\land n}-X_{S_k\land n})\right]\geq (b-a)\E[N]-\E[(X_n-a)^-],
</math>
since <math>X_{S_{N+1}}\leq  a</math> on <math>\{S_{N+1}\leq  n\}</math>, so that <math>(X_n-X_{S_{N+1}})\one_{\{S_{N+1}\leq  n\}}\geq (X_n-a)\one_{\{S_{N+1}\leq  n\}}\geq -(X_n-a)^-</math>. Rearranging gives the desired inequality.
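
The inequality can also be checked by simulation. The sketch below is a Monte Carlo illustration under assumptions chosen purely for convenience (a simple random walk with downward drift as the supermartingale, together with specific values of <math>a</math>, <math>b</math>, <math>n</math> and the number of sample paths, none of which come from the text); it estimates both sides of the upcrossing inequality.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative supermartingale (not from the text): X_0 = 0 and i.i.d. steps
# +1 with probability 0.45, -1 with probability 0.55, so E[X_{m+1} | F_m] <= X_m.
rng = np.random.default_rng(1)
a, b, n, trials = -2.0, 1.0, 200, 10_000

counts = np.empty(trials)
neg_parts = np.empty(trials)
for t in range(trials):
    steps = rng.choice([1.0, -1.0], size=n, p=[0.45, 0.55])
    x = np.concatenate(([0.0], np.cumsum(steps)))      # X_0, X_1, ..., X_n
    count, waiting_for_S = 0, True                     # same scan as in the sketch above
    for v in x:
        if waiting_for_S and v <= a:
            waiting_for_S = False
        elif not waiting_for_S and v >= b:
            waiting_for_S, count = True, count + 1
    counts[t] = count
    neg_parts[t] = max(a - x[-1], 0.0)                 # (X_n - a)^-

lhs = (b - a) * counts.mean()                          # (b-a) E[N_n([a,b], X)]
rhs = neg_parts.mean()                                 # E[(X_n - a)^-]
print(f"(b-a)*E[N_n] ~ {lhs:.3f}  <=  E[(X_n-a)^-] ~ {rhs:.3f}")
</syntaxhighlight>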
{{proofcard|Theorem (Almost sure martingale convergence theorem)|thm-1|Let <math>(\Omega,\F,(\F_n)_{n\geq0},\p)</math> be a filtered probability space. Let <math>X=(X_n)_{n\geq 0}</math> be a submartingale such that <math>\sup_{n\geq 0}\E[\vert X_n\vert] < \infty</math>. Then the sequence <math>(X_n)_{n\geq 0}</math> converges a.s. to a r.v. <math>X_\infty\in L^1(\Omega,\F_\infty,(\F_n)_{n\geq 0},\p)</math> as <math>n\to\infty</math>, where <math>\F_\infty=\sigma\left(\bigcup_{n\geq 0}\F_n\right)</math>.
|Let <math>a < b\in \Q</math>. By Doob's upcrossing inequality, we get that
<math display="block">
\E[N_n([a,b],X)]\leq  (b-a)^{-1}\E[(X_n-a)^-]\leq  (b-a)^{-1}\E[\vert X_n\vert +\vert a\vert].
</math>
By monotone convergence, since <math>N_n([a,b],X)\uparrow N([a,b],X)</math> as <math>n\to\infty</math>, we get that
<math display="block">
\E[N([a,b],X)]\leq  (b-a)^{-1}\left(\sup_n\E[\vert X_n\vert]+\vert a\vert\right) < \infty,
</math>
by the assumption on <math>X</math> being bounded in <math>L^1(\Omega,\F,(\F_n)_{n\geq 0},\p)</math>. Therefore, we get that <math>N([a,b],X) < \infty</math> a.s. for every <math>a < b\in \Q</math>. Hence,
<math display="block">
\p\left[\bigcap_{a < b\in \Q}\{ N([a,b],X) < \infty\}\right]=1.
</math>
Writing <math>\Omega_0=\bigcap_{a < b\in\Q}\{ N([a,b],X) < \infty\}</math>, we have that <math>\p[\Omega_0]=1</math> and, by the [[#lem4 |lemma]], on <math>\Omega_0</math> the sequence <math>X</math> converges to a possibly infinite limit. So we can define
<math display="block">
X_\infty=\begin{cases}\lim_{n\to\infty}X_n,& \text{ on $\Omega_0$,}\\ 0,&\text{ on $\Omega\setminus\Omega_0$.}\end{cases}
</math>
Then <math>X_\infty</math> is <math>\F_\infty</math>-measurable and, by Fatou's lemma and the assumption that <math>X</math> is bounded in <math>L^1(\Omega,\F,(\F_n)_{n\geq 0},\p)</math>, we get
<math display="block">
\E[\vert X_\infty\vert]=\E\left[\liminf_{n\to\infty} \vert X_n\vert\right]\leq  \liminf_{n\to\infty}\E[\vert X_n\vert] < \infty.
</math>
Hence <math>X_\infty\in L^1(\Omega,\F_\infty,(\F_n)_{n\geq 0},\p)</math>.}}
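
As an illustration of the theorem, the following sketch simulates a bounded (hence <math>L^1</math>-bounded) martingale, namely the fraction of red balls in a Pólya urn; this particular martingale, and all parameters, are illustrative choices and not part of the text. Along each simulated path the fraction settles down, and different paths settle near different values, reflecting a genuinely random limit <math>X_\infty</math>.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)

def red_fraction_path(n_steps):
    """Polya urn: start with one red and one blue ball; at each step draw a ball
    uniformly at random and put it back together with one more ball of the same
    colour.  The fraction of red balls is a martingale with values in [0, 1]."""
    red, total = 1, 2
    path = []
    for _ in range(n_steps):
        if rng.random() < red / total:    # a red ball was drawn
            red += 1
        total += 1
        path.append(red / total)
    return path

for _ in range(3):
    path = red_fraction_path(20_000)
    # the fraction at times 10^2, 10^3, 10^4, 2*10^4: each path stabilises,
    # but different paths stabilise near different (random) limits
    print([round(path[k], 4) for k in (99, 999, 9_999, 19_999)])
</syntaxhighlight>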
{{proofcard|Corollary|cor-1|Let <math>(\Omega,\F,(\F_n)_{n\geq0},\p)</math> be a filtered probability space. Let <math>(X_n)_{n\geq 0}</math> be a nonnegative supermartingale. Then <math>(X_n)_{n\geq 0}</math> converges a.s. to a limit <math>X_\infty\in L^1(\Omega,\F_\infty,(\F_n)_{n\geq 0},\p)</math>, which satisfies
<math display="block">
X_n\geq \E[X_\infty\mid \F_n]\quad\text{a.s.}
</math>
|Note that <math>(-X_n)_{n\geq 0}</math> is a submartingale and <math>(-X_n)^+=0</math> for all <math>n\geq 0</math>, which implies that <math>\sup_{n\geq 0}\E[(-X_n)^+]=0 < \infty</math>. Hence, by the [[#thm-1 |theorem]],
<math display="block">
X_n\xrightarrow{n\to\infty\atop \text{a.s.}}X_\infty\in L^1(\Omega,\F_\infty,(\F_n)_{n\geq 0},\p).
</math>
Moreover, for all <math>m\geq n</math> we have
<math display="block">
X_n\geq \E[X_m\mid\F_n].
</math>
By the conditional version of Fatou's lemma we get
<math display="block">
X_n\geq \liminf_{m\to\infty}\E[X_m\mid \F_n]\geq \E\left[\liminf_{m\to\infty}X_m\mid \F_n\right]=\E[X_\infty\mid \F_n].
</math>}}
'''Example''' (Simple random walk on <math>\mathbb{Z}</math>)
Let <math>Y_n=1+Z_1+\dots +Z_n</math>, where the <math>Z_j</math> are iid with <math>\p\left[Z_j=\pm1\right]=\frac{1}{2}</math>, <math>Y_0=1</math>, <math>\F_0=\{\varnothing,\Omega\}</math> and <math>\F_n=\sigma(Z_1,...,Z_n)</math>. We have already seen that <math>(Y_n)_{n\geq 0}</math> is a martingale. Let <math>T=\inf\{n\geq 0\mid Y_n=0\}</math> and <math>X_n=Y_{n\land T}</math>. Then <math>(X_n)_{n\geq 0}</math> is also a martingale. Moreover, <math>X_n\geq 0</math> for all <math>n\geq 0</math>, so by the [[#cor-1 |corollary]] <math>(X_n)_{n\geq 0}</math> converges a.s. to a r.v. <math>X_\infty\in L^1(\Omega,\F_\infty,(\F_n)_{n\geq 0},\p)</math>. This convergence implies that <math>T < \infty</math> a.s.: on the set <math>\{T=\infty\}</math> we have <math>\vert X_{n+1}-X_n\vert=1</math> for all <math>n</math>, so there <math>(X_n)_{n\geq 0}</math> is not a Cauchy sequence and therefore cannot converge. This gives <math>\p[T=\infty]=0</math>, i.e. <math>T < \infty</math> a.s., and since <math>X_n=Y_{n\land T}</math> we get <math>X_\infty=Y_T</math>. Hence
<math display="block">
\lim_{n\to\infty}X_n=Y_T=0.
</math>
We also note that <math>\E[X_n]=1</math> for all <math>n\geq 0</math>, while <math>\E[X_\infty]=0</math>, and so <math>X_n</math> does not converge to <math>X_\infty</math> in <math>L^1</math>.
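
The behaviour described in this example can be observed in a short simulation. The sketch below uses an illustrative number of paths and a finite time horizon (both arbitrary choices, not from the text), so the event <math>\{T\leq \text{horizon}\}</math> is only a finite-horizon proxy for <math>\{T < \infty\}</math>: almost every simulated path of <math>X</math> is frozen at <math>0</math> by the horizon, while the empirical mean of <math>X_n</math> stays close to <math>1</math>, in line with the failure of <math>L^1</math> convergence.

<syntaxhighlight lang="python">
import numpy as np

# Simple random walk started at 1, stopped at the first hitting time T of 0
# (number of paths and the horizon are illustrative choices, not from the text).
rng = np.random.default_rng(3)
paths, horizon = 10_000, 5_000

hit = 0
X_final = np.empty(paths)                      # X_horizon = Y_{horizon AND T}
for i in range(paths):
    y = 1 + np.cumsum(rng.choice([1, -1], size=horizon))
    zeros = np.flatnonzero(y == 0)
    if zeros.size > 0:                         # T <= horizon: the stopped walk stays at 0
        hit += 1
        X_final[i] = 0.0
    else:                                      # T > horizon: X_horizon = Y_horizon
        X_final[i] = y[-1]

print("fraction of paths with T <= horizon:", hit / paths)   # close to 1
print("empirical E[X_horizon]:", X_final.mean())             # stays close to 1
</syntaxhighlight>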
==General references==
{{cite arXiv|last=Moshayedi|first=Nima|year=2020|title=Lectures on Probability Theory|eprint=2010.16280|class=math.PR}}
