
Submartingales and Supermartingales

[math] \newcommand{\R}{\mathbb{R}} \newcommand{\A}{\mathcal{A}} \newcommand{\B}{\mathcal{B}} \newcommand{\N}{\mathbb{N}} \newcommand{\C}{\mathbb{C}} \newcommand{\Rbar}{\overline{\mathbb{R}}} \newcommand{\Bbar}{\overline{\mathcal{B}}} \newcommand{\Q}{\mathbb{Q}} \newcommand{\E}{\mathbb{E}} \newcommand{\p}{\mathbb{P}} \newcommand{\one}{\mathds{1}} \newcommand{\0}{\mathcal{O}} \newcommand{\mat}{\textnormal{Mat}} \newcommand{\sign}{\textnormal{sign}} \newcommand{\CP}{\mathcal{P}} \newcommand{\CT}{\mathcal{T}} \newcommand{\CY}{\mathcal{Y}} \newcommand{\F}{\mathcal{F}} \newcommand{\mathds}{\mathbb}[/math]
Definition (Submartingale and Supermartingale)

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. A stochastic process [math](X_n)_{n\geq0}[/math] is called a submartingale (resp. supermartingale) if

  • [math]\E[\vert X_n\vert] \lt \infty[/math] for all [math]n\geq 0[/math],
  • [math](X_n)_{n\geq 0}[/math] is [math]\F_n[/math]-adapted,
  • [math]\E[X_n\mid \F_m]\geq X_m[/math] a.s. for all [math]m\leq n[/math] (resp. [math]\E[X_n\mid \F_m]\leq X_m[/math] a.s. for all [math]m\leq n[/math]).

A stochastic process [math](X_n)_{n\geq 0}[/math] is a martingale if and only if it is both a submartingale and a supermartingale. If [math](X_n)_{n\geq0}[/math] is a submartingale, then the map [math]n\mapsto \E[X_n][/math] is increasing; if [math](X_n)_{n\geq 0}[/math] is a supermartingale, then the map [math]n\mapsto \E[X_n][/math] is decreasing.

Example


Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. Let [math]S_n=\sum_{j=1}^{n}Y_j[/math], where [math](Y_n)_{n\geq1}[/math] is a sequence of iid integrable r.v.'s. Moreover, let [math]S_0=0[/math], [math]\F_0=\{\varnothing,\Omega\}[/math] and [math]\F_n=\sigma(Y_1,...,Y_n)[/math] for [math]n\geq 1[/math]. Then we get

[[math]] \E[S_{n+1}\mid\F_n]=S_n+\E[Y_{n+1}]. [[/math]]

Since the [math]Y_j[/math] are iid, [math]\E[Y_{n+1}]=\E[Y_1][/math] for all [math]n[/math]. If [math]\E[Y_1]\geq 0[/math], then [math]\E[S_{n+1}\mid\F_n]\geq S_n[/math] and thus [math](S_n)_{n\geq 0}[/math] is a submartingale. On the other hand, if [math]\E[Y_1]\leq 0[/math], then [math]\E[S_{n+1}\mid\F_n]\leq S_n[/math] and thus [math](S_n)_{n\geq 0}[/math] is a supermartingale. In particular, if [math]\E[Y_1]=0[/math], then [math](S_n)_{n\geq 0}[/math] is a martingale.
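The computation above can be checked numerically. Below is a minimal simulation sketch (not part of the original notes; the Gaussian steps, the drift [math]0.1[/math], and the sample sizes are illustrative assumptions): for steps with positive mean, the Monte Carlo estimates of [math]\E[S_n][/math] increase with [math]n[/math], as they must for a submartingale.

```python
# Minimal sketch (illustrative parameters): simulate S_n = Y_1 + ... + Y_n for
# iid steps with E[Y_j] = 0.1 > 0 and check that n -> E[S_n] is increasing.
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, drift = 100_000, 20, 0.1

Y = rng.normal(loc=drift, scale=1.0, size=(n_paths, n_steps))  # iid steps
S = np.cumsum(Y, axis=1)                  # S[:, n-1] holds S_n

means = S.mean(axis=0)                    # Monte Carlo estimates of E[S_n]
print(np.all(np.diff(means) > 0))         # True: the means increase in n
print(means[:5])                          # approximately 0.1 * (1, 2, 3, 4, 5)
```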

Proposition

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. If [math](M_n)_{n\geq 0}[/math] is a martingale and [math]\varphi[/math] is a convex function such that [math]\varphi(M_n)\in L^1(\Omega,\F,(\F_n)_{n\geq 0},\p)[/math] for all [math]n\geq0[/math], then

[[math]] (\varphi(M_n))_{n\geq 0} [[/math]]
is a submartingale.


Proof

The first two conditions for a submartingale are clearly satisfied. Now for [math]m\leq n[/math], we get

[[math]] \E[M_n\mid \F_m]=M_m\quad\textnormal{a.s.}, [[/math]]
since [math](M_n)_{n\geq 0}[/math] is assumed to be a martingale. Hence, with Jensen's inequality for conditional expectations, we get

[[math]] \varphi(M_m)=\varphi(\E[M_n\mid\F_m])\leq \E[\varphi(M_n)\mid\F_m]\quad\textnormal{a.s.}, [[/math]]
which is exactly the submartingale property for [math](\varphi(M_n))_{n\geq 0}[/math].

Corollary

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. If [math](M_n)_{n\geq 0}[/math] is a martingale, then

  • [math](\vert M_n\vert)_{n\geq0}[/math] and [math](M^+_n)_{n\geq 0}[/math] are submartingales.
  • if for all [math]n\geq 0[/math], [math]\E[M_n^2] \lt \infty[/math], then [math](M_n^2)_{n\geq 0}[/math] is a submartingale.
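As a numerical illustration of this corollary (a hedged sketch, not from the source; the [math]\pm 1[/math] steps and sample sizes are assumptions), one can estimate [math]\E[\vert M_n\vert][/math] and [math]\E[M_n^2][/math] for the simple symmetric random walk: both are non-decreasing in [math]n[/math], and in fact [math]\E[M_n^2]=n[/math].

```python
# Sketch: for the simple symmetric random walk M_n, the submartingales |M_n|
# and M_n^2 have non-decreasing means; E[M_n^2] = n exactly.
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 200_000, 15

steps = rng.choice([-1, 1], size=(n_paths, n_steps))  # fair +-1 steps
M = np.cumsum(steps, axis=1)                          # martingale M_n

abs_means = np.abs(M).mean(axis=0)     # estimates of E[|M_n|]
sq_means = (M ** 2).mean(axis=0)       # estimates of E[M_n^2]

print(sq_means[:5])                    # approximately 1, 2, 3, 4, 5
print(np.all(np.diff(sq_means) > 0))   # True: E[M_n^2] increases in n
print(abs_means[:4])                   # approx 1.0, 1.0, 1.5, 1.5 (non-decreasing)
```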

Theorem

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. Let [math](X_n)_{n\geq 0}[/math] be a submartingale and let [math]T[/math] be a stopping time bounded by [math]C\in\N[/math]. Then

[[math]] \E[X_T]\leq \E[X_C]. [[/math]]


Proof

Exercise[a]
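The theorem can also be sanity-checked by simulation. In the sketch below (my own illustrative setup, not from the notes), [math]X_n=M_n^2[/math] is a submartingale by the corollary above, and [math]T[/math] is the first time [math]\vert M_n\vert\geq 2[/math], capped at [math]C=10[/math] so that it is a bounded stopping time; the estimate of [math]\E[X_T][/math] indeed stays below [math]\E[X_C][/math].

```python
# Sketch: submartingale X_n = M_n^2 for a simple symmetric walk M, and the
# bounded stopping time T = min(inf{n : |M_n| >= 2}, C); check E[X_T] <= E[X_C].
import numpy as np

rng = np.random.default_rng(2)
n_paths, C = 100_000, 10

steps = rng.choice([-1, 1], size=(n_paths, C))
M = np.cumsum(steps, axis=1)             # M[:, n-1] = M_n for n = 1, ..., C
X = M ** 2                               # submartingale X_n = M_n^2

hit = np.abs(M) >= 2                     # where the walk has reached level 2
T = np.where(hit.any(axis=1), hit.argmax(axis=1) + 1, C)  # stopping time <= C

X_T = X[np.arange(n_paths), T - 1]       # X evaluated at the random time T
print(X_T.mean(), X[:, -1].mean())       # E[X_T] <= E[X_C] (roughly 4 vs 10)
```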

Theorem (Doob's decomposition)

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space and let [math](X_n)_{n\geq 0}[/math] be a submartingale. Then there exist a martingale [math]M=(M_n)_{n\geq 0}[/math] with [math]M_0=0[/math] and a sequence [math]A=(A_n)_{n\geq 0}[/math] with [math]A_0=0[/math] a.s. and [math]A_{n+1}\geq A_n[/math] a.s. for all [math]n\geq 0[/math] (such a process is called an increasing process), with [math]A_{n+1}[/math] being [math]\F_n[/math]-measurable (such a process is called predictable), such that

[[math]] X_n=X_0+M_n+A_n. [[/math]]

Moreover, this decomposition is a.s. unique.


Proof

Let us define [math]A_0=0[/math] and for [math]n\geq 1[/math]

[[math]] A_n=\sum_{k=1}^n\E[X_k-X_{k-1}\mid\F_{k-1}]. [[/math]]
Since [math](X_n)_{n\geq 0}[/math] is a submartingale, we get

[[math]] \E[X_k-X_{k-1}\mid\F_{k-1}]\geq 0 [[/math]]
and hence [math]A_{n+1}-A_n\geq 0[/math] a.s. Therefore [math](A_n)_{n\geq 0}[/math] is an increasing process. Moreover, by the definition of conditional expectation, [math]A_n[/math] is [math]\F_{n-1}[/math]-measurable for [math]n\geq 1[/math], so [math](A_n)_{n\geq 0}[/math] is predictable as well. We also note that

[[math]] \E[X_n\mid \F_{n-1}]-X_{n-1}=\E[X_n-X_{n-1}\mid \F_{n-1}]=A_n-A_{n-1}. [[/math]]
Hence we get

[[math]] \underbrace{\E[X_n\mid\F_{n-1}]-A_n}_{=\,\E[X_n-A_n\mid \F_{n-1}]}=X_{n-1}-A_{n-1}. [[/math]]
If we set [math]M_n=X_n-A_n-X_0[/math], it follows that [math]M=(M_n)_{n\geq 0}[/math] is a martingale with [math]M_0=0[/math]. This proves the existence part. For uniqueness, we note that if we have two such decompositions

[[math]] X_n=X_0+M_n+A_n=X_0+L_n+C_n, [[/math]]
where [math]L_n[/math] denotes the martingale part and [math]C_n[/math] the increasing process part, it follows that

[[math]] L_n-M_n=A_n-C_n. [[/math]]
Now since [math]A_n-C_n[/math] is [math]\F_{n-1}[/math]-measurable, we get that [math]L_n-M_n[/math] is also [math]\F_{n-1}[/math]-measurable. Thus

[[math]] L_n-M_n=\E[L_n-M_n\mid \F_{n-1}]=L_{n-1}-M_{n-1}, [[/math]]
because of the martingale property. By induction, we have a chain of equalities

[[math]] L_n-M_n=L_{n-1}-M_{n-1}=\cdots =L_0-M_0=0. [[/math]]
Therefore [math]L_n=M_n[/math] a.s. and also [math]A_n=C_n[/math] a.s.
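To make the decomposition concrete, here is a small sketch (illustrative assumptions: iid steps [math]Y_j\geq 0[/math] with mean [math]\mu[/math], so that [math]\E[X_k-X_{k-1}\mid\F_{k-1}]=\E[Y_k]=\mu[/math] is deterministic and [math]A_n=n\mu[/math]): the remainder [math]M_n=X_n-A_n[/math] should then be a martingale, and in particular [math]\E[M_n]=0[/math] for all [math]n[/math].

```python
# Sketch of Doob's decomposition for X_n = Y_1 + ... + Y_n with iid steps of
# mean mu: the predictable increasing part is A_n = n * mu, the martingale
# part is M_n = X_n - A_n (here X_0 = 0).
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps, mu = 100_000, 20, 0.3

Y = rng.exponential(scale=mu, size=(n_paths, n_steps))  # E[Y_j] = mu > 0
X = np.cumsum(Y, axis=1)                                # submartingale X_n

A = mu * np.arange(1, n_steps + 1)     # predictable increasing process A_n
M = X - A                              # candidate martingale part

print(np.abs(M.mean(axis=0)).max())    # approximately 0: E[M_n] = 0 for all n
```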

Corollary

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. Let [math]X=(X_n)_{n\geq 0}[/math] be a supermartingale. Then there exists an a.s. unique decomposition

[[math]] X_n=X_0+M_n-A_n, [[/math]]
where [math]M=(M_n)_{n\geq 0}[/math] is a martingale with [math]M_0=0[/math] and [math]A=(A_n)_{n\geq 0}[/math] is an increasing process with [math]A_0=0[/math].


Proof

Let [math]Y_n=-X_n[/math] for all [math]n\geq 0[/math]. Then [math](Y_n)_{n\geq0}[/math] is a submartingale, and Theorem 8.4 gives an a.s. unique decomposition

[[math]] Y_n=Y_0+L_n+C_n, [[/math]]
where [math]L_n[/math] denotes the martingale part and [math]C_n[/math] the increasing process part. Hence we get

[[math]] X_n=X_0-L_n-C_n [[/math]]
and if we take [math]M_n=-L_n[/math] and [math]A_n=C_n[/math], the claim follows.

Now consider a stopped process. Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. Let [math]T[/math] be a stopping time and let [math](X_n)_{n\geq 0}[/math] be a stochastic process. We denote by [math]X^T=(X^T_n)_{n\geq 0}[/math] the process [math](X_{n\land T})_{n\geq 0}[/math].

Proposition

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. Let [math](X_n)_{n\geq 0}[/math] be a martingale (resp. sub- or supermartingale) and let [math]T[/math] be a stopping time. Then [math](X_{n\land T})_{n\geq 0}[/math] is also a martingale (resp. sub- or supermartingale).


Proof

Note that

[[math]] \{T\geq n+1\}=\{T\leq n\}^C\in\F_n. [[/math]]
Hence [math]\one_{\{T\geq n+1\}}[/math] is [math]\F_n[/math]-measurable. Moreover, [math]X_{(n+1)\land T}-X_{n\land T}=(X_{n+1}-X_n)\one_{\{T\geq n+1\}}[/math], so we have

[[math]] \E[X_{(n+1)\land T}-X_{n\land T}\mid \F_n]=\E[(X_{n+1}-X_{n})\one_{\{T\geq n+1\}}\mid\F_n]=\one_{\{ T\geq n+1\}}\E[X_{n+1}-X_n\mid \F_n]. [[/math]]
If [math](X_n)_{n\geq 0}[/math] is a martingale, we deduce that

[[math]] \E[X_{(n+1)\land T}-X_{n\land T}\mid\F_n]=0. [[/math]]
Moreover, [math]X_{n\land T}[/math] is [math]\F_n[/math]-measurable. Therefore

[[math]] \E[X_{(n+1)\land T}\mid \F_n]=X_{n\land T}. [[/math]]
The same argument applies to submartingales and supermartingales, with the equality replaced by the corresponding inequality.
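This stability under stopping is easy to observe numerically. In the sketch below (illustrative choices throughout), [math]M[/math] is the simple symmetric random walk and [math]T=\inf\{n:M_n=-3\}[/math]; by the proposition, [math](M_{n\land T})_{n\geq 0}[/math] is again a martingale, so [math]\E[M_{n\land T}]=\E[M_0]=0[/math] for every [math]n[/math].

```python
# Sketch: stop the simple symmetric random walk at T = inf{n : M_n = -3} and
# check that the stopped process keeps constant mean E[M_{min(n, T)}] = 0.
import numpy as np

rng = np.random.default_rng(4)
n_paths, horizon = 200_000, 30

steps = rng.choice([-1, 1], size=(n_paths, horizon))
M = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)], axis=1)

hit = M == -3                            # M[:, n] is the position at time n
# first hitting time; paths that never hit within the horizon are never frozen
T = np.where(hit.any(axis=1), hit.argmax(axis=1), horizon)

time = np.arange(horizon + 1)
frozen = M[np.arange(n_paths), T][:, None]          # value at the hitting time
M_stopped = np.where(time[None, :] <= T[:, None], M, frozen)

print(np.abs(M_stopped.mean(axis=0)).max())         # approximately 0 for all n
```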

Theorem

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. Let [math](X_n)_{n\geq 0}[/math] be a submartingale (resp. supermartingale) and let [math]S[/math] and [math]T[/math] be two bounded stopping times, such that [math]S\leq T[/math] a.s. Then

[[math]] \E[X_T\mid\F_S]\geq X_S\quad\textnormal{a.s.}\qquad(\textnormal{resp. }\E[X_T\mid \F_S]\leq X_S\quad\textnormal{a.s.}) [[/math]]


Proof

Let us assume that [math](X_n)_{n\geq 0}[/math] is a supermartingale; the submartingale case follows by applying the result to [math](-X_n)_{n\geq 0}[/math]. Let [math]C\in\N[/math] be such that [math]S\leq T\leq C[/math] a.s. and let [math]A\in\F_S[/math]. We already know that [math](X_{n\land T})_{n\geq 0}[/math] is a supermartingale. Since [math]A\cap\{S=j\}\in\F_j[/math] for [math]0\leq j\leq C[/math], we get

[[math]] \begin{align*} \E[X_T\one_A]&=\sum_{j=0}^C\E[X_{C\land T}\one_A\one_{\{S=j\}}]=\sum_{j=0}^C\E[X_{C\land T}\one_{A\cap\{S=j\}}]\\ &\leq \sum_{j=0}^C\E[X_{j\land T}\one_{A\cap \{S=j\}}]=\sum_{j=0}^C\E[X_j\one_A\one_{\{S=j\}}]\\ &=\E\left[\sum_{j=0}^CX_j\one_{\{S=j\}}\one_A\right]=\E[X_S\one_A]. \end{align*} [[/math]]

Here we used [math]T=C\land T[/math] in the first equality, the supermartingale property of the stopped process [math](X_{n\land T})_{n\geq 0}[/math] for the inequality, and [math]j\land T=j[/math] on [math]\{S=j\}[/math] (since [math]S\leq T[/math]) in the following equality. Since [math]\E[X_T\one_A]\leq \E[X_S\one_A][/math] holds for all [math]A\in\F_S[/math] and [math]X_S[/math] is [math]\F_S[/math]-measurable, it follows that [math]\E[X_T\mid\F_S]\leq X_S[/math] a.s.

Corollary

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. Let [math](X_n)_{n\geq 0}[/math] be a submartingale (resp. supermartingale) and let [math]T[/math] be a bounded stopping time. Then

[[math]] \E[X_T]\geq \E[X_0]\qquad(\textnormal{resp. }\E[X_T]\leq \E[X_0]). [[/math]]
Moreover, if [math]S[/math] and [math]T[/math] are two bounded stopping times with [math]S\leq T[/math] a.s., we have

[[math]] \E[X_T]\geq \E[X_S]\qquad(\textnormal{resp. }\E[X_T]\leq \E[X_S]). [[/math]]
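Below is a hedged numerical check of this corollary (drift, levels, and caps are illustrative assumptions): [math]X[/math] is a random walk with negative drift, hence a supermartingale; the bounded stopping times [math]S=\min(\inf\{n:X_n=-1\},5)[/math] and [math]T=10[/math] satisfy [math]S\leq T[/math], and the simulated means obey [math]\E[X_T]\leq \E[X_S]\leq \E[X_0]=0[/math].

```python
# Sketch: negatively drifted walk X (a supermartingale), bounded stopping
# times S = min(first hit of -1, 5) and T = 10 with S <= T; compare the means.
import numpy as np

rng = np.random.default_rng(5)
n_paths, C = 200_000, 10

steps = rng.choice([1, -1], p=[0.45, 0.55], size=(n_paths, C))
X = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)], axis=1)

hit = X == -1                            # X[:, n] is the position at time n
tau = np.where(hit.any(axis=1), hit.argmax(axis=1), C)  # first hit of -1
S = np.minimum(tau, 5)                   # bounded stopping time S <= 5 <= T

X_S = X[np.arange(n_paths), S]
print(X[:, -1].mean(), X_S.mean())       # E[X_T] <= E[X_S] <= E[X_0] = 0
```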

Exercise

Let [math](\Omega,\F,(\F_n)_{n\geq0},\p)[/math] be a filtered probability space. Let [math]X=(X_n)_{n\geq 0}[/math] be a supermartingale and let [math]T[/math] be a stopping time. Then

[[math]] X_T\in L^1(\Omega,\F,(\F_n)_{n\geq 0},\p) [[/math]]

and

[[math]] \E[X_T]\leq \E[X_0] [[/math]]

in each of the following situations:

  • [math]T[/math] is bounded.
  • [math]X[/math] is bounded and [math]T[/math] is finite.
  • [math]\E[T] \lt \infty[/math] and for some [math]k\geq 0[/math], we have
    [[math]] \vert X_n(\omega)-X_{n-1}(\omega)\vert\leq k, [[/math]]
    for all [math]n\geq 1[/math] and all [math]\omega\in\Omega[/math].


General references

Moshayedi, Nima (2020). "Lectures on Probability Theory". arXiv:2010.16280 [math.PR].

Notes

  1. The proof is the same as in Theorem 7.7.