We start with the existence of a Markov chain associated with a given transition matrix.
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math]Q[/math] be a stochastic matrix on [math]E[/math]. There exists a probability space [math](\Omega',\F',\p')[/math] on which there exists, for all [math]x\in E[/math], a stochastic process [math](X_n^x)_{n\geq 0}[/math] which is a Markov chain with transition matrix [math]Q[/math], starting from [math]X_0^x=x[/math].
We can take [math]\Omega'=[0,1)[/math], with the Borel [math]\sigma[/math]-algebra and the Lebesgue measure. For [math]\omega\in[0,1)[/math], we set
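The idea of this construction can be sketched in code: from a sequence of i.i.d. uniform variables on [math][0,1)[/math], each step of the chain is obtained by inverting the cumulative distribution of the current row of [math]Q[/math]. The following Python sketch is illustrative only; the helper names and the example matrix are ours, not from the text.

```python
import random

def step(q_row, u):
    """Map a uniform u in [0, 1) to a state via the inverse CDF of the row q_row."""
    acc = 0.0
    for y, p in enumerate(q_row):
        acc += p
        if u < acc:
            return y
    return len(q_row) - 1  # guard against floating-point round-off

def simulate_chain(Q, x, n, rng=random):
    """Simulate n steps of a chain with transition matrix Q, started at X_0 = x."""
    path = [x]
    for _ in range(n):
        path.append(step(Q[path[-1]], rng.random()))
    return path

# Example: a two-state chain on E = {0, 1}.
Q = [[0.9, 0.1],
     [0.5, 0.5]]
path = simulate_chain(Q, 0, 10)
```

Each call consumes one fresh uniform per step, mirroring the proof's use of [math]\Omega'=[0,1)[/math] with the Lebesgue measure as the source of randomness.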
In the sequel, we shall take [math]\Omega=E^\N[/math]. An element [math]\omega\in \Omega[/math] is a sequence [math]\omega=(\omega_0,\omega_1,...)[/math] of elements of [math]E[/math]. Define the coordinate maps [math]X_n(\omega)=\omega_n[/math] for [math]n\geq 0[/math].
We take [math]\F[/math] to be the smallest [math]\sigma[/math]-algebra on [math]\Omega[/math] which makes all the [math]X_n[/math]'s measurable. This [math]\sigma[/math]-algebra is generated by the cylinder sets [math]\{X_0=x_0,...,X_n=x_n\}[/math] for [math]n\in\N[/math] and [math]x_0,...,x_n\in E[/math].
Let [math](G,\mathcal{G})[/math] be a measurable space and let [math]\psi:G\to\Omega[/math] be a map. Then [math]\psi[/math] is measurable if and only if for all [math]n\geq 0[/math], [math]X_n\circ\psi[/math] is measurable.
We only need to show that if [math]X_n\circ \psi[/math] is measurable for all [math]n\geq 0[/math], then [math]\psi[/math] is measurable as well. Consider the [math]\sigma[/math]-algebra [math]\A[/math] on [math]\Omega[/math] given by
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math]Q[/math] be a stochastic matrix on [math]E[/math]. For all [math]x\in E[/math], there exists a unique probability measure [math]\p_x[/math] on [math]\Omega=E^\N[/math] such that the sequence of coordinate functions [math](X_n)_{n\geq 0}[/math] is a Markov chain with transition matrix [math]Q[/math] and [math]\p_x[X_0=x]=1[/math].
By the proposition above, there exist a probability space [math](\Omega',\F',\p')[/math] and a process [math](X_n^x)_{n\geq 0}[/math] which is a Markov chain with transition matrix [math]Q[/math] and satisfies [math]X_0^x=x[/math]. We then define [math]\p_x[/math] as the image of [math]\p'[/math] under the map
This map is measurable by the lemma above. We have [math]\p_x[X_0=x]=\p'[X_0^x=x]=1[/math], and for [math]x_0,x_1,...,x_n\in E[/math] we have
It follows that the sequence of coordinate functions is a Markov chain. For uniqueness, note that if [math]\p'_x[/math] is another such probability measure, then [math]\p_x[/math] and [math]\p_x'[/math] coincide on cylinders. An application of the monotone class theorem then gives [math]\p_x=\p_x'[/math].
It follows that for all [math]n\geq 0[/math] and all [math]x,y\in E[/math], we have [math]\p_x[X_n=y]=Q^n(x,y)[/math].
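The [math]n[/math]-step law can thus be obtained from the matrix power [math]Q^n[/math]. A minimal pure-Python sketch (the function names and the example matrix are ours):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(Q, n):
    """Return Q^n, so that n_step(Q, n)[x][y] = P_x[X_n = y]."""
    size = len(Q)
    P = [[1.0 if i == j else 0.0 for j in range(size)] for i in range(size)]
    for _ in range(n):
        P = mat_mul(P, Q)
    return P

Q = [[0.9, 0.1],
     [0.5, 0.5]]
P2 = n_step(Q, 2)  # P2[x][y] = P_x[X_2 = y]
```

Note that [math]Q^n[/math] is again a stochastic matrix, so every row of the result sums to [math]1[/math].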
If [math]\mu[/math] is a probability measure on [math]E[/math], we write [math]\p_\mu=\sum_{x\in E}\mu(x)\,\p_x[/math].
Let [math](X'_n)_{n\geq 0}[/math] be a Markov chain with transition matrix [math]Q[/math] and initial distribution [math]\mu[/math]. Then for all measurable subsets [math]B\subset\Omega=E^\N[/math], we have [math]\p[(X'_n)_{n\geq 0}\in B]=\p_\mu[B][/math].
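Since [math]\p_\mu[/math] is the mixture [math]\sum_{x}\mu(x)\,\p_x[/math], the law of [math]X_1[/math] under [math]\p_\mu[/math] is the vector-matrix product [math]\mu Q[/math]. A quick sketch (the function name and the numbers are ours):

```python
def one_step_from_mu(Q, mu):
    """Distribution of X_1 under P_mu: (mu Q)(y) = sum_x mu(x) Q(x, y)."""
    size = len(Q)
    return [sum(mu[x] * Q[x][y] for x in range(size)) for y in range(size)]

Q = [[0.9, 0.1],
     [0.5, 0.5]]
mu = [0.25, 0.75]
dist = one_step_from_mu(Q, mu)  # law of X_1 under P_mu
```

The result is again a probability vector, as a mixture of the rows of [math]Q[/math].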
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math]F[/math] and [math]G[/math] be two measurable, positive maps on [math]\Omega[/math] and let [math]n\geq 0[/math]. Assume that [math]F[/math] is [math]\F_n[/math]-measurable. Then for all [math]x\in E[/math], [math]\E_x[F\,(G\circ\Theta_n)]=\E_x[F\,\E_{X_n}[G]][/math].
The theorem is also true if one replaces [math]\E_x[/math] with [math]\E_\mu[/math].
It is enough to prove the first statement. For this we can restrict ourselves to the case where [math]F=\one_{\{ X_0=x_0,X_1=x_1,...,X_n=x_n\}}[/math] for [math]x_0,x_1,...,x_n\in E[/math]. Let us first consider the case where [math]G=\one_{\{X_0=y_0,...,X_p=y_p\}}[/math]. In this case, if [math]y\in E[/math], then
Now it follows from the monotone class theorem that the above holds for [math]G=\one_A[/math] with [math]A\in\F[/math], and thus we can conclude.
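On a small chain, the Markov property can be checked numerically by exact enumeration of paths, comparing [math]\E_x[F\,(G\circ\Theta_n)][/math] with [math]\E_x[F\,\E_{X_n}[G]][/math] for cylinder indicators [math]F[/math] and [math]G[/math]. The sketch below is ours (all helper names and the example matrix are illustrative):

```python
from itertools import product

def path_prob(Q, x, path):
    """P_x[X_0 = path[0], X_1 = path[1], ...] for a finite path."""
    if path[0] != x:
        return 0.0
    p = 1.0
    for a, b in zip(path, path[1:]):
        p *= Q[a][b]
    return p

def expect(Q, x, m, f):
    """E_x[f(X_0, ..., X_m)] by exact enumeration of all paths of length m."""
    states = range(len(Q))
    return sum(path_prob(Q, x, w) * f(w) for w in product(states, repeat=m + 1))

Q = [[0.9, 0.1],
     [0.5, 0.5]]
n = 1

def F(w):  # indicator of {X_0 = 0, X_1 = 1}; F_n-measurable for n = 1
    return 1.0 if w[0] == 0 and w[1] == 1 else 0.0

def G(w):  # indicator of {X_0 = 1, X_1 = 0}
    return 1.0 if w[0] == 1 and w[1] == 0 else 0.0

# E_x[F * (G o Theta_n)]  versus  E_x[F * E_{X_n}[G]], for x = 0:
lhs = expect(Q, 0, 2, lambda w: F(w) * G(w[n:]))
rhs = expect(Q, 0, n, lambda w: F(w) * expect(Q, w[n], 1, G))
```

Both sides reduce to [math]Q(0,1)\,Q(1,0)[/math] here, in line with the computation in the proof.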
We would like to make sense of the theorem above with [math]n[/math] replaced by a stopping time [math]T[/math]. Indeed, suppose we want to address the problem of whether a Markov chain starting from [math]x[/math] ever returns to [math]x[/math]. In other words, let us write
Let [math](\Omega,\F,(\F_n)_{n\geq 0},\p)[/math] be a filtered probability space. Let [math]T[/math] be a stopping time for the filtration [math](\F_n)_{n\geq 0}[/math]. Let [math]F[/math] and [math]G[/math] be two measurable and positive functions on [math]\Omega[/math]. Assume further that [math]F[/math] is [math]\F_T[/math]-measurable. Then for all [math]x\in E[/math], we get [math]\E_x[\one_{\{T \lt \infty\}}F\,(G\circ\Theta_T)]=\E_x[\one_{\{T \lt \infty\}}F\,\E_{X_T}[G]][/math].
For [math]n\geq 0[/math], we have
Let [math](\Omega,\F,(\F_n)_{n\geq 0},\p)[/math] be a filtered probability space. Let [math]T[/math] be a stopping time such that [math]\p_x[T \lt \infty]=1[/math]. Let us assume that there exists [math]y\in E[/math] such that [math]\p_x[X_T=y]=1[/math]. Then under [math]\p_x[/math], [math]\Theta_T(\omega)[/math] is independent of [math]\F_T[/math] and has the law [math]\p_y[/math].
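The corollary can be illustrated by simulation: start the chain at [math]x=0[/math], let [math]T[/math] be the hitting time of [math]y=1[/math] (finite a.s. for the example matrix below), and observe that [math]X_{T+1}[/math] is distributed according to [math]Q(1,\cdot)[/math], regardless of what happened before [math]T[/math]. A sketch with helper names of our own choosing:

```python
import random

def step(q_row, u):
    """Inverse-CDF sample from the row q_row given a uniform u in [0, 1)."""
    acc = 0.0
    for y, p in enumerate(q_row):
        acc += p
        if u < acc:
            return y
    return len(q_row) - 1

def state_after_first_hit(Q, x, target, rng):
    """Run the chain from x until the hitting time T of `target`,
    then return X_{T+1}."""
    state = x
    while state != target:
        state = step(Q[state], rng.random())
    return step(Q[state], rng.random())

Q = [[0.9, 0.1],
     [0.5, 0.5]]
rng = random.Random(42)
samples = [state_after_first_hit(Q, 0, 1, rng) for _ in range(20000)]
freq = samples.count(0) / len(samples)  # should be close to Q[1][0] = 0.5
```

Here [math]\p_0[T \lt \infty]=1[/math] and [math]X_T=1[/math] a.s., so by the corollary the shifted chain has law [math]\p_1[/math]; in particular the empirical frequency of [math]X_{T+1}=0[/math] approaches [math]Q(1,0)[/math].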
We only have to note that
General references
Moshayedi, Nima (2020). "Lectures on Probability Theory". arXiv:2010.16280 [math.PR].