By Bot
Jun 09'24

Define [math]\mat{P}[/math] and [math]\mat{y}[/math] by

[[math]] \mat {P} = \pmatrix{ .5 & .5 \cr.25 & .75 }, \qquad \mat {y} = \pmatrix{ 1 \cr 0 }\ . [[/math]]

Compute [math]\mat {P}\mat{y}[/math], [math]\mat {P}^2 \mat {y}[/math], and [math]\mat {P}^4 \mat {y}[/math] and show that the results are approaching a constant vector. What is this vector?
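
Below is a minimal NumPy sketch (assuming Python with NumPy is an acceptable calculating aid; the exercise itself asks for the computation by hand) that prints the three products so the convergence can be observed directly:

```python
import numpy as np

# The matrix P and column vector y from the exercise statement.
P = np.array([[0.50, 0.50],
              [0.25, 0.75]])
y = np.array([1.0, 0.0])

# Print P^n y for n = 1, 2, 4 and watch the entries approach a common limit.
for n in (1, 2, 4):
    print(n, np.linalg.matrix_power(P, n) @ y)
```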

By Bot
Jun 09'24

Let [math]\mat{P}[/math] be a regular [math]r \times r[/math] transition matrix and [math]\mat{y}[/math] any [math]r[/math]-component column vector. Show that the value of the limiting constant vector for [math]\mat{P}^n \mat{y}[/math] is [math]\mat{w}\mat{y}[/math], where [math]\mat{w}[/math] is the fixed probability row vector of [math]\mat{P}[/math].
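
A numerical sanity check of the claim (not a proof); the [math]3 \times 3[/math] matrix and the vector below are arbitrary illustrative choices:

```python
import numpy as np

P = np.array([[0.5, 0.25, 0.25],
              [0.2, 0.60, 0.20],
              [0.3, 0.30, 0.40]])   # an arbitrary regular transition matrix
y = np.array([2.0, -1.0, 5.0])      # an arbitrary column vector

# The fixed probability row vector w is a left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
w = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
w = w / w.sum()

print(np.linalg.matrix_power(P, 50) @ y)   # every entry is close to ...
print(w @ y)                               # ... the single number w y
```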

By Bot
Jun 09'24

Let

[[math]] \mat {P} = \pmatrix{ 1 & 0 & 0 \cr .25 & 0 & .75 \cr 0 & 0 & 1 } [[/math]]

be a transition matrix of a Markov chain. Find two fixed vectors of [math]\mat {P}[/math] that are linearly independent. Does this show that the Markov chain is not regular?
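
A short numerical sketch (assuming NumPy, and reading "fixed vectors" as fixed row vectors [math]\mat{w}[/math] with [math]\mat{w}\mat{P} = \mat{w}[/math]); it finds them as left eigenvectors of [math]\mat{P}[/math] for eigenvalue 1:

```python
import numpy as np

P = np.array([[1.00, 0.0, 0.00],
              [0.25, 0.0, 0.75],
              [0.00, 0.0, 1.00]])

# Fixed row vectors satisfy w P = w, i.e. they are eigenvectors of P transpose
# with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
fixed = vecs[:, np.isclose(vals, 1)].T
print(np.real_if_close(fixed))   # eigenvalue 1 has multiplicity 2 here
```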

By Bot
Jun 09'24

Describe the set of all fixed column vectors for the chain given in the preceding exercise.
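
One way to explore this numerically (a sketch assuming SciPy is available): fixed column vectors [math]\mat{x}[/math] satisfy [math]\mat{P}\mat{x} = \mat{x}[/math], so they form the null space of [math]\mat{P} - \mat{I}[/math]:

```python
import numpy as np
from scipy.linalg import null_space

P = np.array([[1.00, 0.0, 0.00],
              [0.25, 0.0, 0.75],
              [0.00, 0.0, 1.00]])

# Fixed column vectors x satisfy (P - I) x = 0; null_space returns an
# orthonormal basis, and every fixed column vector is a linear combination
# of these basis columns.
basis = null_space(P - np.eye(3))
print(basis)
```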

By Bot
Jun 09'24

The theorem that [math]\mat{P}^n \to \mat{W}[/math] was proved only for the case that [math]\mat{P}[/math] has no zero entries. Fill in the details of the following extension to the case that [math]\mat{P}[/math] is regular. Since [math]\mat{P}[/math] is regular, [math]\mat{P}^N[/math] has no zeros for some [math]N[/math]. Thus, the proof given shows that [math]M_{nN} - m_{nN}[/math] approaches 0 as [math]n[/math] tends to infinity. However, the difference [math]M_n - m_n[/math] can never increase. (Why?) Hence, if we know that the differences obtained by looking at every [math]N[/math]th time tend to 0, then the entire sequence must also tend to 0.
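
One possible way to see the "(Why?)" step, assuming [math]M_n[/math] and [math]m_n[/math] denote, as in the proof, the largest and smallest components of [math]\mat{P}^n \mat{y}[/math]: each component of [math]\mat{P}^{n+1} \mat{y}[/math] is a weighted average of the components of [math]\mat{P}^n \mat{y}[/math], so

[[math]] m_n \leq \left(\mat{P}^{n+1} \mat{y}\right)_j \leq M_n [[/math]]

for every [math]j[/math], and hence [math]M_{n+1} \leq M_n[/math] and [math]m_{n+1} \geq m_n[/math].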

By Bot
Jun 09'24

Let [math]\mat{P}[/math] be a regular transition matrix and let [math]\mat{w}[/math] be its unique fixed probability vector. Show that no entry of [math]\mat{w}[/math] is 0.
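
A possible starting point (a hint, not a complete argument): since [math]\mat{w}\mat{P} = \mat{w}[/math] implies [math]\mat{w}\mat{P}^N = \mat{w}[/math] for every [math]N[/math], each entry satisfies

[[math]] w_j = \sum_i w_i \left(\mat{P}^N\right)_{ij}\ , [[/math]]

and, because [math]\mat{P}[/math] is regular, [math]N[/math] can be chosen so that every entry of [math]\mat{P}^N[/math] is strictly positive.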

By Bot
Jun 09'24

Here is a trick to try on your friends. Shuffle a deck of cards and deal them out one at a time. Count each face card as ten. Ask your friend to look at one of the first ten cards; if this card is a six, she is to look at the card that turns up six cards later; if this card is a three, she is to look at the card that turns up three cards later, and so forth. Eventually she will reach a point where she is to look at a card that turns up [math]x[/math] cards later but there are not [math]x[/math] cards left. You then tell her the last card that she looked at, even though you did not know her starting point. You tell her you do this by watching her, since she cannot disguise the times that she looks at the cards. In fact, you just carry out the same procedure yourself and, even though you do not start at the same point as she does, you will most likely end at the same point. Why?

By Bot
Jun 09'24

Write a program to play the game described in the preceding exercise.
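
A minimal sketch of such a program (assuming Python; the helper name `kruskal_endpoint` is just an illustrative choice, and face cards are counted as ten as in the preceding exercise):

```python
import random

def kruskal_endpoint(deck, start):
    """Follow the counting procedure from index `start` and return the index
    of the last card reached before running off the end of the deck."""
    pos = start
    while True:
        step = min(deck[pos], 10)          # face cards (11, 12, 13) count as ten
        if pos + step >= len(deck):
            return pos
        pos += step

# A 52-card deck of ranks 1..13, four of each.
deck = [rank for rank in range(1, 14) for _ in range(4)]
random.shuffle(deck)

friend = kruskal_endpoint(deck, random.randrange(10))  # her secret starting card
you = kruskal_endpoint(deck, 0)                        # you start at the first card
print("Same final card:", friend == you)
```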

By Bot
Jun 09'24

(Suggested by Peter Doyle) In the proof of the theorem, we assumed the existence of a fixed vector [math]\mat{w}[/math]. To avoid this assumption, beef up the coupling argument to show (without assuming the existence of a stationary distribution [math]\mat{w}[/math]) that for appropriate constants [math]C[/math] and [math]r \lt 1[/math], the distance between [math]\alpha P^n[/math] and [math]\beta P^n[/math] is at most [math]C r^n[/math] for any starting distributions [math]\alpha[/math] and [math]\beta[/math]. Apply this in the case where [math]\beta = \alpha P[/math] to conclude that the sequence [math]\alpha P^n[/math] is a Cauchy sequence, and that its limit is a probability vector [math]w[/math] with [math]wP = w[/math]; taking [math]\alpha[/math] to be concentrated on each state in turn then shows that [math]P^n[/math] converges to a matrix [math]W[/math] whose rows are all equal to [math]w[/math]. Note that the distance between [math]\alpha P^n[/math] and [math]w[/math] is at most [math]C r^n[/math], so in freeing ourselves from the assumption about having a fixed vector we have also proved that the convergence to equilibrium takes place exponentially fast.
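
The geometric decay can be illustrated numerically (this is only an illustration, not the coupling argument the exercise asks for; the matrix and the two starting distributions below are arbitrary choices):

```python
import numpy as np

P = np.array([[0.50, 0.50],
              [0.25, 0.75]])        # a regular transition matrix
alpha = np.array([1.0, 0.0])        # two arbitrary starting distributions
beta = np.array([0.2, 0.8])

a, b = alpha.copy(), beta.copy()
for n in range(1, 11):
    a, b = a @ P, b @ P
    # distance between alpha P^n and beta P^n; it shrinks by a constant
    # factor each step, consistent with a bound of the form C r^n
    print(n, 0.5 * np.abs(a - b).sum())
```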