(Alternate proof of Theorem) Let [math]\mat{P}[/math] be the transition matrix of an ergodic Markov chain. Let [math]\mat{x}[/math] be any column vector such that [math]\mat{P}\mat{x} = \mat{x}[/math]. Let [math]M[/math] be the maximum value of the components of [math]\mat{x}[/math]. Assume that [math]x_i = M[/math]. Show that if [math]p_{ij} \gt 0[/math] then [math]x_j = M[/math]. Use this to prove that [math]\mat{x}[/math] must be a constant vector.
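
As a quick sanity check (not part of the original exercise), the following Python sketch builds a small, illustrative 3-state ergodic chain, extracts a column vector [math]\mat{x}[/math] with [math]\mat{P}\mat{x} = \mat{x}[/math] as the right eigenvector for eigenvalue 1, and confirms numerically that it is constant. The specific matrix and the NumPy-based approach are assumptions chosen for illustration only.

```python
import numpy as np

# Illustrative 3-state ergodic (irreducible) transition matrix; this is an
# assumed example, not taken from the exercise itself.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# A column vector x with P x = x is a right eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P)
x = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])

# The exercise asserts that for an ergodic chain such an x must be constant,
# i.e. all components are equal up to the arbitrary eigenvector scaling.
print(x / x[0])   # expect approximately [1. 1. 1.]
```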

Let [math]\mat{P}[/math] be the transition matrix of an ergodic Markov chain. Let [math]\mat{w}[/math] be a fixed probability vector (i.e., [math]\mat{w}[/math] is a row vector with [math]\mat{w}\mat{P} = \mat{w}[/math]). Show that if [math]w_i = 0[/math] and [math]p_{ji} \gt 0[/math] then [math]w_j = 0[/math]. Use this to show that the fixed probability vector for an ergodic chain cannot have any 0 entries.
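
A minimal numerical sketch of the claim, again assuming NumPy and a small illustrative ergodic chain: the fixed probability vector [math]\mat{w}[/math] is a left eigenvector of [math]\mat{P}[/math] for eigenvalue 1, normalized to sum to 1, and every entry comes out strictly positive.

```python
import numpy as np

# Assumed 3-state ergodic transition matrix, for illustration only.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# w P = w means w is a left eigenvector of P for eigenvalue 1, i.e. a right
# eigenvector of the transpose of P.
eigvals, eigvecs = np.linalg.eig(P.T)
w = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
w = w / w.sum()          # normalize so the entries form a probability vector

print(w)                 # the fixed probability vector
print(np.all(w > 0))     # expect True: no 0 entries for an ergodic chain
```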

Find a Markov chain that is neither absorbing nor ergodic.
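
One possible construction (an assumed example, since the exercise only asks for existence): four states split into two closed pairs. No state is absorbing, and states in one pair cannot reach the other, so the chain is neither absorbing nor ergodic. The Python check below is a sketch under that assumption.

```python
import numpy as np

# Assumed example: two closed pairs {0, 1} and {2, 3}, with deterministic
# swaps inside each pair.
P = np.array([[0., 1., 0., 0.],
              [1., 0., 0., 0.],
              [0., 0., 0., 1.],
              [0., 0., 1., 0.]])

# Not absorbing: no state i has p_ii = 1.
print("absorbing states:", np.where(np.diag(P) == 1)[0])    # expect empty

# Not ergodic: not every state can reach every other state.  Summing the
# first few powers of P reveals which states are reachable from which.
reach = sum(np.linalg.matrix_power(P, k) for k in range(1, 5))
print("state 0 can reach state 2:", bool(reach[0, 2] > 0))  # expect False
```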