By Bot
Jun 09'24

Is the Markov chain in Example ergodic?

By Bot
Jun 09'24

Consider Example (Drunkard's Walk). Assume that if the walker reaches state 0, he turns around and returns to state 1 on the next step and, similarly, if he reaches 4 he returns on the next step to state 3. Is this new chain ergodic? Is it regular?
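A small numerical check makes the ergodic/regular distinction concrete. The sketch below (Python with NumPy, not part of the original exercise) builds the reflecting chain on states 0 through 4, assuming the usual Drunkard's Walk probability of 1/2 for each interior step, and searches for a power of [math]\mat{P}[/math] with no zero entries:

```python
import numpy as np

# Reflecting Drunkard's Walk on states 0,1,2,3,4 (interior steps 1/2 each,
# assumed from the original Drunkard's Walk example).
P = np.array([
    [0.0, 1.0, 0.0, 0.0, 0.0],   # state 0 reflects to 1
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0, 0.0],   # state 4 reflects to 3
])

# Regularity test: is some power of P strictly positive?
Pn = np.eye(5)
for n in range(1, 21):
    Pn = Pn @ P
    if (Pn > 0).all():
        print(f"P^{n} has no zero entries")
        break
else:
    print("no power up to 20 is strictly positive (the chain has period 2)")
```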

By Bot
Jun 09'24

For Example when [math]\mat{P}[/math] is ergodic, what is the proportion of people who are told that the President will run? Interpret the fact that this proportion is independent of the starting state.
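The Example's transition probabilities are not reproduced here, so the sketch below uses hypothetical reversal probabilities a and b for the two-state "will run" / "will not run" message chain. For any such ergodic [math]\mat{P}[/math] the fixed vector gives the long-run proportion of people told that the President will run, and every row of a high power of [math]\mat{P}[/math] approaches that same vector, which is one way to see why the proportion does not depend on the starting state:

```python
import numpy as np

# Hypothetical values; substitute the Example's actual reversal probabilities.
a, b = 0.3, 0.2          # P(yes -> no) = a,  P(no -> yes) = b

P = np.array([[1 - a, a],
              [b, 1 - b]])   # states: 0 = "will run", 1 = "will not run"

# Fixed vector w: solve w P = w with the entries of w summing to 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
w, *_ = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)

print(w)                                # approximately [b/(a+b), a/(a+b)]
print(np.linalg.matrix_power(P, 50))    # both rows are close to w
```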

By Bot
Jun 09'24

Consider an independent trials process to be a Markov chain whose states are the possible outcomes of the individual trials. What is its fixed probability vector? Is the chain always regular? Illustrate this for Example.
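As a quick illustration (the outcome distribution below is arbitrary, not taken from the Example), the transition matrix of an independent trials process has identical rows, which makes the fixed vector and the behavior of its powers easy to check directly:

```python
import numpy as np

# Outcome distribution of a single trial (illustrative values).
p = np.array([0.2, 0.5, 0.3])

# Every row of the transition matrix is that same distribution.
P = np.tile(p, (3, 1))

print(np.allclose(p @ P, p))                           # p is a fixed vector
print(np.allclose(np.linalg.matrix_power(P, 5), P))    # P^n = P for all n >= 1
```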

By Bot
Jun 09'24

Show that Example is an ergodic chain, but not a regular chain. Show that its fixed probability vector [math]\mat{w}[/math] is a binomial distribution.
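Assuming the Example in question is the Ehrenfest urn model (the standard example whose fixed vector is binomial), the following sketch builds the chain for an illustrative number of balls and compares its fixed vector with the Binomial(N, 1/2) distribution:

```python
import numpy as np
from math import comb

N = 4                      # number of balls; an arbitrary small value for illustration
P = np.zeros((N + 1, N + 1))
for i in range(N + 1):     # state i = number of balls in the first urn
    if i > 0:
        P[i, i - 1] = i / N
    if i < N:
        P[i, i + 1] = (N - i) / N

# Fixed vector: solve w P = w with the entries summing to 1.
A = np.vstack([P.T - np.eye(N + 1), np.ones(N + 1)])
b = np.append(np.zeros(N + 1), 1.0)
w, *_ = np.linalg.lstsq(A, b, rcond=None)

binomial = np.array([comb(N, k) / 2**N for k in range(N + 1)])
print(np.allclose(w, binomial))          # True: the fixed vector is Binomial(N, 1/2)
```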

By Bot
Jun 09'24

Show that Example is regular and find the limiting vector.

By Bot
Jun 09'24

Toss a fair die repeatedly. Let [math]S_n[/math] denote the total of the outcomes through the [math]n[/math]th toss. Show that there is a limiting value for the proportion of the first [math]n[/math] values of [math]S_n[/math] that are divisible by 7, and compute the value for this limit. Hint: The desired limit is an equilibrium probability vector for an appropriate seven state Markov chain.
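Following the hint, the state can be taken to be [math]S_n \bmod 7[/math]; each toss adds 1 through 6 with probability 1/6 each. A short computation (Python sketch, not part of the original exercise) suggests the limiting proportion by examining a high power of the transition matrix:

```python
import numpy as np

# Seven-state chain on residues of S_n modulo 7: a toss of k (k = 1..6,
# each with probability 1/6) moves residue i to (i + k) mod 7.
P = np.zeros((7, 7))
for i in range(7):
    for k in range(1, 7):
        P[i, (i + k) % 7] += 1 / 6

# Every row of a high power of P approximates the equilibrium vector.
print(np.linalg.matrix_power(P, 50)[0])   # all entries close to 1/7
```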

By Bot
Jun 09'24

Let [math]\mat{P}[/math] be the transition matrix of a regular Markov chain. Assume that there are [math]r[/math] states and let [math]N(r)[/math] be the smallest integer [math]n[/math] such that [math]\mat{P}[/math] is regular if and only if [math]\mat {P}^{N(r)}[/math] has no zero entries. Find a finite upper bound for [math]N(r)[/math]. See if you can determine [math]N(3)[/math] exactly.
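To experiment with [math]N(3)[/math], a helper along the following lines (an illustrative Python sketch with an arbitrarily chosen 3-state chain) reports the smallest power of a given transition matrix that has no zero entries:

```python
import numpy as np

def first_positive_power(P, max_n=200):
    """Smallest n such that P^n has all entries strictly positive, or None."""
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if (Pn > 0).all():
            return n
    return None

# An arbitrary regular 3-state chain chosen for illustration.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0]])
print(first_positive_power(P))   # prints 5 for this particular chain
```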

By Bot
Jun 09'24

Define [math]f(r)[/math] to be the smallest integer [math]n[/math] such that for all regular Markov chains with [math]r[/math] states, the [math]n[/math]th power of the transition matrix has all entries positive. It has been shown[Notes 1] that [math]f(r) = r^2 - 2r + 2[/math].

  • Define the transition matrix of an [math]r[/math]-state Markov chain as follows: For states [math]s_i[/math] with [math]i = 1, 2, \ldots, r - 2[/math], [math]\mat{P}(i, i + 1) = 1[/math], [math]\mat{P}(r - 1, r) = \mat{P}(r - 1, 1) = 1/2[/math], and [math]\mat{P}(r, 1) = 1[/math]. Show that this is a regular Markov chain.
  • For [math]r = 3[/math], verify that the fifth power is the first power that has no zeros.
  • Show that, for general [math]r[/math], the smallest [math]n[/math] such that [math]\mat{P}^n[/math] has all entries positive is [math]n = f(r)[/math] (a numerical sketch follows this list).
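Here is a numerical sketch (Python; written for this exercise rather than taken from the text) that builds the chain described in the first part for several values of [math]r[/math] and compares the first strictly positive power of [math]\mat{P}[/math] with [math]f(r) = r^2 - 2r + 2[/math]:

```python
import numpy as np

def construction(r):
    """The r-state chain above: i -> i+1 for i = 1..r-2, state r-1 splits
    evenly between states r and 1, and state r returns to state 1."""
    P = np.zeros((r, r))
    for i in range(r - 2):       # 0-indexed states 0..r-3, i.e. s_1..s_{r-2}
        P[i, i + 1] = 1.0
    P[r - 2, r - 1] = 0.5        # s_{r-1} -> s_r
    P[r - 2, 0] = 0.5            # s_{r-1} -> s_1
    P[r - 1, 0] = 1.0            # s_r -> s_1
    return P

def first_positive_power(P, max_n=500):
    """Smallest n such that P^n has all entries strictly positive."""
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if (Pn > 0).all():
            return n
    return None

for r in range(3, 8):
    observed = first_positive_power(construction(r))
    print(r, observed, r * r - 2 * r + 2)   # the last two columns should agree
```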

Notes

  1. E. Seneta, Non-Negative Matrices: An Introduction to Theory and Applications, Wiley, New York, 1973, pp. 52-54.
By Bot
Jun 09'24

A discrete time queueing system of capacity [math]n[/math] consists of the person being served and those waiting to be served. The queue length [math]x[/math] is observed each second. If [math]0 \lt x \lt n[/math], then with probability [math]p[/math], the queue size is increased by one by an arrival and, independently, with probability [math]r[/math], it is decreased by one because the person being served finishes service. If [math]x = 0[/math], only an arrival (with probability [math]p[/math]) is possible. If [math]x = n[/math], an arrival will depart without waiting for service, and so only the departure (with probability [math]r[/math]) of the person being served is possible. Form a Markov chain with states given by the number of customers in the queue. Modify the program FixedVector so that you can input [math]n[/math], [math]p[/math], and [math]r[/math], and the program will construct the transition matrix and compute the fixed vector. The quantity [math]s = p/r[/math] is called the traffic intensity. Describe the differences in the fixed vectors according as [math]s \lt 1[/math], [math]s = 1[/math], or [math]s \gt 1[/math].
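The book's FixedVector program is not reproduced here; the following Python sketch performs the analogous computation under the transition rules stated above (arrival and departure acting independently at the interior states) and prints the fixed vector for sample values of p and r giving traffic intensities below, at, and above 1:

```python
import numpy as np

def queue_chain(n, p, r):
    """Transition matrix for the capacity-n queue described in the exercise."""
    P = np.zeros((n + 1, n + 1))
    P[0, 1] = p                      # only an arrival is possible at x = 0
    P[0, 0] = 1 - p
    P[n, n - 1] = r                  # only a departure is possible at x = n
    P[n, n] = 1 - r
    for x in range(1, n):
        up = p * (1 - r)             # arrival without a departure
        down = r * (1 - p)           # departure without an arrival
        P[x, x + 1] = up
        P[x, x - 1] = down
        P[x, x] = 1 - up - down      # both events or neither
    return P

def fixed_vector(P):
    """Solve w P = w with the entries of w summing to 1."""
    m = len(P)
    A = np.vstack([P.T - np.eye(m), np.ones(m)])
    b = np.append(np.zeros(m), 1.0)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

for p, r in [(0.3, 0.6), (0.4, 0.4), (0.6, 0.3)]:     # s < 1, s = 1, s > 1
    print(p / r, np.round(fixed_vector(queue_chain(10, p, r)), 3))
```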