Assume that a man's profession can be classified as professional, skilled laborer, or unskilled laborer. Assume that, of the sons of professional men, 80 percent are professional, 10 percent are skilled laborers, and 10 percent are unskilled laborers. In the case of sons of skilled laborers, 60 percent are skilled laborers, 20 percent are professional, and 20 percent are unskilled. Finally, in the case of unskilled laborers, 50 percent of the sons are unskilled laborers, and 25 percent each are in the other two categories. Assume that every man has at least one son, and form a Markov chain by following the profession of a randomly chosen son of a given family through several generations. Set up the matrix of transition probabilities. Find the probability that a randomly chosen grandson of an unskilled laborer is a professional man.
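A short sketch of how this two-step computation might be carried out numerically, assuming NumPy and the (arbitrary) state ordering professional, skilled, unskilled chosen here:

```python
import numpy as np

# States: 0 = professional, 1 = skilled laborer, 2 = unskilled laborer.
P = np.array([
    [0.80, 0.10, 0.10],   # sons of professional men
    [0.20, 0.60, 0.20],   # sons of skilled laborers
    [0.25, 0.25, 0.50],   # sons of unskilled laborers
])

# A grandson is two steps of the chain, so the quantity asked for is the
# (unskilled, professional) entry of P squared.
P2 = P @ P
print(P2[2, 0])
```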
In the previous exercise, we assumed that every man has a son. Assume instead that the probability that a man has at least one son is .8. Form a Markov chain with four states. If a man has a son, the probability that this son is in a particular profession is the same as in the previous exercise. If there is no son, the process moves to state four, which represents families whose male line has died out. Find the matrix of transition probabilities and find the probability that a randomly chosen grandson of an unskilled laborer is a professional man.
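Under the same conventions as the sketch above, the four-state matrix can be built by scaling each row by .8 and sending the remaining .2 of probability to the new absorbing state:

```python
import numpy as np

# States: 0 = professional, 1 = skilled, 2 = unskilled, 3 = male line died out.
P3 = np.array([
    [0.80, 0.10, 0.10],
    [0.20, 0.60, 0.20],
    [0.25, 0.25, 0.50],
])
P4 = np.zeros((4, 4))
P4[:3, :3] = 0.8 * P3   # a son exists with probability .8
P4[:3, 3] = 0.2         # otherwise the line dies out
P4[3, 3] = 1.0          # "died out" is absorbing

# Probability of moving from unskilled to professional in two steps of this chain.
print((P4 @ P4)[2, 0])
```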
Write a program to compute [math]\mat{u}^{(n)}[/math] given [math]\mat{u}[/math] and [math]\mat{P}[/math]. Use this program to compute [math]\mat{u}^{(10)}[/math] for the Land of Oz example, with [math]\mat{u} = (0, 1, 0)[/math], and with [math]\mat{u} = (1/3, 1/3, 1/3)[/math].
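One possible sketch of such a program, computing [math]\mat{u}^{(n)} = \mat{u} \mat{P}^n[/math] by repeated multiplication with NumPy; the Land of Oz matrix is written out here from the example in the text with states ordered (rain, nice, snow), so verify it against the example before relying on the numbers:

```python
import numpy as np

def distribution_after_n_steps(u, P, n):
    """Return u P^n, the distribution after n steps from initial distribution u."""
    u = np.asarray(u, dtype=float)
    for _ in range(n):
        u = u @ P
    return u

# Land of Oz transition matrix, states ordered (Rain, Nice, Snow).
P = np.array([
    [0.50, 0.25, 0.25],
    [0.50, 0.00, 0.50],
    [0.25, 0.25, 0.50],
])

print(distribution_after_n_steps(np.array([0, 1, 0]), P, 10))
print(distribution_after_n_steps(np.array([1/3, 1/3, 1/3]), P, 10))
```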
Using the program MatrixPowers, find [math]\mat{P}^1[/math] through [math]\mat{P}^6[/math] for Example and Example. See if you can predict the long-range probability of finding the process in each of the states for these examples.
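The MatrixPowers program itself is not reproduced here; a minimal stand-in (not the book's code) that prints successive powers of a matrix might look like this:

```python
import numpy as np

def matrix_powers(P, k):
    """Print P^1 through P^k so the rows can be compared as the power grows."""
    P = np.asarray(P, dtype=float)
    power = np.eye(P.shape[0])
    for i in range(1, k + 1):
        power = power @ P
        print(f"P^{i} =\n{power}\n")
```

For a regular chain the rows of [math]\mat{P}^n[/math] approach a common vector as [math]n[/math] grows, which is the long-range behavior the exercise asks you to look for.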
Write a program to simulate the outcomes of a Markov chain after [math]n[/math] steps, given the initial starting state and the transition matrix [math]\mat{P}[/math] as data (see Example). Keep this program for use in later problems.
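A minimal simulation sketch (the names and structure are choices made here, not taken from the text), drawing each step from the current row of [math]\mat{P}[/math] with Python's random.choices:

```python
import random

def simulate_chain(P, start, n):
    """Simulate n steps of a Markov chain with transition matrix P (a list of rows),
    starting in state `start`; returns the list of visited states."""
    states = [start]
    current = start
    for _ in range(n):
        current = random.choices(range(len(P)), weights=P[current])[0]
        states.append(current)
    return states
```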
Modify the program of the preceding exercise so that it keeps track of the proportion of times in each state in [math]n[/math] steps. Run the modified program for different starting states for Example and Example. Does the initial state affect the proportion of time spent in each of the states if [math]n[/math] is large?
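Building on the simulation sketch above, the modification might simply tally how often each state is visited:

```python
from collections import Counter

def state_proportions(P, start, n):
    """Proportion of visits to each state over a run of n steps (including the
    starting state), reusing simulate_chain from the sketch above."""
    path = simulate_chain(P, start, n)
    counts = Counter(path)
    return {state: counts[state] / len(path) for state in counts}
```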
Consider the following process. We have two coins, one of which is fair, and the other of which has heads on both sides. We give these two coins to our friend, who chooses one of them at random (each with probability 1/2). During the rest of the process, she uses only the coin that she chose. She now proceeds to toss the coin many times, reporting the results. We consider this process to consist solely of what she reports to us.
- (a) Given that she reports a head on the [math]n[/math]th toss, what is the probability that a head is thrown on the [math](n+1)[/math]st toss?
- (b) Consider this process as having two states, heads and tails. By computing the other three transition probabilities analogous to the one in part (a), write down a “transition matrix” for this process.
- (c) Now assume that the process is in state “heads” on both the [math](n-1)[/math]st and the [math]n[/math]th toss. Find the probability that a head comes up on the [math](n+1)[/math]st toss.
- (d) Is this process a Markov chain?
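The conditional probabilities in parts (a) and (c) can be checked numerically; the sketch below conditions on which coin was chosen (fair or two-headed, each with probability 1/2) and uses the fact that, averaging over the two coins, any [math]k[/math] specified tosses all come up heads with probability [math](1/2)(1/2)^k + (1/2)[/math]:

```python
from fractions import Fraction

half = Fraction(1, 2)

def prob_all_heads(k):
    """Probability that k specified reported tosses are all heads,
    averaging over which coin (fair or two-headed) was chosen."""
    return half * half**k + half * 1

# Part (a): P(head on toss n+1 | head on toss n)
print(prob_all_heads(2) / prob_all_heads(1))
# Part (c): P(head on toss n+1 | heads on tosses n-1 and n)
print(prob_all_heads(3) / prob_all_heads(2))
```

Comparing the two printed values is exactly the point of part (d).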