By Bot
Jun 09'24

The letters between Pascal and Fermat, which are often credited with having started probability theory, dealt mostly with the problem of points described in Exercise. Pascal and Fermat considered the problem of finding a fair division of stakes when a game that goes to the first player to win [math]N[/math] points must be called off after the first player has won [math]r[/math] games and the second player has won [math]s[/math] games, with [math]r \lt N[/math] and [math]s \lt N[/math]. Let [math]P(r,s)[/math] be the probability that player A wins the game if he has already won [math]r[/math] points and player B has won [math]s[/math] points. Then

  1. [math]P(r,N) = 0[/math] if [math]r \lt N[/math],
  2. [math]P(N,s) = 1[/math] if [math]s \lt N[/math],
  3. [math]P(r,s) = pP(r + 1,s) + qP(r,s + 1)[/math] if [math]r \lt N[/math] and [math]s \lt N[/math];

and (1), (2), and (3) determine [math]P(r,s)[/math] for [math]r \leq N[/math] and [math]s \leq N[/math]. Pascal used these facts to find [math]P(r,s)[/math] by working backward: he first obtained [math]P(N - 1,j)[/math] for [math]j = N - 1, N - 2, \ldots, 0[/math]; then, from these values, he obtained [math]P(N - 2,j)[/math] for [math]j = N - 1, N - 2, \ldots, 0[/math] and, continuing backward, obtained all the values [math]P(r,s)[/math]. Write a program to compute [math]P(r,s)[/math] for given [math]N[/math], [math]r[/math], [math]s[/math], and [math]p[/math].

Follow Pascal's backward method and you will be able to run [math]N = 100[/math]; use naive recursion without memoization and you will not be able to run [math]N = 20[/math].
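Pascal's backward scheme described above can be sketched in a few lines of Python; this is a minimal illustration (the function and variable names are my own), not a prescribed solution:

```python
def points_table(N, p):
    """Tabulate P(r, s), the probability that player A wins the stakes
    given A has r points and B has s points, for 0 <= r, s <= N.
    A wins each game independently with probability p."""
    q = 1 - p
    P = [[0.0] * (N + 1) for _ in range(N + 1)]
    for k in range(N):
        P[k][N] = 0.0   # B already has N points: A has lost
        P[N][k] = 1.0   # A already has N points: A has won
    # Work backward from the states nearest the end of play, as Pascal did.
    for r in range(N - 1, -1, -1):
        for s in range(N - 1, -1, -1):
            P[r][s] = p * P[r + 1][s] + q * P[r][s + 1]
    return P

table = points_table(100, 0.5)   # N = 100 is no trouble for this method
```

Filling the table costs only [math]O(N^2)[/math] operations, while naive recursion without memoization revisits the same states exponentially often, which is the point of the warning above.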

Fermat solved the problem of points (see Exercise) as follows: He realized that the problem was difficult because the possible ways the play might go are not equally likely. For example, when the first player needs two more games and the second needs three to win, two possible ways the series might go for the first player are WLW and LWLW. These sequences are not equally likely. To avoid this difficulty, Fermat extended the play, adding fictitious plays so that the series went the maximum number of games needed (four in this case). He obtained equally likely outcomes and used, in effect, the Pascal triangle to calculate [math]P(r,s)[/math]. Show that this leads to a formula for [math]P(r,s)[/math] even for the case [math]p \ne 1/2[/math].
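One way the reader might organize the answer: in the extended play there are [math]M = (N - r) + (N - s) - 1[/math] games, and A wins the stakes exactly when he wins at least [math]N - r[/math] of them, giving a binomial sum valid for any [math]p[/math]. A small Python sketch of this formula (the names are illustrative):

```python
from math import comb

def fermat_points(N, r, s, p):
    """P(r, s) computed by Fermat's device: extend play to the maximum
    M = (N - r) + (N - s) - 1 remaining games; A wins the stakes exactly
    when he wins at least N - r of these M games."""
    M = (N - r) + (N - s) - 1
    q = 1 - p
    return sum(comb(M, k) * p**k * q**(M - k) for k in range(N - r, M + 1))
```

For the example in the text ([math]N - r = 2[/math] and [math]N - s = 3[/math]) the extension is to [math]M = 4[/math] games, matching the "four in this case" above.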

The Yankees are playing the Dodgers in the World Series. The Yankees win each game with probability .6. What is the probability that the Yankees win the series? (The series is won by the first team to win four games.)
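By Fermat's extension the series can be imagined to run the full seven games, the Yankees winning the series exactly when they take at least four of them; a quick Python check of the arithmetic (a sketch, not part of the exercise):

```python
from math import comb

p, q, N = 0.6, 0.4, 4
games = 2 * N - 1   # extend the series to the full 7 games
prob = sum(comb(games, k) * p**k * q**(games - k)
           for k in range(N, games + 1))
print(round(prob, 6))   # 0.710208
```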

C. L. Anderson[Notes 1] has used Fermat's argument for the problem of points to prove the following result due to J. G. Kingston. You are playing the game of points (see Exercise) but, at each point, when you serve you win with probability [math]p[/math], and when your opponent serves you win with probability [math]\bar{p}[/math]. You will serve first, but you can choose one of the following two conventions for serving: for the first convention you alternate service (tennis), and for the second the person serving continues to serve until he loses a point and then the other player serves (racquetball). The first player to win [math]N[/math] points wins the game. The problem is to show that the probability of winning the game is the same under either convention.

  • Show that, under either convention, you will serve at most [math]N[/math] points and your opponent at most [math]N - 1[/math] points.
  • Extend the number of points to [math]2N - 1[/math] so that you serve [math]N[/math] points and your opponent serves [math]N - 1[/math]. For example, you serve any additional points necessary to make [math]N[/math] serves and then your opponent serves any additional points necessary to make him serve [math]N - 1[/math] points. The winner is now the person, in the extended game, who wins the most points. Show that playing these additional points has not changed the winner.
  • Show that (a) and (b) prove that you have the same probability of winning the game under either convention.

Notes

  1. C. L. Anderson, “Note on the Advantage of First Serve,” Journal of Combinatorial Theory, Series A, vol. 23 (1977), p. 363.
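Both conventions can also be checked numerically by conditioning on the next point, one recursion per convention; the sketch below (the names are mine, not Anderson's) agrees with the stated result:

```python
from functools import lru_cache

def game_prob(N, p, p_bar, alternate):
    """P(you win a game to N points, serving first).

    alternate=True : service alternates each point (tennis).
    alternate=False: the server serves until losing a point (racquetball);
                     equivalently, the winner of each point serves next.
    p / p_bar: your chance of winning a point on your / the opponent's serve.
    """
    @lru_cache(maxsize=None)
    def f(a, b, you_serve):
        if a == N:
            return 1.0
        if b == N:
            return 0.0
        w = p if you_serve else p_bar
        if alternate:
            nxt_win = nxt_lose = not you_serve
        else:
            nxt_win, nxt_lose = True, False   # the point winner serves next
        return w * f(a + 1, b, nxt_win) + (1 - w) * f(a, b + 1, nxt_lose)

    return f(0, 0, True)
```

Running both conventions with the same [math]N[/math], [math]p[/math], and [math]\bar{p}[/math] gives identical probabilities, as the exercise asserts.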

In the previous problem, assume that [math]p = 1 - \bar{p}[/math].

  • Show that under either service convention, the first player will win more often than the second player if and only if [math]p \gt .5[/math].
  • In volleyball, a team can only win a point while it is serving. Thus, any individual “play” either ends with a point being awarded to the serving team or with the service changing to the other team. The first team to win [math]N[/math] points wins the game. (We ignore here the additional restriction that the winning team must be ahead by at least two points at the end of the game.) Assume that each team has the same probability of winning the play when it is serving, i.e., that [math]p = 1 - \bar{p}[/math]. Show that in this case, the team that serves first will win more than half the time, as long as [math]p \gt 0[/math]. (If [math]p = 0[/math], then the game never ends.) Hint: Define [math]p'[/math] to be the probability that a team wins the next point, given that it is serving. If we write [math]q = 1 - p[/math], then one can show that
    [[math]] p' = \frac p{1-q^2}\ . [[/math]]
    If one now considers this game in a slightly different way, one can see that the second service convention in the preceding problem can be used, with [math]p[/math] replaced by [math]p'[/math].
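The hint's identity is the sum of a geometric series: the serving team wins the next point by scoring on its own serve after [math]k[/math] exchanges in which both sides lose their serve once, [math]k = 0, 1, 2, \ldots[/math]. A quick numeric check in Python (the value of [math]p[/math] is chosen arbitrarily):

```python
p = 0.3
q = 1 - p
# score on your own serve after k exchanges in which you lose your serve
# and the opponent then loses it back
series = sum(p * q**(2 * k) for k in range(200))
closed_form = p / (1 - q**2)
print(abs(series - closed_form) < 1e-12)   # True
```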

A poker hand consists of 5 cards dealt from a deck of 52 cards. Let [math]X[/math] and [math]Y[/math] be, respectively, the number of aces and kings in a poker hand. Find the joint distribution of [math]X[/math] and [math]Y[/math].
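Since the hand is drawn without replacement, one standard way to organize the count is the two-variable analogue of the hypergeometric distribution: [math]P(X = i, Y = j) = \binom{4}{i}\binom{4}{j}\binom{44}{5-i-j}/\binom{52}{5}[/math] for [math]i + j \leq 5[/math]. A short exact tabulation in Python, offered as a sketch for checking one's answer:

```python
from fractions import Fraction
from math import comb

def joint(i, j):
    """P(X = i aces, Y = j kings) in a 5-card hand, as an exact fraction."""
    if i + j > 5:
        return Fraction(0)
    return Fraction(comb(4, i) * comb(4, j) * comb(44, 5 - i - j),
                    comb(52, 5))

total = sum(joint(i, j) for i in range(5) for j in range(5))
print(total)   # 1
```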

Let [math]X_1[/math] and [math]X_2[/math] be independent random variables and let [math]Y_1 = \phi_1(X_1)[/math] and [math]Y_2 = \phi_2(X_2)[/math].

  • Show that
    [[math]] P(Y_1 = r, Y_2 = s) = \sum_{\phi_1(a) = r \atop \phi_2(b) = s} P(X_1 = a, X_2 = b)\ . [[/math]]
  • Using (a), show that [math]P(Y_1 = r, Y_2 = s) = P(Y_1 = r)P(Y_2 = s)[/math] so that [math]Y_1[/math] and [math]Y_2[/math] are independent.
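The factorization in (b) can be seen concretely with a small example (two independent fair dice, with [math]\phi_1[/math] the parity and [math]\phi_2[/math] an indicator; all choices here are illustrative):

```python
from itertools import product

def phi1(a):
    return a % 2          # parity of the first die

def phi2(b):
    return int(b > 4)     # indicator that the second die shows 5 or 6

# Joint distribution of (Y1, Y2) built from the product distribution
# of (X1, X2), exactly as in the sum of part (a).
joint = {}
for a, b in product(range(1, 7), repeat=2):
    key = (phi1(a), phi2(b))
    joint[key] = joint.get(key, 0.0) + (1 / 6) * (1 / 6)

marg1 = {r: sum(v for (rr, _), v in joint.items() if rr == r) for r in (0, 1)}
marg2 = {s: sum(v for (_, ss), v in joint.items() if ss == s) for s in (0, 1)}
ok = all(abs(joint[r, s] - marg1[r] * marg2[s]) < 1e-12
         for r in (0, 1) for s in (0, 1))
print(ok)   # True
```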

Let [math]\Omega[/math] be the sample space of an experiment. Let [math]E[/math] be an event with [math]P(E) \gt 0[/math] and define [math]m_E(\omega)[/math] by [math]m_E(\omega) = m(\omega|E)[/math]. Prove that [math]m_E(\omega)[/math] is a distribution function on [math]E[/math], that is, that [math]m_E(\omega) \geq 0[/math] and that [math]\sum_{\omega\in E} m_E(\omega) = 1[/math]. The function [math]m_E[/math] is called the conditional distribution given [math]E[/math].
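A small numerical instance may help fix the definition (a fair die with [math]E[/math] the even outcomes; the choice of example is mine):

```python
# m: uniform distribution on a die; E = {2, 4, 6}
m = {w: 1 / 6 for w in range(1, 7)}
E = {2, 4, 6}
P_E = sum(m[w] for w in E)
m_E = {w: (m[w] / P_E if w in E else 0.0) for w in m}   # m(w | E)
print(round(sum(m_E.values()), 12))   # 1.0
```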

You are given two urns each containing two biased coins. The coins in urn I come up heads with probability [math]p_1[/math], and the coins in urn II come up heads with probability [math]p_2 \ne p_1[/math]. You are given a choice of (a) choosing an urn at random and tossing the two coins in this urn or (b) choosing one coin from each urn and tossing these two coins. You win a prize if both coins turn up heads. Show that you are better off selecting choice (a).
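One way to organize the comparison (a sketch of the algebra, not the only route): under (a) the chance of two heads is [math]\frac{1}{2}p_1^2 + \frac{1}{2}p_2^2[/math], under (b) it is [math]p_1 p_2[/math], and

[[math]] \frac{p_1^2 + p_2^2}{2} - p_1 p_2 = \frac{(p_1 - p_2)^2}{2} \gt 0 [[/math]]

since [math]p_1 \ne p_2[/math].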

Prove that, if [math]A_1, A_2, \ldots,A_n[/math] are independent events defined on a sample space [math]\Omega[/math] and if [math]0 \lt P(A_j) \lt 1[/math] for all [math]j[/math], then [math]\Omega[/math] must have at least [math]2^n[/math] points.