By Bot
Jun 09'24
[math] \newcommand{\NA}{{\rm NA}} \newcommand{\mat}[1]{{\bf#1}} \newcommand{\exref}[1]{\ref{##1}} \newcommand{\secstoprocess}{\all} \newcommand{\NA}{{\rm NA}} \newcommand{\mathds}{\mathbb}[/math]

We have two coins: one is a fair coin and the other is a coin that produces heads with probability 3/4. One of the two coins is picked at random, and this coin is tossed [math]n[/math] times. Let [math]S_n[/math] be the number of heads that turns up in these [math]n[/math] tosses. Does the Law of Large Numbers allow us to predict the proportion of heads that will turn up in the long run? After we have observed a large number of tosses, can we tell which coin was chosen? How many tosses suffice to make us 95 percent sure?
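Whether a long run of tosses reveals the chosen coin can be explored by simulation. A minimal sketch (the head-probabilities 1/2 and 3/4 come from the exercise; the function name and seeds are illustrative):

```python
import random

def simulate(p, n, seed=0):
    """Toss a coin with head-probability p a total of n times;
    return the observed proportion of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < p for _ in range(n))
    return heads / n

# Pick one of the two coins at random, then toss it many times.
rng = random.Random(42)
p = rng.choice([0.5, 0.75])
proportion = simulate(p, 100_000, seed=42)
# By the Law of Large Numbers the proportion settles near p,
# so a long enough run of tosses identifies the coin.
print(p, proportion)
```

With [math]n = 100{,}000[/math] tosses the observed proportion lands well within 0.01 of the true [math]p[/math], comfortably separating 1/2 from 3/4.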

By Bot
Jun 09'24

(Chebyshev[Notes 1]) Assume that [math]X_1, X_2, \ldots, X_n[/math] are independent random variables with possibly different distributions, and let [math]S_n[/math] be their sum. Let [math]m_k = E(X_k)[/math], [math]\sigma_k^2 = V(X_k)[/math], and [math]M_n = m_1 + m_2 + \cdots + m_n[/math]. Assume that [math]\sigma_k^2 \lt R[/math] for all [math]k[/math]. Prove that, for any [math]\epsilon \gt 0[/math],

[[math]] P\left( \left| \frac {S_n}n - \frac {M_n}n \right| \lt \epsilon \right) \to 1 [[/math]]

as [math]n \rightarrow \infty[/math].

Notes

  1. P. L. Chebyshev, “On Mean Values,” J. Math. Pures Appl., vol. 12 (1867), pp. 177–184.
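For orientation, here is a sketch of the standard argument (Chebyshev's inequality applied to [math]S_n/n[/math]); filling in the details is the exercise:

```latex
% By independence the variances add:
%   V(S_n) = \sigma_1^2 + \cdots + \sigma_n^2 \le nR ,
% so V(S_n/n) = V(S_n)/n^2 \le R/n.
% Chebyshev's inequality then gives, for any \epsilon > 0,
P\left( \left| \frac{S_n}{n} - \frac{M_n}{n} \right| \ge \epsilon \right)
   \le \frac{V(S_n/n)}{\epsilon^2}
   \le \frac{R}{n\epsilon^2}
   \longrightarrow 0 \quad \text{as } n \to \infty .
```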
By Bot
Jun 09'24

A fair coin is tossed repeatedly. Before each toss, you are allowed to decide whether to bet on the outcome. Can you describe a betting system with infinitely many bets which will enable you, in the long run, to win more than half of your bets? (Note that we are disallowing a betting system that says to bet until you are ahead, then quit.) Write a computer program that implements this betting system. As stated above, your program must decide whether to bet on a particular outcome before that outcome is determined. For example, you might select only outcomes that come after there have been three tails in a row. See if you can get more than 50% heads by your “system.”
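A minimal sketch of one such program, using the selection rule suggested above (bet on heads only after three tails in a row); the names and parameters are illustrative:

```python
import random

def run_system(n_tosses, seed=0):
    """Toss a fair coin n_tosses times, betting on heads only after
    seeing three tails in a row. The decision to bet is made BEFORE
    each toss. Returns (bets_made, bets_won)."""
    rng = random.Random(seed)
    tails_run = 0
    bets = wins = 0
    for _ in range(n_tosses):
        bet = tails_run >= 3          # decide before the toss
        heads = rng.random() < 0.5
        if bet:
            bets += 1
            wins += heads
        tails_run = 0 if heads else tails_run + 1
    return bets, wins

bets, wins = run_system(1_000_000, seed=7)
print(wins / bets)  # hovers near 0.5: the system gains nothing
```

Because the tosses are independent, the conditional probability of heads after any pattern of past outcomes is still 1/2, and the simulation bears this out.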

By Bot
Jun 09'24

Prove the following analogue of Chebyshev's Inequality:

[[math]] P(|X - E(X)| \geq \epsilon) \leq \frac 1\epsilon E(|X - E(X)|)\ . [[/math]]
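One route, sketched here: the claim is Markov's inequality applied to the nonnegative random variable [math]Y = |X - E(X)|[/math].

```latex
% For a nonnegative random variable Y and any \epsilon > 0,
% Markov's inequality states
%   P(Y \ge \epsilon) \le E(Y)/\epsilon .
% Taking Y = |X - E(X)| gives the claimed analogue:
P\bigl(|X - E(X)| \ge \epsilon\bigr)
   \le \frac{1}{\epsilon}\, E\bigl(|X - E(X)|\bigr)\ .
```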

By Bot
Jun 09'24

We have proved a theorem often called the “Weak Law of Large Numbers.” Most people's intuition and our computer simulations suggest that, if we toss a coin a sequence of times, the proportion of heads will really approach 1/2; that is, if [math]S_n[/math] is the number of heads in [math]n[/math] tosses, then we will have

[[math]] A_n = \frac {S_n}n \to \frac 12 [[/math]]

as [math]n \to \infty[/math]. Of course, we cannot be sure of this since we are not able to toss the coin an infinite number of times, and, if we could, the coin could come up heads every time. However, the “Strong Law of Large Numbers,” proved in more advanced courses, states that

[[math]] P\left( \frac {S_n}n \to \frac 12 \right) = 1\ . [[/math]]

Describe a sample space [math]\Omega[/math] that would make it possible for us to talk about the event

[[math]] E = \left\{\, \omega : \frac {S_n}n \to \frac 12\, \right\}\ . [[/math]]

Could we assign the equiprobable measure to this space? (See Example.)
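One natural construction, sketched here (the details are the exercise): take [math]\Omega[/math] to be the set of all infinite sequences of tosses.

```latex
% A candidate sample space: infinite sequences of heads and tails,
\Omega = \bigl\{\, \omega = (\omega_1, \omega_2, \ldots) :
                   \omega_k \in \{\mathrm{H}, \mathrm{T}\} \,\bigr\},
% with S_n(\omega) counting the heads among \omega_1, \ldots, \omega_n.
% A diagonal argument shows \Omega is uncountable, which bears on
% whether an equiprobable measure could exist on it.
```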

By Bot
Jun 09'24

In this exercise, we shall construct an example of a sequence of random variables that satisfies the Weak Law of Large Numbers, but not the Strong Law. The distribution of [math]X_i[/math] will have to depend on [math]i[/math], because otherwise both laws would be satisfied. (This problem was communicated to us by David Maslen.)

Suppose we have an infinite sequence of mutually independent events [math]A_1, A_2, \ldots[/math]. Let [math]a_i = P(A_i)[/math], and let [math]r[/math] be a positive integer.

  • Find an expression of the probability that none of the [math]A_i[/math] with [math]i \gt r[/math] occur.
  • Use the fact that [math]1 - x \leq e^{-x}[/math] to show that
    [[math]] P(\mbox{no\ } A_i \mbox{\ with\ } i \gt r \mbox{\ occurs}) \leq e^{-\sum_{i=r+1}^{\infty} a_i}\ . [[/math]]
  • (The second Borel–Cantelli lemma) Prove that if [math]\sum_{i=1}^{\infty} a_i[/math] diverges, then
    [[math]] P(\mbox{infinitely\ many\ $A_i$\ occur}) = 1. [[/math]]
    Now, let [math]X_i[/math] be a sequence of mutually independent random variables such that for each positive integer [math]i \geq 2[/math],
    [[math]] P(X_i = i) = \frac{1}{2i\log i}, \quad P(X_i = -i) = \frac{1}{2i\log i}, \quad P(X_i =0) = 1 - \frac{1}{i \log i}. [[/math]]
    When [math]i=1[/math] we let [math]X_i=0[/math] with probability [math]1[/math]. As usual we let [math]S_n = X_1 + \cdots + X_n[/math]. Note that the mean of each [math]X_i[/math] is [math]0[/math].
  • Find the variance of [math]S_n[/math].
  • Show that the sequence [math]\langle X_i \rangle[/math] satisfies the Weak Law of Large Numbers, i.e. prove that for any [math]\epsilon \gt 0[/math]
    [[math]] P\biggl(\biggl|{\frac{S_n}{n}}\biggr| \geq \epsilon\biggr) \rightarrow 0\ , [[/math]]
    as [math]n[/math] tends to infinity.
    We now show that [math]\{ X_i \}[/math] does not satisfy the Strong Law of Large Numbers. Suppose that [math]S_n / n \rightarrow 0[/math]. Then because
    [[math]] \frac{X_n}{n} = \frac{S_n}{n} - \frac{n-1}{n} \frac{S_{n-1}}{n-1}\ , [[/math]]
    we know that [math]X_n / n \rightarrow 0[/math]. From the definition of limits, we conclude that the inequality [math]|X_i| \geq \frac{1}{2} i[/math] can only be true for finitely many [math]i[/math].
  • Let [math]A_i[/math] be the event [math]|X_i| \geq \frac{1}{2} i[/math]. Find [math]P(A_i)[/math]. Show that [math]\sum_{i=1}^{\infty} P(A_i)[/math] diverges (use the Integral Test).
  • Prove that [math]A_i[/math] occurs for infinitely many [math]i[/math].
  • Prove that
    [[math]] P\biggl(\frac{S_n}{n} \rightarrow 0\biggr) = 0, [[/math]]
    and hence that the Strong Law of Large Numbers fails for the sequence [math]\{ X_i \}[/math].
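The quantities appearing above can be checked numerically; a sketch using exact sums rather than simulation (the helper names are illustrative):

```python
import math

def var_Sn(n):
    """V(S_n) = sum_{i=2}^n i/log(i), since V(X_i) = i^2 / (i log i)
    and X_1 = 0."""
    return sum(i / math.log(i) for i in range(2, n + 1))

def partial_sum_PAi(n):
    """sum_{i=2}^n P(A_i), where P(A_i) = P(X_i = ±i) = 1/(i log i)."""
    return sum(1.0 / (i * math.log(i)) for i in range(2, n + 1))

# V(S_n)/n^2 -> 0, so Chebyshev gives the Weak Law, but the decay is
# only about 1/(2 log n):
for n in (10**2, 10**4, 10**6):
    print(n, var_Sn(n) / n**2)

# sum P(A_i) grows like log log n, i.e. diverges (the Integral Test),
# which feeds the Borel-Cantelli step:
for n in (10**2, 10**4, 10**6):
    print(n, partial_sum_PAi(n))
```

The very slow growth of [math]\log\log n[/math] is why the divergence, and hence the failure of the Strong Law, is invisible in short simulations.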
By Bot
Jun 09'24

Let us toss a biased coin that comes up heads with probability [math]p[/math] and assume the validity of the Strong Law of Large Numbers as described in a previous exercise. Then, with probability 1,

[[math]] \frac {S_n}n \to p [[/math]]

as [math]n \to \infty[/math]. If [math]f(x)[/math] is a continuous function on the unit interval, then we also have

[[math]] f\left( \frac {S_n}n \right) \to f(p)\ . [[/math]]

Finally, we could hope that

[[math]] E\left(f\left( \frac {S_n}n \right)\right) \to E(f(p)) = f(p)\ . [[/math]]

Show that, if all this is correct, as in fact it is, we would have proven that any continuous function on the unit interval is a limit of polynomial functions. This is a sketch of a probabilistic proof of an important theorem in mathematics called the Weierstrass approximation theorem.
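To see the connection concretely: for binomial [math]S_n[/math], the expectation [math]E(f(S_n/n)) = \sum_{k=0}^n f(k/n) \binom{n}{k} p^k (1-p)^{n-k}[/math] is a polynomial in [math]p[/math], the degree-[math]n[/math] Bernstein polynomial of [math]f[/math]. A sketch of the resulting approximation (the function names are illustrative):

```python
from math import comb

def bernstein(f, n, p):
    """E(f(S_n/n)) for S_n ~ Binomial(n, p): the degree-n Bernstein
    polynomial of f, evaluated at p."""
    return sum(f(k / n) * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

# Try a continuous but non-smooth function on [0, 1]:
f = lambda x: abs(x - 0.5)
for n in (10, 100, 500):
    err = max(abs(bernstein(f, n, t / 100) - f(t / 100))
              for t in range(101))
    print(n, err)  # the maximum error shrinks as n grows
```

The shrinking maximum error over a grid of [math]p[/math]-values illustrates the uniform convergence that the Weierstrass theorem asserts.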