By Bot
Jun 09'24

A number is chosen at random from the integers [math]1, 2, 3,\ldots, n[/math]. Let [math]X[/math] be the number chosen. Show that [math]E(X) = (n + 1)/2[/math] and [math]V(X) = (n - 1)(n + 1)/12[/math]. Hint: The following identity may be useful:

[[math]] 1^2 + 2^2 + \cdots + n^2 = \frac{n(n+1)(2n+1)}{6}\ . [[/math]]
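As a sanity check (not a proof), the claimed closed forms can be compared against a brute-force computation in exact rational arithmetic; the Python sketch below is our own illustration, not part of the exercise:

```python
from fractions import Fraction

def uniform_mean_var(n):
    # Exact mean and variance of X chosen uniformly from 1, 2, ..., n.
    p = Fraction(1, n)
    mean = sum(k * p for k in range(1, n + 1))
    var = sum((k - mean) ** 2 * p for k in range(1, n + 1))
    return mean, var

for n in (2, 5, 10, 100):
    mean, var = uniform_mean_var(n)
    assert mean == Fraction(n + 1, 2)
    assert var == Fraction((n - 1) * (n + 1), 12)
```

Using `Fraction` avoids floating-point round-off, so the comparison with the closed forms is exact.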

By Bot
Jun 09'24

Let [math]X[/math] be a random variable with [math]\mu = E(X)[/math] and [math]\sigma^2 = V(X)[/math]. Define [math]X^* = (X - \mu)/\sigma[/math]. The random variable [math]X^*[/math] is called the standardized random variable associated with [math]X[/math]. Show that this standardized random variable has expected value 0 and variance 1.

By Bot
Jun 09'24

Peter and Paul play Heads or Tails (see Example). Let [math]W_n[/math] be Peter's winnings after [math]n[/math] matches. Show that [math]E(W_n) = 0[/math] and [math]V(W_n) = n[/math].
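One quick numerical check (again, not a proof) assumes the usual payoff of [math]\pm 1[/math] per match, so that [math]W_n = 2H - n[/math] where [math]H[/math] counts Peter's wins and is binomially distributed. A Python sketch with names of our own choosing:

```python
from math import comb

def wn_moments(n):
    # Exact E(W_n) and V(W_n), assuming W_n = 2H - n with H ~ Binomial(n, 1/2):
    # Peter gains 1 franc per win and loses 1 per loss over n fair matches.
    dist = {2 * h - n: comb(n, h) / 2 ** n for h in range(n + 1)}
    mean = sum(w * p for w, p in dist.items())
    var = sum((w - mean) ** 2 * p for w, p in dist.items())
    return mean, var
```

Evaluating `wn_moments(n)` for several values of `n` shows the mean staying at 0 while the variance tracks `n`.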

By Bot
Jun 09'24

Find the expected value and the variance for the number of boys and the number of girls in a royal family that has children until there is a boy or until there are three children, whichever comes first.
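One way to organize the computation — the outcome list and the equal-probability-of-each-sex assumption are ours — is to enumerate the four possible families B, GB, GGB, GGG:

```python
from fractions import Fraction

# Stopping at the first boy or at three children gives outcomes B, GB, GGB, GGG,
# assuming each child is a boy or a girl with probability 1/2 independently.
outcomes = [
    (Fraction(1, 2), 1, 0),  # B
    (Fraction(1, 4), 1, 1),  # GB
    (Fraction(1, 8), 1, 2),  # GGB
    (Fraction(1, 8), 0, 3),  # GGG
]

def mean_var(idx):
    # idx = 1 for the number of boys, idx = 2 for the number of girls.
    mean = sum(row[0] * row[idx] for row in outcomes)
    var = sum(row[0] * (row[idx] - mean) ** 2 for row in outcomes)
    return mean, var

print("boys:", mean_var(1))
print("girls:", mean_var(2))
```

The enumeration mirrors the hand computation; compare the printed values with your own tree diagram.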

By Bot
Jun 09'24

Suppose that [math]n[/math] people have their hats returned at random. Let [math]X_i = 1[/math] if the [math]i[/math]th person gets his or her own hat back and 0 otherwise. Let [math]S_n = \sum_{i = 1}^n X_i[/math]. Then [math]S_n[/math] is the total number of people who get their own hats back. Show that

  • (a) [math]E(X_i^2) = 1/n[/math].
  • (b) [math]E(X_i \cdot X_j) = \frac{1}{n(n - 1)}[/math] for [math]i \ne j[/math].
  • (c) [math]E(S_n^2) = 2[/math] (using (a) and (b)).
  • (d) [math]V(S_n) = 1[/math].
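For small [math]n[/math] these claims can be verified by brute force over all [math]n![/math] hat assignments; the helper below is our own sketch, feasible only for small [math]n[/math]:

```python
from itertools import permutations

def hat_moments(n):
    # Exact E(S_n^2) and V(S_n) by enumerating every permutation of n hats.
    counts = [sum(i == h for i, h in enumerate(perm))
              for perm in permutations(range(n))]
    m = len(counts)
    e_s = sum(counts) / m          # E(S_n)
    e_s2 = sum(c * c for c in counts) / m  # E(S_n^2)
    return e_s2, e_s2 - e_s ** 2   # (E(S_n^2), V(S_n))
```

For every [math]n \ge 2[/math] that the enumeration can handle, the returned pair is [math](2, 1)[/math], as parts (c) and (d) assert.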
By Bot
Jun 09'24

Let [math]S_n[/math] be the number of successes in [math]n[/math] independent trials. Use the program BinomialProbabilities (Combinations) to compute, for given [math]n[/math], [math]p[/math], and [math]j[/math], the probability

[[math]] P(-j\sqrt{npq} \lt S_n - np \lt j\sqrt{npq})\ . [[/math]]

  • (a) Let [math]p = .5[/math], and compute this probability for [math]j = 1, 2, 3[/math] and [math]n = 10, 30, 50[/math]. Do the same for [math]p = .2[/math].
  • (b) Show that the standardized random variable [math]S_n^* = (S_n - np)/\sqrt{npq}[/math] has expected value 0 and variance 1. What do your results from (a) tell you about this standardized quantity [math]S_n^*[/math]?
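If the book's BinomialProbabilities program is not at hand, the probability can be computed by summing the binomial distribution directly; the Python sketch below is a stand-in of our own, not the book's program:

```python
from math import comb, sqrt

def prob_within(n, p, j):
    # Exact P(-j*sqrt(npq) < S_n - np < j*sqrt(npq)) for S_n ~ Binomial(n, p),
    # computed by summing the binomial probabilities inside the interval.
    q = 1 - p
    half_width = j * sqrt(n * p * q)
    return sum(comb(n, k) * p ** k * q ** (n - k)
               for k in range(n + 1)
               if abs(k - n * p) < half_width)

for p in (0.5, 0.2):
    for n in (10, 30, 50):
        print(p, n, [round(prob_within(n, p, j), 4) for j in (1, 2, 3)])
```

The printed table covers exactly the cases requested in part (a).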
By Bot
Jun 09'24

Let [math]X[/math] be the outcome of a chance experiment with [math]E(X) =\mu[/math] and [math]V(X) = \sigma^2[/math]. When [math]\mu[/math] and [math]\sigma^2[/math] are unknown, the statistician often estimates them by repeating the experiment [math]n[/math] times with outcomes [math]x_1[/math], [math]x_2, \ldots, x_n[/math], estimating [math]\mu[/math] by the sample mean

[[math]] \bar{x} = \frac 1n \sum_{i = 1}^n x_i\ , [[/math]]

and [math]\sigma^2[/math] by the sample variance

[[math]] s^2 = \frac 1n \sum_{i = 1}^n (x_i - \bar x)^2\ . [[/math]]

Then [math]s[/math] is the sample standard deviation. These formulas should remind the reader of the definitions of the theoretical mean and variance. (Many statisticians define the sample variance with the coefficient [math]1/n[/math] replaced by [math]1/(n-1)[/math]. If this alternative definition is used, the expected value of [math]s^2[/math] is equal to [math]\sigma^2[/math]; see Exercise, part (d).) Write a computer program that will roll a die [math]n[/math] times and compute the sample mean and sample variance. Repeat this experiment several times for [math]n = 10[/math] and [math]n = 1000[/math]. How well do the sample mean and sample variance estimate the true mean [math]7/2[/math] and variance [math]35/12[/math]?
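A minimal version of the requested program might look like the following in Python (the die is assumed fair; the function name and seeding are our choices):

```python
import random

def die_sample_stats(n, seed=None):
    # Roll a fair die n times; return (sample mean, sample variance),
    # using the 1/n coefficient from the exercise.
    rng = random.Random(seed)
    rolls = [rng.randint(1, 6) for _ in range(n)]
    xbar = sum(rolls) / n
    s2 = sum((x - xbar) ** 2 for x in rolls) / n
    return xbar, s2

for n in (10, 1000):
    for trial in range(3):
        xbar, s2 = die_sample_stats(n, seed=trial)
        print(n, trial, round(xbar, 3), round(s2, 3))
```

Running it a few times shows the [math]n = 1000[/math] estimates clustering much more tightly around [math]7/2[/math] and [math]35/12[/math] than the [math]n = 10[/math] estimates.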

By Bot
Jun 09'24

Show that, for the sample mean [math]\bar x[/math] and sample variance [math]s^2[/math] as defined in Exercise,

  • (a) [math]E(\bar x) = \mu[/math].
  • (b) [math]E\bigl((\bar x - \mu)^2\bigr) = \sigma^2/n[/math].
  • (c) [math]E(s^2) = \frac {n-1}n\sigma^2[/math]. Hint: For (c) write
    [[math]] \begin{eqnarray*} \sum_{i = 1}^n (x_i - \bar x)^2 & = & \sum_{i = 1}^n \bigl((x_i - \mu) - (\bar x - \mu)\bigr)^2 \\ & = & \sum_{i = 1}^n (x_i - \mu)^2 - 2(\bar x - \mu) \sum_{i = 1}^n (x_i - \mu) + n(\bar x - \mu)^2 \\ & = & \sum_{i = 1}^n (x_i - \mu)^2 - n(\bar x - \mu)^2, \end{eqnarray*} [[/math]]
    and take expectations of both sides, using part (b) when necessary.
  • (d) Show that if, in the definition of [math]s^2[/math] in Exercise, we replace the coefficient [math]1/n[/math] by the coefficient [math]1/(n-1)[/math], then [math]E(s^2) = \sigma^2[/math]. (This shows why many statisticians use the coefficient [math]1/(n-1)[/math]. The number [math]s^2[/math] is used to estimate the unknown quantity [math]\sigma^2[/math]. If an estimator has an average value which equals the quantity being estimated, then the estimator is said to be unbiased. Thus, the statement [math]E(s^2) = \sigma^2[/math] says that [math]s^2[/math] is an unbiased estimator of [math]\sigma^2[/math].)
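A simulation with an assumed fair die (our own sketch, not part of the exercise) illustrates the bias result numerically: averaging [math]s^2[/math] over many experiments lands near [math]\frac{n-1}{n}\sigma^2[/math] with the [math]1/n[/math] coefficient and near [math]\sigma^2[/math] with [math]1/(n-1)[/math].

```python
import random

def average_s2(n, trials=20_000, unbiased=False, seed=1):
    # Average the sample variance of n die rolls over many experiments,
    # using either the 1/n or the 1/(n - 1) coefficient.
    rng = random.Random(seed)
    denom = (n - 1) if unbiased else n
    total = 0.0
    for _ in range(trials):
        rolls = [rng.randint(1, 6) for _ in range(n)]
        xbar = sum(rolls) / n
        total += sum((x - xbar) ** 2 for x in rolls) / denom
    return total / trials

print(average_s2(5), average_s2(5, unbiased=True), 35 / 12)
```

With small [math]n[/math] (here [math]n = 5[/math]) the gap between the two estimators is easy to see against the true variance [math]35/12[/math].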
By Bot
Jun 09'24

Let [math]X[/math] be a random variable taking on values [math]a_1,a_2, \ldots, a_r[/math] with probabilities [math]p_1[/math], [math]p_2, \ldots,p_r[/math] and with [math]E(X) = \mu[/math]. Define the spread of [math]X[/math] as follows:

[[math]] \bar\sigma = \sum_{i = 1}^r |a_i - \mu|p_i\ . [[/math]]

This, like the standard deviation, is a way to quantify the amount that a random variable is spread out around its mean. Recall that the variance of a sum of mutually independent random variables is the sum of the individual variances. The square of the spread plays a role analogous to the variance, just as the spread itself is analogous to the standard deviation. Show by an example that it is not necessarily true that the square of the spread of the sum of two independent random variables is the sum of the squares of the individual spreads.
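One possible counterexample — the choice of independent [math]\pm 1[/math] coin-flip variables is ours, and many others work — can be checked in exact arithmetic:

```python
from fractions import Fraction
from itertools import product

def spread(dist):
    # dist: list of (probability, value) pairs; spread = E|X - mu|.
    mu = sum(p * v for p, v in dist)
    return sum(p * abs(v - mu) for p, v in dist)

# X and Y: independent, each +1 or -1 with probability 1/2.
coin = [(Fraction(1, 2), 1), (Fraction(1, 2), -1)]
sum_dist = [(px * py, x + y) for (px, x), (py, y) in product(coin, coin)]

assert spread(coin) == 1        # each variable has spread 1
assert spread(sum_dist) == 1    # but X + Y also has spread 1
assert spread(sum_dist) ** 2 != 2 * spread(coin) ** 2
```

Here the square of the spread of the sum is 1, while the sum of the squares of the individual spreads is 2, so the variance-style addition rule fails for the spread.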

By Bot
Jun 09'24

We have two instruments that measure the distance between two points. The measurements given by the two instruments are random variables [math]X_1[/math] and [math]X_2[/math] that are independent with [math]E(X_1) = E(X_2) = \mu[/math], where [math]\mu[/math] is the true distance. From experience with these instruments, we know the values of the variances [math]\sigma_1^2[/math] and [math]\sigma_2^2[/math]. These variances are not necessarily the same. From two measurements, we estimate [math]\mu[/math] by the weighted average [math]\bar \mu = wX_1 + (1 - w)X_2[/math]. Here [math]w[/math] is chosen in [math][0,1][/math] to minimize the variance of [math]\bar \mu[/math].

  • (a) What is [math]E(\bar \mu)[/math]?
  • (b) How should [math]w[/math] be chosen in [math][0,1][/math] to minimize the variance of [math]\bar \mu[/math]?
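Before doing the calculus, it can help to locate the minimizing [math]w[/math] numerically; in this Python sketch the example variances [math]\sigma_1^2 = 4[/math] and [math]\sigma_2^2 = 1[/math] are assumed purely for illustration:

```python
def mubar_variance(w, var1, var2):
    # Var(w*X1 + (1 - w)*X2) for independent X1, X2 with the given variances.
    return w ** 2 * var1 + (1 - w) ** 2 * var2

# Grid-search w over [0, 1] for one assumed pair of variances.
var1, var2 = 4.0, 1.0
grid = [i / 1000 for i in range(1001)]
w_best = min(grid, key=lambda w: mubar_variance(w, var1, var2))
print(w_best, mubar_variance(w_best, var1, var2))
```

Note that the minimizer puts the larger weight on the measurement with the smaller variance; part (b) asks you to derive the general formula behind this.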