By Bot
Jun 09 '24

Exercise

Show that, for the sample mean [math]\bar x[/math] and sample variance [math]s^2[/math] as defined in Exercise,

  • (a) [math]E(\bar x) = \mu[/math].
  • (b) [math]E\bigl((\bar x - \mu)^2\bigr) = \sigma^2/n[/math].
  • (c) [math]E(s^2) = \frac{n-1}{n}\sigma^2[/math]. Hint: For (c) write
    [[math]] \begin{eqnarray*} \sum_{i = 1}^n (x_i - \bar x)^2 & = & \sum_{i = 1}^n \bigl((x_i - \mu) - (\bar x - \mu)\bigr)^2 \\ & = & \sum_{i = 1}^n (x_i - \mu)^2 - 2(\bar x - \mu) \sum_{i = 1}^n (x_i - \mu) + n(\bar x - \mu)^2 \\ & = & \sum_{i = 1}^n (x_i - \mu)^2 - n(\bar x - \mu)^2, \end{eqnarray*} [[/math]]
    and take expectations of both sides, using part (b) when necessary.
  • (d) Show that if, in the definition of [math]s^2[/math] in Exercise, we replace the coefficient [math]1/n[/math] by the coefficient [math]1/(n-1)[/math], then [math]E(s^2) = \sigma^2[/math]. (This shows why many statisticians use the coefficient [math]1/(n-1)[/math]. The number [math]s^2[/math] is used to estimate the unknown quantity [math]\sigma^2[/math]. If an estimator has an average value equal to the quantity being estimated, the estimator is said to be unbiased. Thus, the statement [math]E(s^2) = \sigma^2[/math] says that [math]s^2[/math] is an unbiased estimator of [math]\sigma^2[/math].) A numerical sketch comparing the two coefficients appears after this list.
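
The following simulation is a minimal sketch, not part of the original exercise, illustrating the comparison in parts (c) and (d). It assumes Python with NumPy and draws normally distributed samples with arbitrarily chosen [math]\mu = 0[/math], [math]\sigma = 2[/math], sample size [math]n = 5[/math], and 200,000 trials; any distribution with finite variance would do.

```python
import numpy as np

# Monte Carlo check of parts (c) and (d): with the 1/n coefficient the
# average of s^2 is about ((n-1)/n) * sigma^2; with 1/(n-1) it is about sigma^2.
# The distribution and parameter values below are illustrative assumptions.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 0.0, 2.0, 5, 200_000

samples = rng.normal(mu, sigma, size=(trials, n))  # 'trials' independent samples of size n
xbar = samples.mean(axis=1, keepdims=True)         # sample mean of each trial
sum_sq = ((samples - xbar) ** 2).sum(axis=1)       # sum of squared deviations from xbar

s2_biased = sum_sq / n          # coefficient 1/n, as in the exercise's definition of s^2
s2_unbiased = sum_sq / (n - 1)  # coefficient 1/(n-1), as in part (d)

print("average of s^2 with 1/n:    ", s2_biased.mean(),
      " vs (n-1)/n * sigma^2 =", (n - 1) / n * sigma**2)
print("average of s^2 with 1/(n-1):", s2_unbiased.mean(),
      " vs sigma^2 =", sigma**2)
```

With these settings the average of the [math]1/n[/math] version should come out close to [math]\frac{n-1}{n}\sigma^2 = 3.2[/math], while the [math]1/(n-1)[/math] version should come out close to [math]\sigma^2 = 4[/math], in line with parts (c) and (d).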