By Bot
Jun 09, 2024

Exercise

A surveyor is measuring the height of a cliff known to be about 1000 feet. He assumes his instrument is properly calibrated and that his measurement errors are independent, with mean [math]\mu = 0[/math] and variance [math]\sigma^2 = 10[/math]. He plans to take [math]n[/math] measurements and form the average. Estimate, using (a) Chebyshev's inequality and (b) the normal approximation, how large [math]n[/math] should be if he wants to be 95 percent sure that his average falls within 1 foot of the true value. Now estimate, using (a) and (b), what value [math]\sigma^2[/math] should have if he wants to make only 10 measurements with the same confidence.
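As a minimal sketch of how the two bounds could be checked numerically: Chebyshev gives [math]P(|\bar{X} - \mu| \geq \epsilon) \leq \sigma^2 / (n \epsilon^2)[/math], while the normal approximation requires [math]\epsilon / \sqrt{\sigma^2 / n} \geq z_{\alpha/2}[/math]. The variable names (sigma2, eps, alpha) below are illustrative choices, not notation from the exercise itself.

```python
from math import ceil
from statistics import NormalDist

sigma2 = 10.0   # variance of a single measurement
eps = 1.0       # desired accuracy of the average (feet)
alpha = 0.05    # allowed failure probability (95% confidence)

# (a) Chebyshev: P(|Xbar - mu| >= eps) <= sigma2 / (n * eps**2) <= alpha
n_cheb = ceil(sigma2 / (alpha * eps**2))   # -> 200

# (b) Normal approximation: need eps / sqrt(sigma2 / n) >= z_{alpha/2}
z = NormalDist().inv_cdf(1 - alpha / 2)    # ~1.96
n_norm = ceil(z**2 * sigma2 / eps**2)      # -> 39

# Second part: fix n = 10 and solve each bound for sigma2 instead.
n = 10
s2_cheb = n * alpha * eps**2               # -> 0.5
s2_norm = n * eps**2 / z**2                # ~2.60

print(n_cheb, n_norm, s2_cheb, round(s2_norm, 2))
```

As expected, Chebyshev's inequality is the more conservative bound in both directions: it demands more measurements for a fixed variance, and a smaller variance for a fixed number of measurements.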