By Bot
Jun 09'24

Exercise

Let [math]X[/math] be a random variable taking on values [math]a_1, a_2, \ldots, a_r[/math] with probabilities [math]p_1, p_2, \ldots, p_r[/math] and with [math]E(X) = \mu[/math]. Define the spread of [math]X[/math] as follows:

[[math]] \bar\sigma = \sum_{i = 1}^r |a_i - \mu|p_i\ . [[/math]]
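To make the definition concrete, here is a minimal Python sketch that evaluates [math]\bar\sigma[/math] directly from the formula above; the function name spread and the sample distribution are hypothetical, chosen only to illustrate the computation.

```python
# Minimal sketch: the spread (mean absolute deviation) of a discrete
# random variable, computed from its values a_i and probabilities p_i.

def spread(values, probs):
    mu = sum(a * p for a, p in zip(values, probs))             # E(X)
    return sum(abs(a - mu) * p for a, p in zip(values, probs))

# Hypothetical distribution, for illustration only.
values = [1, 2, 3]        # a_1, a_2, a_3
probs = [0.2, 0.5, 0.3]   # p_1, p_2, p_3
print(spread(values, probs))  # prints 0.54
```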

This, like the standard deviation, is a way to quantify the amount that a random variable is spread out around its mean. Recall that the variance of a sum of mutually independent random variables is the sum of the individual variances. Just as the spread is the analogue of the standard deviation, the square of the spread is the analogue of the variance. Show by an example that it is not necessarily true that the square of the spread of the sum of two independent random variables is the sum of the squares of the individual spreads.
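For readers who want to test candidate examples numerically, the following sketch builds the distribution of [math]X + Y[/math] for independent [math]X[/math] and [math]Y[/math] by convolution and compares [math]\bar\sigma(X+Y)^2[/math] with [math]\bar\sigma(X)^2 + \bar\sigma(Y)^2[/math]; the two-point distributions shown are placeholders, and any candidates can be substituted.

```python
from collections import defaultdict

def spread(dist):
    """Spread (mean absolute deviation) of a distribution {value: prob}."""
    mu = sum(a * p for a, p in dist.items())
    return sum(abs(a - mu) * p for a, p in dist.items())

def sum_dist(dx, dy):
    """Distribution of X + Y when X and Y are independent."""
    dz = defaultdict(float)
    for a, p in dx.items():
        for b, q in dy.items():
            dz[a + b] += p * q
    return dict(dz)

# Placeholder distributions; substitute any candidate example here.
dx = {0: 0.5, 1: 0.5}
dy = {0: 0.5, 1: 0.5}
dz = sum_dist(dx, dy)
print(spread(dz) ** 2, spread(dx) ** 2 + spread(dy) ** 2)
```

If the two printed numbers differ, the chosen pair of distributions is a counterexample to additivity of the squared spread.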