exercise:239b77ed10

From Stochiki
Latest revision as of 22:15, 14 June 2024

Let [math]X[/math] be a random variable taking on values [math]a_1, a_2, \ldots, a_r[/math] with probabilities [math]p_1, p_2, \ldots, p_r[/math] and with [math]E(X) = \mu[/math]. Define the ''spread'' of [math]X[/math] as follows:

[[math]] \bar\sigma = \sum_{i = 1}^r |a_i - \mu|p_i\ . [[/math]]

The spread, like the standard deviation, measures how far a random variable tends to deviate from its mean. Recall that the variance of a sum of mutually independent random variables is the sum of the individual variances; the square of the spread plays a role analogous to the variance, just as the spread is analogous to the standard deviation. Show by an example that the square of the spread of the sum of two independent random variables is not necessarily the sum of the squares of the individual spreads.
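The definition above is easy to compute directly. Below is a minimal Python sketch (the fair-die distribution is our own illustrative choice, not part of the exercise) that evaluates the spread [math]\bar\sigma = \sum_i |a_i - \mu| p_i[/math] alongside the standard deviation for comparison:

```python
from math import sqrt

def spread(values, probs):
    # Spread: expected absolute deviation from the mean,
    # sum over i of |a_i - mu| * p_i.
    mu = sum(a * p for a, p in zip(values, probs))
    return sum(abs(a - mu) * p for a, p in zip(values, probs))

def std_dev(values, probs):
    # Standard deviation: square root of sum of (a_i - mu)^2 * p_i.
    mu = sum(a * p for a, p in zip(values, probs))
    return sqrt(sum((a - mu) ** 2 * p for a, p in zip(values, probs)))

# Illustrative example: a fair six-sided die, mu = 3.5.
vals = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

print(spread(vals, probs))   # spread = 1.5 (up to floating point)
print(std_dev(vals, probs))  # sqrt(35/12), roughly 1.708
```

Note that the spread and the standard deviation generally disagree; both are legitimate measures of dispersion, which is what makes the exercise's question about additivity nontrivial.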