Latest revision as of 15:28, 4 April 2024

A random variable, random quantity, aleatory variable or stochastic variable is a variable whose value is subject to variations due to chance (i.e. randomness, in a mathematical sense).[1]:391 Like other mathematical variables, a random variable can take on a set of possible different values; in contrast to other mathematical variables, each value has an associated probability.

A random variable's possible values might represent the possible outcomes of a yet-to-be-performed experiment, or the possible outcomes of a past experiment whose already-existing value is uncertain (for example, due to imprecise measurements). They may also conceptually represent either the results of an "objectively" random process (such as rolling a die) or the "subjective" randomness that results from incomplete knowledge of a quantity. The meaning of the probabilities assigned to the potential values of a random variable is not part of probability theory itself but is instead related to philosophical arguments over the interpretation of probability. The mathematics works the same regardless of the particular interpretation in use.

The mathematical function describing the possible values of a random variable and their associated probabilities is known as a probability distribution. Random variables can be discrete, that is, taking any of a specified finite or countable list of values, endowed with a probability mass function, characteristic of a probability distribution; or continuous, taking any numerical value in an interval or collection of intervals, via a probability density function that is characteristic of a probability distribution; or a mixture of both types. The realizations of a random variable, that is, the results of randomly choosing values according to the variable's probability distribution function, are called random variates.

The formal mathematical treatment of random variables is a topic in probability theory. In that context, a random variable is understood as a function defined on a sample space whose outputs are numerical values.[2]

Definition

A random variable [math]X\colon \Omega \to E[/math] is a measurable function from the set of possible outcomes [math] \Omega [/math] to some set [math] E[/math]. The technical axiomatic definition requires [math]\Omega[/math] to be a probability space and [math]E[/math] to be a measurable space.

Note that although [math]X[/math] is usually a real-valued function ([math]E=\mathbb{R}[/math]), it does not return a probability. Rather, [math]X[/math] describes some numerical property that outcomes in [math]\Omega[/math] may have: for example, the number of heads in a random collection of coin flips, or the height of a random person. The probability that [math]X[/math] is less than or equal to 3 is the measure of the set of outcomes [math]\{\omega\in\Omega: X(\omega)\leq 3\}[/math], denoted [math]\operatorname{P}(X\leq 3).[/math]
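The idea that a random variable is a function on outcomes, and that probabilities of statements like [math]X \leq 3[/math] are measures of preimage sets, can be sketched numerically. The three-coin-flip sample space below is a hypothetical illustration, not part of the original text:

```python
from fractions import Fraction

# Hypothetical sketch: a random variable as a function on a finite sample
# space.  Omega is the set of outcomes of three fair coin flips, and X maps
# each outcome to its number of heads (a numerical property, not a probability).
omega = [(a, b, c) for a in "HT" for b in "HT" for c in "HT"]
p = {w: Fraction(1, 8) for w in omega}  # each of the 8 outcomes equally likely

def X(w):
    """Number of heads in the outcome w."""
    return w.count("H")

# P(X <= 2) is the measure of the event {w in Omega : X(w) <= 2}.
event = [w for w in omega if X(w) <= 2]
prob = sum(p[w] for w in event)
print(prob)  # 7/8: every outcome except ("H", "H", "H")
```

Note that the probability is computed entirely from the measure on [math]\Omega[/math]; the function `X` itself never mentions probabilities.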


The measurability of a random variable [math]X[/math] means that events of the form

[[math]] X^{-1}(a,b] = \{\omega : a \lt X(\omega) \leq b\} [[/math]]

have been assigned a probability.

Standard case

Usually [math]E = \mathbb{R}[/math]. When the image (or range) of [math]X[/math] is finite or countably infinite, the random variable is called a discrete random variable[1]:399 and its distribution can be described by a probability mass function which assigns a probability to each value in the image of [math]X[/math]. If the image is uncountably infinite then [math]X[/math] is called a continuous random variable. In the special case that it is absolutely continuous, its distribution can be described by a probability density function, which assigns probabilities to intervals; in particular, each individual point must necessarily have probability zero for an absolutely continuous random variable. Not all continuous random variables are absolutely continuous,[3] for example a mixture distribution. Such random variables cannot be described by a probability density or a probability mass function.

Almost sure equality

Two random variables [math]X[/math] and [math]Y[/math] are equal almost surely if, and only if, the probability that they are different is zero:

[[math]]\operatorname{P}(X \neq Y) = 0.[[/math]]

Equality

Finally, the two random variables [math]X[/math] and [math]Y[/math] are equal if they are equal as functions on their measurable space:

[[math]]X(\omega)=Y(\omega)\qquad\hbox{for all }\omega.[[/math]]
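The gap between these two notions can be made concrete on a finite sample space: if two random variables disagree only on an outcome of probability zero, they are equal almost surely but not equal as functions. The three-outcome space below is a hypothetical illustration:

```python
from fractions import Fraction

# Hypothetical sketch: X and Y differ only on the outcome "c", which has
# probability zero, so P(X != Y) = 0 even though X and Y are different
# functions on Omega.
omega = ["a", "b", "c"]
p = {"a": Fraction(1, 2), "b": Fraction(1, 2), "c": Fraction(0)}  # P(c) = 0

X = {"a": 1, "b": 2, "c": 3}
Y = {"a": 1, "b": 2, "c": 99}   # disagrees with X only on "c"

prob_differ = sum(p[w] for w in omega if X[w] != Y[w])
print(prob_differ)                       # 0  -> X = Y almost surely
print(all(X[w] == Y[w] for w in omega))  # False -> X and Y are not equal
```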

Examples

Discrete random variable

In an experiment a person may be chosen at random, and one random variable may be the person's height. Mathematically, the random variable is interpreted as a function which maps the person to the person's height. Associated with the random variable is a probability distribution that allows the computation of the probability that the height is in any subset of possible values, such as the probability that the height is between 180 and 190 cm, or the probability that the height is either less than 150 or more than 200 cm.

Another random variable may be the person's number of children; this is a discrete random variable with non-negative integer values. It allows the computation of probabilities for individual integer values – the probability mass function (PMF) – or for sets of values, including infinite sets. For example, the event of interest may be "an even number of children". For both finite and infinite event sets, their probabilities can be found by adding up the PMFs of the elements; that is, the probability of an even number of children is the infinite sum [math]\operatorname{PMF}(0) + \operatorname{PMF}(2) + \operatorname{PMF}(4) + \cdots[/math].
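The infinite sum for the event "an even number of children" can be sketched with a concrete PMF. The geometric-type distribution [math]\operatorname{PMF}(k) = (1-q)q^k[/math] below is a hypothetical stand-in chosen because its even-index sum has a closed form; the source specifies no particular distribution:

```python
from fractions import Fraction

# Hypothetical sketch: PMF(k) = (1 - q) * q**k on k = 0, 1, 2, ... as a
# stand-in distribution for the number of children.
q = Fraction(1, 3)

def pmf(k):
    return (1 - q) * q**k

# P(even) = PMF(0) + PMF(2) + PMF(4) + ... is a geometric series in q**2,
# summing exactly to (1 - q) / (1 - q**2) = 1 / (1 + q).
partial = sum(pmf(k) for k in range(0, 200, 2))  # truncated infinite sum
print(float(partial))   # ~0.75
print(1 / (1 + q))      # exact value 3/4
```

The truncation error after 100 even terms is of order [math]q^{200}[/math], far below floating-point resolution here.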

In examples such as these, the sample space (the set of all possible persons) is often suppressed, since it is mathematically hard to describe, and the possible values of the random variables are then treated as a sample space. But when two random variables are measured on the same sample space of outcomes, such as the height and number of children being computed on the same random persons, it is easier to track their relationship if it is acknowledged that both height and number of children come from the same random person, for example so that questions of whether such random variables are correlated or not can be posed.

Coin toss

The possible outcomes for one coin toss can be described by the sample space [math]\Omega = \{\text{heads}, \text{tails}\}[/math]. We can introduce a real-valued random variable [math]Y[/math] that models a $1 payoff for a successful bet on heads as follows:

[[math]] Y(\omega) = \begin{cases} 1, & \text{if} \ \ \omega = \text{heads} ,\\ \\ 0, & \text{if} \ \ \omega = \text{tails} . \end{cases} [[/math]]

If the coin is a fair coin, Y has a probability mass function [math]f_Y[/math] given by:

[[math]] f_Y(y) = \begin{cases} \tfrac 12,& \text{if }y=1,\\ \\ \tfrac 12,& \text{if }y=0,\\ \end{cases} [[/math]]
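The passage from the measure on [math]\Omega[/math] to the PMF [math]f_Y[/math] is just a pushforward: collect the probability of every outcome that [math]Y[/math] maps to the same value. A minimal sketch, assuming the fair coin above:

```python
from fractions import Fraction

# Sketch of the coin-toss payoff above, assuming a fair coin.
p = {"heads": Fraction(1, 2), "tails": Fraction(1, 2)}

def Y(w):
    """$1 payoff for a successful bet on heads."""
    return 1 if w == "heads" else 0

# Push the measure forward through Y: f_Y(y) = P({w : Y(w) = y}).
f_Y = {}
for w, pw in p.items():
    f_Y[Y(w)] = f_Y.get(Y(w), Fraction(0)) + pw

print(f_Y[0], f_Y[1])  # 1/2 1/2, matching the cases above
```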

Dice roll

A random variable can also be used to describe the process of rolling dice and the possible outcomes. The most obvious representation for the two-dice case is to take the set of pairs of numbers [math]n_1[/math] and [math]n_2[/math] from {1, 2, 3, 4, 5, 6} (representing the numbers on the two dice) as the sample space. The total number rolled (the sum of the numbers in each pair) is then a random variable [math]X[/math] given by the function that maps the pair to the sum:

[[math]]X((n_1, n_2)) = n_1 + n_2[[/math]]

and (if the dice are fair) has a probability mass function [math]f_X[/math] given by:

[[math]]f_X(S) = \tfrac{\min(S-1, 13-S)}{36}, \text{for } S \in \{2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12\}[[/math]]
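The closed form [math]\min(S-1,\,13-S)/36[/math] can be checked by brute force: enumerate the 36 equally likely pairs, tally each sum, and compare. A short sketch:

```python
from collections import Counter
from fractions import Fraction

# Sketch: enumerate the 36 equally likely pairs for two fair dice and
# verify the closed form f_X(S) = min(S - 1, 13 - S) / 36 for the sum.
pairs = [(n1, n2) for n1 in range(1, 7) for n2 in range(1, 7)]
counts = Counter(n1 + n2 for n1, n2 in pairs)
f_X = {s: Fraction(c, 36) for s, c in counts.items()}

for s in range(2, 13):
    assert f_X[s] == Fraction(min(s - 1, 13 - s), 36)
print(f_X[7])  # 1/6, the most likely total
```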

Continuous random variable

An example of a continuous random variable would be one based on a spinner that can choose a horizontal direction. Then the values taken by the random variable are directions. We could represent these directions by North, West, East, South, Southeast, etc. However, it is commonly more convenient to map the sample space to a random variable which takes values which are real numbers. This can be done, for example, by mapping a direction to a bearing in degrees clockwise from North. The random variable then takes values which are real numbers from the interval [0, 360), with all parts of the range being "equally likely". In this case, [math]X[/math] = the angle spun. Any real number has probability zero of being selected, but a positive probability can be assigned to any range of values. For example, the probability of choosing a number in [0, 180] is 1/2. Instead of speaking of a probability mass function, we say that the probability density of [math]X[/math] is 1/360. The probability of a subset of [0, 360) can be calculated by multiplying the measure of the set by 1/360. In general, the probability of a set for a given continuous random variable can be calculated by integrating the density over the given set.
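For this uniform spinner, integrating the constant density over an interval reduces to multiplying its length by 1/360, which a few lines make concrete:

```python
from fractions import Fraction

# Sketch of the spinner: X is uniform on [0, 360), so its density is the
# constant 1/360 and P(a <= X < b) = (b - a) / 360 for 0 <= a <= b <= 360.
density = Fraction(1, 360)

def prob_interval(a, b):
    """Integrate the constant density over [a, b): just length times 1/360."""
    return (Fraction(b) - Fraction(a)) * density

print(prob_interval(0, 180))    # 1/2, as in the text
print(prob_interval(45, 45))    # 0: any single angle has probability zero
```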

Mixed type

An example of a random variable of mixed type would be based on an experiment where a coin is flipped and the spinner is spun only if the result of the coin toss is heads. If the result is tails, [math]X[/math] = −1; otherwise [math]X[/math] = the value of the spinner as in the preceding example. There is a probability of 1/2 that this random variable will have the value −1. Other ranges of values would have half the probability of the last example.
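The mixed-type variable combines a point mass at −1 with a scaled-down copy of the spinner's density, which can be sketched as:

```python
from fractions import Fraction

# Sketch of the mixed-type variable: X = -1 on tails (probability 1/2);
# on heads, X is the spinner angle, uniform on [0, 360).
P_TAILS = Fraction(1, 2)            # P(X = -1): a point mass at -1

def prob_in_interval(a, b):
    """P(a <= X < b) for [a, b) inside [0, 360): the spinner case, weighted
    by the probability 1/2 of heads."""
    return Fraction(1, 2) * (Fraction(b) - Fraction(a)) / 360

print(P_TAILS)                   # 1/2
print(prob_in_interval(0, 180))  # 1/4: half the pure spinner's 1/2
```

Point mass and density together account for all the probability: [math]\tfrac12 + \tfrac12 \cdot 1 = 1[/math].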

Notes

  1. Yates, Daniel S.; Moore, David S.; Starnes, Daren S. (2003). The Practice of Statistics (2nd ed.). New York: Freeman. ISBN 978-0-7167-4773-4.
  2. Steigerwald, Douglas G. "Economics 245A – Introduction to Measure Theory" (PDF). University of California, Santa Barbara. Retrieved April 26, 2013.
  3. L. Castañeda; V. Arunachalam & S. Dharmaraja (2012). Introduction to Probability and Stochastic Processes with Applications. Wiley. p. 67. ISBN 9781118344941.

References

  • Wikipedia contributors. "Random variable". Wikipedia. Retrieved 28 January 2022.