=== Functions of random variables ===
 
If we have a random variable ''X'' on &Omega; and a [[measurable function]] ''f'':'''R'''&rarr;'''R''', then ''Y''=''f''(''X'') will also be a random variable on &Omega;, since the composition of measurable functions is measurable. The same procedure that allowed one to go from a probability space (&Omega;,P) to ('''R''',dF<sub>''X''</sub>) can be used to obtain the probability distribution of ''Y''.
The cumulative distribution function of ''Y'' is
 
:F<sub>''Y''</sub>(''y'') = Prob(''f''(''X'')&le;''y'').
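
As an illustrative sketch (not part of the formal development), this construction can be checked numerically; the choice of ''X'' as a standard normal variable, ''f''(''x'')=e<sup>''x''</sup> and the use of NumPy are assumptions made only for this example.

<syntaxhighlight lang="python">
# Sketch: approximating F_Y(y) = Prob(f(X) <= y) by Monte Carlo,
# assuming X ~ N(0, 1) and taking f(x) = exp(x) as an example of a measurable f.
import numpy as np

rng = np.random.default_rng(0)
x_samples = rng.standard_normal(100_000)   # draws of X
f = np.exp                                 # any measurable f: R -> R would do
y_samples = f(x_samples)                   # draws of Y = f(X)

def F_Y(y):
    """Empirical cumulative distribution function of Y at y."""
    return np.mean(y_samples <= y)

print(F_Y(1.0))   # close to 0.5, since exp(X) <= 1 exactly when X <= 0
</syntaxhighlight>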
 
==== Example ====
 
Let ''f''(''x'')=''x''<sup>2</sup>. Then,
:F<sub>''Y''</sub>(''y'') = Prob(''X''<sup>2</sup>&le;y).
 
If ''y''<0, then Prob(''X''<sup>2</sup>&le;''y'')=0, so
:F<sub>''Y''</sub>(''y'') = 0 if ''y''<0.
 
If ''y''&ge;0, then Prob(''X''<sup>2</sup>&le;''y'')=Prob(|''X''|&le;&radic;''y'')=Prob(-&radic;''y''&le;''X''&le;&radic;''y''), so
:F<sub>''Y''</sub>(''y'') = F<sub>''X''</sub>(&radic;''y'')-F<sub>''X''</sub>(-&radic;''y'') if ''y''&ge;0.
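
As a hedged numerical check of this formula, one can assume ''X'' is standard normal, so that ''Y''=''X''<sup>2</sup> follows a chi-squared distribution with one degree of freedom; the libraries and the sample size below are illustrative choices, not part of the article.

<syntaxhighlight lang="python">
# Sketch: checking F_Y(y) = F_X(sqrt(y)) - F_X(-sqrt(y)) for Y = X^2,
# assuming X ~ N(0, 1); Y is then chi-squared with 1 degree of freedom.
import numpy as np
from scipy.stats import norm, chi2

y = 2.5
closed_form = norm.cdf(np.sqrt(y)) - norm.cdf(-np.sqrt(y))

rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)
monte_carlo = np.mean(x**2 <= y)

print(closed_form, chi2.cdf(y, df=1), monte_carlo)  # all three approximately equal
</syntaxhighlight>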
 
=== Moments ===
 
A random variable is often characterised by a small number of quantities, which also have a practical interpretation. For example, it is often enough to know what its "average value" is. This is captured by the mathematical concept of [[expected value]] of a random variable, denoted E[''X'']. Note that in general, E[''f''(''X'')] is '''not''' the same as ''f''(E[''X'']). Once the "average value" is known, one could then ask how far from this average value the values of ''X'' typically are, a question that is answered by the [[variance]] and [[standard deviation]] of a random variable.
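
For example, the following sketch assumes ''X'' is the outcome of a fair six-sided die (an assumption made only for illustration) and computes these quantities directly from the definitions:

<syntaxhighlight lang="python">
# Sketch: expected value, variance and standard deviation of a fair six-sided die,
# plus a check that E[f(X)] is not f(E[X]) for f(x) = x^2.
import numpy as np

values = np.arange(1, 7)       # possible outcomes of X
probs = np.full(6, 1 / 6)      # uniform probabilities

E_X = np.sum(values * probs)         # E[X] = 3.5
E_X2 = np.sum(values**2 * probs)     # E[X^2] = 91/6, about 15.17
variance = E_X2 - E_X**2             # Var[X] = E[X^2] - (E[X])^2, about 2.92
std_dev = np.sqrt(variance)          # about 1.71

print(E_X2, E_X**2)      # E[f(X)] = 15.17... differs from f(E[X]) = 12.25
print(variance, std_dev)
</syntaxhighlight>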
 
Mathematically, this is known as the (generalised) [[problem of moments]]: for a given class of random variables ''X'', find a collection {''f<sub>i</sub>''} of functions such that the expectation values E[''f<sub>i</sub>''(''X'')] fully characterise the distribution of the random variable ''X''.
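
As a small sketch of this idea, assume the class of random variables is the normal family and take ''f<sub>i</sub>''(''x'')=''x''<sup>''i''</sup>: the first two expectation values already determine the distribution, and every higher moment is then fixed by them. The parameters and sample size below are arbitrary choices for illustration.

<syntaxhighlight lang="python">
# Sketch: for the normal family, the moment functions f_i(x) = x^i with i = 1, 2
# already characterise the distribution; higher moments follow from them.
import numpy as np

rng = np.random.default_rng(2)
samples = rng.normal(loc=1.0, scale=2.0, size=500_000)  # assumed N(1, 4)

m1 = samples.mean()           # estimate of E[X], about 1.0
m2 = np.mean(samples**2)      # estimate of E[X^2], about 5.0
mean, var = m1, m2 - m1**2    # recovered mean and variance, about (1.0, 4.0)

# The third moment is then determined: E[X^3] = mean^3 + 3*mean*var = 13 here.
print(mean, var, np.mean(samples**3), mean**3 + 3 * mean * var)
</syntaxhighlight>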