=== Equivalence of random variables ===
There are several different senses in which two random variables can be considered equivalent. In increasing order of strength, the precise definitions of these notions of equivalence are given below.
Two random variables ''X'' and ''Y'' are ''equal in distribution'' if
:<math>P[X\le x]=P[Y\le x]\quad\hbox{for all}\quad x.</math>
To be equal in distribution, random variables need not be defined on the same probability space, but without loss of generality they can be made into independent random variables on the same probability space.
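Equality in distribution does not require the variables to agree pointwise. A minimal Python sketch (an illustrative construction, not part of the formalism above): for ''X'' uniform on {0, 1}, the variable ''Y'' = 1 − ''X'' has the same distribution as ''X'' yet never takes the same value.

```python
import random

random.seed(1)

# X uniform on {0, 1} and Y = 1 - X: same distribution, never equal.
xs = [random.randint(0, 1) for _ in range(100_000)]
ys = [1 - x for x in xs]

print(sum(xs) / len(xs))                    # approximately 0.5 = P[X = 1]
print(sum(ys) / len(ys))                    # approximately 0.5 = P[Y = 1]
print(any(x == y for x, y in zip(xs, ys)))  # False: X(w) != Y(w) for every w
```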
The notion of equality in distribution is associated with the following distance between random variables:
:<math>d(X,Y)=\sup_x|P[X\le x]-P[Y\le x]|,</math>
which is the basis of the [[Kolmogorov-Smirnov test]].
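For discrete random variables this supremum is attained at a jump point of one of the distribution functions, so it can be computed exactly. A minimal sketch, with the two distributions chosen purely for illustration:

```python
# d(X, Y) = sup_x |P[X <= x] - P[Y <= x]|, computed exactly from the
# probability mass functions of two (illustrative) discrete variables:
# X uniform on {1, 2, 3, 4}, Y uniform on {1, 2}.

def cdf(pmf, x):
    """P[V <= x] for a discrete variable V given as a dict value -> probability."""
    return sum(p for v, p in pmf.items() if v <= x)

X = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
Y = {1: 0.5, 2: 0.5}

# The supremum over all real x is attained at one of the jump points.
d = max(abs(cdf(X, x) - cdf(Y, x)) for x in sorted(set(X) | set(Y)))
print(d)  # 0.5, attained at x = 2: P[X <= 2] = 0.5 while P[Y <= 2] = 1.0
```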
Two random variables ''X'' and ''Y'' are ''equal in p-th mean'' if the ''p''th moment of |''X''-''Y''| is zero, that is,
:<math>E[|X-Y|^p]=0.</math>
Equality in ''p''th mean implies equality in ''q''th mean for all ''q''<''p''. As in the previous case, there is a related distance between the random variables, namely
:<math>d_p(X,Y)=E[|X-Y|^p].</math>
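A small sketch of this distance on a finite joint distribution (the pair below, ''X'' Bernoulli(1/2) with ''Y'' = 1 − ''X'', is an illustrative choice): here |''X'' − ''Y''| = 1 with probability one, so the distance equals 1 for every ''p'', even though the two variables are equal in distribution.

```python
# d_p(X, Y) = E[|X - Y|^p] over a finite joint law given as (x, y, prob) triples.
def d_p(joint, p):
    return sum(prob * abs(x - y) ** p for x, y, prob in joint)

# Joint law of (X, 1 - X) with X ~ Bernoulli(1/2): |X - Y| = 1 always.
joint = [(0, 1, 0.5), (1, 0, 0.5)]
print(d_p(joint, 1), d_p(joint, 2))  # 1.0 1.0
```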
Two random variables ''X'' and ''Y'' are ''equal almost surely'' if, and only if, the probability that they are different is zero:
:<math>P[X\neq Y]=0</math>
For all practical purposes in probability theory, this notion of equivalence is as strong as actual equality. It is associated with the distance
:<math>d_\infty(X,Y)=\operatorname{ess\,sup}_\omega|X(\omega)-Y(\omega)|.</math>
(see [[essential supremum]])
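Almost sure equality permits the two variables to differ on a set of probability zero. A minimal sketch (hypothetical construction): on the sample space [0, 1) with the uniform measure, changing ''X'' at a single point produces a ''Y'' with P[''X'' ≠ ''Y''] = 0.

```python
import random

def X(omega):
    """A random variable on the sample space [0, 1): the identity map."""
    return omega

def Y(omega):
    """Differs from X only at the single point 0.5, which has probability zero."""
    return -1.0 if omega == 0.5 else omega

# Under the uniform measure, {omega : X(omega) != Y(omega)} = {0.5} is a
# null set, so X = Y almost surely even though X and Y are not equal.
random.seed(0)
samples = [random.random() for _ in range(10_000)]
print(all(X(w) == Y(w) for w in samples))  # True: the null set is not hit
```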
Finally, two random variables ''X'' and ''Y'' are ''equal'' if they are equal as functions on their probability space, that is,
:<math>X(\omega)=Y(\omega)\quad\hbox{for all}\quad\omega.</math>