Instead of conditioning on {{mvar|X}} being ''exactly'' {{mvar|x}}, we could condition on it being within distance <math>\epsilon</math> of {{mvar|x}}. The event <math>B = \{ x-\epsilon < X < x+\epsilon \}</math> generally has nonzero probability and hence can be conditioned on.
We can then take the [[limit (mathematics)|limit]]
{{NumBlk|::|<math>\lim_{\epsilon \to 0} P(A \mid x-\epsilon < X < x+\epsilon).</math>|{{EquationRef|1}}}}
For example, if two continuous random variables {{mvar|X}} and {{mvar|Y}} have a joint density <math>f_{X,Y}(x,y)</math>, then by [[L'Hôpital's rule]] and the [[Leibniz integral rule]], upon differentiation of numerator and denominator with respect to <math>\epsilon</math>:
:<math>\begin{aligned}\lim_{\epsilon \to 0} P(Y \in U \mid x_0-\epsilon < X < x_0+\epsilon) &= \lim_{\epsilon \to 0} \frac{\int_{x_0-\epsilon}^{x_0+\epsilon} \int_U f_{X,Y}(x,y)\,\mathrm{d}y\,\mathrm{d}x}{\int_{x_0-\epsilon}^{x_0+\epsilon} \int_{\mathbb{R}} f_{X,Y}(x,y)\,\mathrm{d}y\,\mathrm{d}x} \\ &= \frac{\int_U f_{X,Y}(x_0,y)\,\mathrm{d}y}{\int_{\mathbb{R}} f_{X,Y}(x_0,y)\,\mathrm{d}y}.\end{aligned}</math>
The resulting limit is the [[conditional probability distribution]] of {{mvar|Y}} given {{mvar|X}}, and exists when the denominator, the probability density <math>f_X(x_0)</math>, is strictly positive.
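This limiting behaviour can be checked numerically. The following sketch (an illustration added here, not part of the article; the correlation <math>\rho = 0.5</math>, the point <math>x_0 = 1</math>, and the event <math>U = \{y > 0\}</math> are assumed parameters) estimates the conditional probability on a small window around <math>x_0</math> by Monte Carlo and compares it with the closed-form limit, which for a bivariate normal is available from the conditional distribution <math>Y \mid X = x_0 \sim N(\rho x_0, 1-\rho^2)</math>:

```python
# Monte Carlo check of lim P(Y > 0 | x0 - eps < X < x0 + eps).
# X, Z ~ iid N(0,1) and Y = rho*X + sqrt(1 - rho^2)*Z, so (X, Y) is
# standard bivariate normal with correlation rho, and
# Y | X = x0 ~ N(rho*x0, 1 - rho^2).
import math
import random

random.seed(0)
rho, x0, eps, n = 0.5, 1.0, 0.05, 2_000_000

hits = total = 0
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    if x0 - eps < x < x0 + eps:          # condition on X being near x0
        y = rho * x + math.sqrt(1 - rho**2) * random.gauss(0.0, 1.0)
        total += 1
        hits += (y > 0)

estimate = hits / total

# Exact limiting value P(Y > 0 | X = x0) from the conditional normal law.
sigma = math.sqrt(1 - rho**2)
exact = 0.5 * (1 + math.erf(rho * x0 / (sigma * math.sqrt(2))))
print(estimate, exact)   # both close to 0.718
```

Shrinking <math>\epsilon</math> further (with correspondingly more samples) drives the estimate toward the exact value, as the limit formula predicts.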
It is tempting to ''define'' the undefined probability <math>P(A \mid X=x)</math> using the limit ({{EquationNote|1}}), but this cannot be done in a consistent manner. In particular, it is possible to find random variables {{mvar|X}} and {{mvar|W}} and values {{mvar|x}}, {{mvar|w}} such that the events <math>\{X=x\}</math> and <math>\{W=w\}</math> are identical, yet the resulting limits differ:
:<math>\lim_{\epsilon \to 0} P(A \mid x-\epsilon \le X \le x+\epsilon) \neq \lim_{\epsilon \to 0} P(A \mid w-\epsilon \le W \le w+\epsilon).</math>
The [[Borel–Kolmogorov paradox]] demonstrates this with a geometrical argument.
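A small simulation can exhibit this inconsistency directly. The following sketch (a hypothetical example added here, not taken from the article) uses {{mvar|U}}, {{mvar|V}} independent and uniform on <math>(0,1)</math>: the events <math>\{U-V=0\}</math> and <math>\{U/V=1\}</math> are identical, but conditioning on a small band around each gives different limits for <math>P(U < 1/2)</math>. The difference band <math>|U-V|<\epsilon</math> has constant width along the diagonal (limit <math>1/2</math>), while the ratio band <math>|U/V-1|<\epsilon</math> has width proportional to {{mvar|V}}, which reweights the diagonal toward larger values (limit <math>1/4</math>):

```python
# Conditioning on the same null event via two different approximating
# bands yields two different answers for P(U < 1/2).
import random

random.seed(0)
eps, n = 0.01, 2_000_000
hits_d = tot_d = hits_r = tot_r = 0
for _ in range(n):
    u, v = random.random(), random.random()
    if abs(u - v) < eps:                     # band around {U - V = 0}
        tot_d += 1
        hits_d += (u < 0.5)
    if v > 0 and abs(u / v - 1) < eps:       # band around {U / V = 1}
        tot_r += 1
        hits_r += (u < 0.5)

p_diff, p_ratio = hits_d / tot_d, hits_r / tot_r
print(p_diff, p_ratio)   # approximately 0.5 and 0.25
```

Both bands shrink to the same event <math>\{U = V\}</math>, yet the limits disagree, which is why ({{EquationNote|1}}) cannot serve as a definition of <math>P(A \mid X = x)</math>.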