[[File:Probability tree diagram.svg|thumb|On a [[Tree diagram (probability theory)|tree diagram]], branch probabilities are conditional on the event associated with the parent node. (Here, the overbars indicate that the event does not occur.)]]
[[File:Venn Pie Chart describing Bayes' law.png|thumb|Venn pie chart describing Bayes' law]]
=== Conditioning on an event ===
The case of greatest interest is that of a random variable {{mvar|Y}}, conditioned on a continuous random variable {{mvar|X}} resulting in a particular outcome {{mvar|x}}. The event <math>B = \{ X = x \}</math> has probability zero and, as such, cannot be conditioned on.
Instead of conditioning on {{mvar|X}} being ''exactly'' {{mvar|x}}, we could condition on it being closer than distance <math>\epsilon</math> away from {{mvar|x}}. The event <math>B = \{ x - \epsilon < X < x + \epsilon \}</math> has nonzero probability.
We can then take the [[limit (mathematics)|limit]]
{{NumBlk|::|<math>\lim_{\epsilon \to 0} P(A \mid x - \epsilon \le X \le x + \epsilon).</math>|{{EquationRef|1}}}}
For example, if two continuous random variables {{mvar|X}} and {{mvar|Y}} have a joint density <math>f_{X,Y}(x,y)</math>, then by [[L'Hôpital's rule]] and [[Leibniz integral rule]], upon differentiation with respect to <math>\epsilon</math>:
:<math>
\begin{aligned}
\lim_{\epsilon \to 0} P(Y \in U \mid x_0 - \epsilon \le X \le x_0 + \epsilon)
&= \lim_{\epsilon \to 0} \frac{\int_{x_0 - \epsilon}^{x_0 + \epsilon} \int_U f_{X, Y}(x, y) \, \mathrm{d}y \, \mathrm{d}x}{\int_{x_0 - \epsilon}^{x_0 + \epsilon} \int_\mathbb{R} f_{X, Y}(x, y) \, \mathrm{d}y \, \mathrm{d}x} \\
&= \frac{\int_U f_{X, Y}(x_0, y) \, \mathrm{d}y}{\int_\mathbb{R} f_{X, Y}(x_0, y) \, \mathrm{d}y}.
\end{aligned}
</math>
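A short numerical sketch can make this limit concrete. The following Python snippet is an illustrative check, not part of the formal development: the bivariate normal density, the point <math>x_0 = 1</math>, and the event <math>U = [0, 2]</math> are arbitrary choices. It estimates <math>P(Y \in U \mid x_0 - \epsilon \le X \le x_0 + \epsilon)</math> by Riemann sums for shrinking <math>\epsilon</math> and compares the result with the ratio of single integrals at <math>x = x_0</math>:
<syntaxhighlight lang="python">
import numpy as np

rho = 0.6  # correlation of an illustrative bivariate normal density

def f_xy(x, y):
    """Joint density of a standard bivariate normal with correlation rho."""
    q = (x**2 - 2 * rho * x * y + y**2) / (1 - rho**2)
    return np.exp(-q / 2) / (2 * np.pi * np.sqrt(1 - rho**2))

x0 = 1.0
ys = np.linspace(-8.0, 8.0, 4001)   # y grid carrying essentially all mass
in_U = (ys >= 0.0) & (ys <= 2.0)    # the event {Y in U} with U = [0, 2]

def p_cond(eps):
    """P(Y in U | x0-eps <= X <= x0+eps) via Riemann sums.

    Both grids are uniform, so the cell widths cancel in the ratio."""
    xs = np.linspace(x0 - eps, x0 + eps, 401)
    F = f_xy(xs[:, None], ys[None, :])
    return F[:, in_U].sum() / F.sum()

# The claimed limit: ratio of single integrals of f(x0, y) over U and over R.
fy = f_xy(x0, ys)
limit = fy[in_U].sum() / fy.sum()

for eps in (1.0, 0.1, 0.01):
    print(f"eps = {eps:<5} conditional probability = {p_cond(eps):.6f}")
print(f"ratio of integrals at x0:      {limit:.6f}")
</syntaxhighlight>
As <math>\epsilon</math> shrinks, the printed conditional probabilities approach the ratio of integrals, as the derivation above predicts.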
It is tempting to ''define'' the undefined probability <math>P(A \mid X=x)</math> using limit ({{EquationNote|1}}), but this cannot be done in a consistent manner. In particular, it is possible to find random variables {{mvar|X}} and {{mvar|W}} and values {{mvar|x}}, {{mvar|w}} such that the events <math>\{X = x\}</math> and <math>\{W = w\}</math> are identical but the resulting limits are not:
:<math>\lim_{\epsilon \to 0} P(A \mid x - \epsilon \le X \le x + \epsilon) \neq \lim_{\epsilon \to 0} P(A \mid w - \epsilon \le W \le w + \epsilon).</math>
The [[Borel–Kolmogorov paradox]] demonstrates this with a geometrical argument.
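The paradox can also be observed numerically. In the following Monte Carlo sketch (a hedged illustration; the sample size and band width <code>eps</code> are arbitrary), points are drawn uniformly on a sphere and two shrinking bands around great circles are used as conditioning events: a band of near-equatorial latitudes leaves the longitude uniformly distributed, while a band of near-zero longitudes leaves the latitude with a non-uniform, cosine-shaped density, even though both bands shrink to a great circle:
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
v = rng.normal(size=(2_000_000, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)   # uniform points on the sphere

lat = np.arcsin(v[:, 2])               # latitude in [-pi/2, pi/2]
lon = np.arctan2(v[:, 1], v[:, 0])     # longitude in (-pi, pi]
eps = 0.01                             # half-width of each conditioning band

# Condition on a thin band around the equator: longitude stays uniform.
h_eq, _ = np.histogram(lon[np.abs(lat) < eps], bins=6,
                       range=(-np.pi, np.pi), density=True)
# Condition on a thin band around the zero meridian: latitude is cos-shaped.
h_mer, _ = np.histogram(lat[np.abs(lon) < eps], bins=6,
                        range=(-np.pi / 2, np.pi / 2), density=True)

print("longitude density | equator band :", np.round(h_eq, 3))   # ~flat
print("latitude  density | meridian band:", np.round(h_mer, 3))  # ~cosine
</syntaxhighlight>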
=== Example ===
When [[Morse code]] is transmitted, there is a certain probability that the "dot" or "dash" that was received is erroneous, often because of interference in the transmission of the message. It is therefore important to consider, when a "dot" is received, the probability that a "dot" was in fact sent. This is represented by: <math>P(\text{dot sent } | \text{ dot received}) = P(\text{dot received } | \text{ dot sent}) \frac{P(\text{dot sent})}{P(\text{dot received})}.</math> In Morse code, the ratio of dots to dashes is 3:4 at the point of sending, so <math>P(\text{dot sent}) = \frac{3}{7}</math> and <math>P(\text{dash sent}) = \frac{4}{7}</math>. If it is assumed that the probability that a transmitted symbol is received correctly is <math>\frac{9}{10}</math> (so that a dash is received as a dot with probability <math>\frac{1}{10}</math>), then:
: <math>P(\text{dot received}) = P(\text{dot received} \cap \text{dot sent}) + P(\text{dot received} \cap \text{dash sent})</math>
: <math>P(\text{dot received}) = P(\text{dot received} \mid \text{dot sent})P(\text{dot sent}) + P(\text{dot received} \mid \text{dash sent})P(\text{dash sent})</math>
: <math>P(\text{dot received}) = \frac{9}{10}\times\frac{3}{7} + \frac{1}{10}\times\frac{4}{7} = \frac{31}{70}</math>
Now, <math>P(\text{dot sent } | \text{ dot received})</math> can be computed:
: <math>P(\text{dot sent } | \text{ dot received}) = P(\text{dot received } | \text{ dot sent}) \frac{P(\text{dot sent})}{P(\text{dot received})} = \frac{9}{10} \times \frac{\frac{3}{7}}{\frac{31}{70}} = \frac{27}{31}</math>
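The arithmetic above is easy to verify with exact fractions; the following Python snippet is a simple check of the worked example and reproduces both results:
<syntaxhighlight lang="python">
from fractions import Fraction as F

p_dot_sent = F(3, 7)              # dots : dashes = 3 : 4 at the sender
p_dash_sent = F(4, 7)
p_rec_dot_given_dot = F(9, 10)    # assumed probability of correct reception
p_rec_dot_given_dash = F(1, 10)   # a dash mistakenly received as a dot

# Law of total probability for the received symbol:
p_dot_received = (p_rec_dot_given_dot * p_dot_sent
                  + p_rec_dot_given_dash * p_dash_sent)
assert p_dot_received == F(31, 70)

# Bayes' theorem for the inverse conditional:
p_dot_sent_given_rec = p_rec_dot_given_dot * p_dot_sent / p_dot_received
assert p_dot_sent_given_rec == F(27, 31)

print(p_dot_received, p_dot_sent_given_rec)   # 31/70 27/31
</syntaxhighlight>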
== Statistical independence ==
is also equivalent. Although the derived forms may seem more intuitive, they are not the preferred definition, as the conditional probabilities may be undefined; moreover, the preferred definition is symmetrical in ''A'' and ''B''. Independence is not the same as the events being mutually exclusive (disjoint).<ref>{{Cite book|last=Tijms|first=Henk|url=https://www.cambridge.org/core/books/understanding-probability/B82E701FAAD2C0C2CF36E05CFC0FF3F2|title=Understanding Probability|date=2012|publisher=Cambridge University Press|isbn=978-1-107-65856-1|edition=3|___location=Cambridge|doi=10.1017/cbo9781139206990}}</ref>
Given the independent event pair [''A'' ''B''] and an event ''C'', the pair is defined as [[Conditional independence|conditionally independent]] if the following product holds true:
: <math>P(AB \mid C) = P(A \mid C)P(B \mid C).</math>
This theorem can be useful in applications where multiple independent events are being observed.
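As a hedged illustration of this definition (the two-coin sample space and the particular choices of ''C'' are arbitrary, not from the text above), the following Python snippet enumerates a finite sample space and tests the product criterion, finding one conditioning event under which the independent pair remains conditionally independent and one under which it does not:
<syntaxhighlight lang="python">
from itertools import product

# Two independent fair coin flips as a finite sample space.
omega = list(product("HT", repeat=2))
P = {w: 1 / 4 for w in omega}

def prob(event):
    return sum(P[w] for w in omega if event(w))

def cond(event, given):
    return prob(lambda w: event(w) and given(w)) / prob(given)

A = lambda w: w[0] == "H"   # first flip is heads
B = lambda w: w[1] == "H"   # second flip is heads (independent of A)

for name, C in [("first flip is heads", lambda w: w[0] == "H"),
                ("at least one head",   lambda w: "H" in w)]:
    lhs = cond(lambda w: A(w) and B(w), C)
    rhs = cond(A, C) * cond(B, C)
    print(f"C = {name}: P(AB|C) = {lhs:.3f}, "
          f"P(A|C)P(B|C) = {rhs:.3f}, equal: {abs(lhs - rhs) < 1e-12}")
</syntaxhighlight>
The second case shows that independence of ''A'' and ''B'' alone does not guarantee the product holds for every ''C''; conditional independence must be checked against the conditioning event.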
'''Independent events vs. mutually exclusive events'''
=== Assuming conditional probability is of similar size to its inverse ===
{{Main|Confusion of the inverse}}
[[File:Bayes theorem visualisation.svg|thumb|450x450px|A geometric visualization of Bayes' theorem. In the table, the values 2, 3, 6 and 9 give the relative weights of each corresponding condition and case. The figures denote the cells of the table involved in each metric, the probability being the fraction of each figure that is shaded. This shows that <math>P(A|B) P(B) = P(B|A) P(A)</math>, i.e. <math>P(A|B) = \frac{P(B|A) P(A)}{P(B)}</math>.]]
In general, it cannot be assumed that ''P''(''A''|''B'') ≈ ''P''(''B''|''A''). This can be an insidious error, even for those who are highly conversant with statistics.<ref>Paulos, J.A. (1988) ''Innumeracy: Mathematical Illiteracy and its Consequences'', Hill and Wang. {{ISBN|0-8090-7447-8}} (p. 63 ''et seq.'')</ref> The relationship between ''P''(''A''|''B'') and ''P''(''B''|''A'') is given by [[Bayes' theorem]]:
:<math>\begin{align}
P(B \mid A) &= \frac{P(A \mid B)\, P(B)}{P(A)}
\end{align}</math>
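A standard numerical illustration of why <math>P(A|B)</math> and <math>P(B|A)</math> must not be conflated is a screening test for a rare condition; the base rate, sensitivity, and false-positive rate in the following Python sketch are illustrative values, not figures from this article:
<syntaxhighlight lang="python">
# Illustrative values: a rare condition A and a test result B ("positive").
p_A = 0.01                 # base rate P(A)
p_B_given_A = 0.99         # sensitivity P(B|A)
p_B_given_not_A = 0.05     # false-positive rate P(B|not A)

# Law of total probability for P(B):
p_B = p_B_given_A * p_A + p_B_given_not_A * (1 - p_A)

# Bayes' theorem for the inverse conditional P(A|B):
p_A_given_B = p_B_given_A * p_A / p_B

print(f"P(B|A) = {p_B_given_A:.2f}")
print(f"P(A|B) = {p_A_given_B:.3f}")   # about 0.167 -- far from 0.99
</syntaxhighlight>
Even with a highly sensitive test, the low base rate drags <math>P(A|B)</math> far below <math>P(B|A)</math>, which is the essence of the confusion-of-the-inverse fallacy.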