{{short description|Probability distribution of the sum of random variables}}
The '''convolution/sum of probability distributions''' arises in [[probability theory]] and [[statistics]] as the operation in terms of [[probability distribution]]s that corresponds to the addition of [[statistically independent|independent]] [[random variable]]s and, by extension, to forming linear combinations of random variables. The operation here is a special case of [[convolution]] in the context of probability distributions.
==Introduction==
The [[probability distribution]] of the sum of two or more [[independent (probability)|independent]] [[random variable]]s is the convolution of their individual distributions. The term is motivated by the fact that the [[probability mass function]] or [[probability density function]] of a sum of independent random variables is the [[convolution]] of their corresponding probability mass functions or probability density functions respectively. Many well known distributions have simple convolutions: see [[List of convolutions of probability distributions]].
The general formula for the distribution of the sum <math>Z=X+Y</math> of two independent integer-valued (and hence discrete) random variables is<ref>Statistics 116. Stanford. https://web.archive.org/web/20210413200454/http://statweb.stanford.edu/~susan/courses/s116/node114.html</ref>
:<math>P(Z=z) = \sum_{k=-\infty}^\infty P(X=k)P(Y=z-k)</math>
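The discrete convolution sum above can be evaluated directly by iterating over all pairs of outcomes. A minimal sketch in Python, using two fair six-sided dice as an illustrative choice of <math>X</math> and <math>Y</math> (not taken from the text):

```python
# Convolve two discrete PMFs, given as dicts mapping value -> probability,
# to obtain the PMF of Z = X + Y for independent X and Y.
def convolve_pmfs(pmf_x, pmf_y):
    pmf_z = {}
    for x, px in pmf_x.items():
        for y, py in pmf_y.items():
            # Each pair (x, y) contributes P(X=x) * P(Y=y) to P(Z = x + y).
            pmf_z[x + y] = pmf_z.get(x + y, 0.0) + px * py
    return pmf_z

die = {k: 1 / 6 for k in range(1, 7)}  # fair six-sided die
pmf_sum = convolve_pmfs(die, die)
print(pmf_sum[7])  # 6/36 ≈ 0.1667, the most likely total of two dice
```

The double loop is a finite version of the sum over <math>k</math>: for each value <math>z</math>, it accumulates <math>P(X=k)P(Y=z-k)</math> over all pairs that add to <math>z</math>.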
For independent, continuous random variables with [[probability density function]]s (PDF) <math>f,g</math> and [[cumulative distribution function]]s (CDF) <math>F,G</math> respectively, we have that the CDF of the sum is:
:<math>H(z) = \int_{-\infty}^{\infty} F(z-t)\,g(t)~dt = \int_{-\infty}^{\infty} G(t)\,f(z-t)~dt</math>
If we start with random variables <math>X</math> and <math>Y</math>, related by <math>Z = X + Y</math>, and with no information about their possible independence, then:
:<math>f_Z(z) = \int \limits_{-\infty}^{\infty} f_{XY}(x, z-x)~dx</math>
However, if <math>X</math> and <math>Y</math> are independent, then:
:<math>f_{XY}(x,y) = f_X(x) f_Y(y)</math>
and this formula becomes the convolution of probability distributions:
:<math>f_Z(z) = \int \limits_{-\infty}^{\infty} f_{X}(x)~f_Y(z-x)~dx</math>
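The convolution integral above can be approximated numerically. A minimal sketch, taking <math>X, Y \sim \mathrm{Uniform}(0,1)</math> as an illustrative choice (not from the text), for which the sum is known to have a triangular density on <math>[0,2]</math>:

```python
# Density of Uniform(0, 1): the indicator of [0, 1].
def f_uniform(t):
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

# Riemann-sum approximation of f_Z(z) = integral of f_X(x) * f_Y(z - x) dx.
# The integrand vanishes outside [-1, 2], so that range suffices here.
def f_Z(z, n=200_000, lo=-1.0, hi=2.0):
    dx = (hi - lo) / n
    return sum(
        f_uniform(lo + i * dx) * f_uniform(z - (lo + i * dx)) for i in range(n)
    ) * dx

# Triangular density: rises to 1 at z = 1, falls back to 0 at z = 2.
print(f_Z(0.5), f_Z(1.0), f_Z(1.5))
```

The printed values should be close to 0.5, 1.0, and 0.5, matching the triangular density <math>f_Z(z) = z</math> for <math>0 \le z \le 1</math> and <math>f_Z(z) = 2 - z</math> for <math>1 \le z \le 2</math>.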
== Example derivation ==
=== Convolution of Bernoulli distributions ===
The convolution of two independent identically distributed [[Bernoulli distribution|Bernoulli random variables]] is a [[binomial distribution|binomial random variable]]; that is, in shorthand notation,
:<math> \sum_{i=1}^2 \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(2,p)</math>
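This identity can be checked directly: convolving the Bernoulli PMF with itself reproduces the Binomial(2, p) PMF. A minimal sketch, with <math>p = 0.3</math> as an illustrative parameter:

```python
from itertools import product
from math import comb, isclose

p = 0.3  # illustrative success probability
bernoulli = {0: 1 - p, 1: p}

# Convolution of the Bernoulli PMF with itself: sum over all outcome pairs.
pmf_sum = {}
for x, y in product(bernoulli, bernoulli):
    pmf_sum[x + y] = pmf_sum.get(x + y, 0.0) + bernoulli[x] * bernoulli[y]

# Binomial(2, p) PMF computed from its closed form.
binomial = {k: comb(2, k) * p**k * (1 - p) ** (2 - k) for k in range(3)}

print(all(isclose(pmf_sum[k], binomial[k]) for k in range(3)))  # True
```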