Probability mass function: Difference between revisions

There are three major associated distributions: the [[Bernoulli distribution]], the [[binomial distribution]], and the [[geometric distribution]].
 
*[[Bernoulli distribution]], Ber(p), is used to model an experiment with only two possible outcomes. The two outcomes are often encoded as 1 and 0. <math display="block">p_X(x) = \begin{cases}
p, & \text{if }x\text{ is 1} \\
1-p, & \text{if }x\text{ is 0}
\end{cases}</math> An example of the Bernoulli distribution is tossing a coin. Suppose that <math>S</math> is the sample space of all outcomes of a single toss of a fair coin, and <math>X</math> is the random variable defined on <math>S</math> assigning 0 to the category "tails" and 1 to the category "heads". Since the coin is fair, the probability mass function is <math display="block">p_X(x) = \begin{cases}
\frac{1}{2}, &x \in \{0, 1\},\\
0, &x \notin \{0, 1\}.
\end{cases}</math>
*<p>[[Binomial distribution]], Bin(n,p), models the number of successes when someone draws n times with replacement. Each draw or experiment is independent, with two possible outcomes. The associated probability mass function is <math>\binom{n}{k}p^k (1-p)^{n-k}</math>. [[Image:Fair dice probability distribution.svg|right|thumb|The probability mass function of a [[Dice|fair die]]. All the numbers on the {{dice}} have an equal chance of appearing on top when the die stops rolling.]]</p><!--
 
--><p>An example of the binomial distribution is the probability of getting exactly one 6 when someone rolls a fair die three times.</p>
*<p>[[Geometric distribution]], Geo(p), describes the number of trials needed to get one success. Its probability mass function is <math>p_X(k) = (1-p)^{k-1} p</math>.</p><!--
 
--><p>An example is tossing a coin until the first head appears.</p>
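The three probability mass functions above can be sketched in a few lines of Python (a minimal illustration; the helper names are ours, not from the article):

```python
from math import comb

def bernoulli_pmf(x, p):
    """PMF of Ber(p): p for x = 1, 1 - p for x = 0, and 0 elsewhere."""
    return {1: p, 0: 1 - p}.get(x, 0.0)

def binomial_pmf(k, n, p):
    """PMF of Bin(n, p): C(n, k) p^k (1 - p)^(n - k) for 0 <= k <= n."""
    if not 0 <= k <= n:
        return 0.0
    return comb(n, k) * p**k * (1 - p)**(n - k)

def geometric_pmf(k, p):
    """PMF of Geo(p): (1 - p)^(k - 1) p for k = 1, 2, 3, ..."""
    if k < 1:
        return 0.0
    return (1 - p)**(k - 1) * p

# A fair coin: P(X = 1) = P(X = 0) = 1/2.
print(bernoulli_pmf(1, 0.5))    # 0.5
# Exactly one 6 in three rolls of a fair die: C(3,1)(1/6)(5/6)^2 = 75/216.
print(binomial_pmf(1, 3, 1/6))
# First head of a fair coin on toss 3: (1/2)^2 * (1/2) = 0.125.
print(geometric_pmf(3, 0.5))    # 0.125
```

Each function returns 0 outside the distribution's support, matching the piecewise definitions above.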
Other distributions that can be modeled using a probability mass function are the [[Categorical distribution]] (also known as the generalized Bernoulli distribution) and the [[multinomial distribution]].
* If a discrete distribution has two or more categories, one of which occurs on each trial, then it is a categorical distribution when there is only a single trial (draw), whether or not the categories have a natural ordering.
* An example of a [[Joint probability distribution|multivariate discrete distribution]], and of its probability mass function, is provided by the [[multinomial distribution]]. Here the multiple random variables are the numbers of successes in each of the categories after a given number of trials, and each non-zero probability mass gives the probability of a certain combination of numbers of successes in the various categories.
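The multinomial probability mass function just described, <math>\frac{n!}{k_1! \cdots k_m!} p_1^{k_1} \cdots p_m^{k_m}</math>, can be sketched as follows (an illustrative implementation, assuming per-category probabilities that sum to 1):

```python
from math import factorial, prod

def multinomial_pmf(counts, probs):
    """Probability of seeing exactly counts[i] outcomes in category i
    over n = sum(counts) independent trials, where probs[i] is the
    per-trial probability of category i."""
    n = sum(counts)
    coef = factorial(n)
    for k in counts:
        coef //= factorial(k)   # multinomial coefficient n! / (k_1! ... k_m!)
    return coef * prod(p ** k for p, k in zip(probs, counts))

# Six rolls of a fair die showing each face exactly once:
# 6! * (1/6)^6 = 720 / 46656.
print(multinomial_pmf([1, 1, 1, 1, 1, 1], [1/6] * 6))
```

With a single trial (n = 1) this reduces to the categorical distribution, and with two categories it reduces to the binomial distribution.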
 
{{clear}}
 
===Infinite===
 
*The following exponentially declining distribution is an example of a distribution with an infinite number of possible outcomes—all the positive integers: <math display="block">\text{Pr}(X=i)= \frac{1}{2^i}\quad \text{for}\quad i=1, 2, 3, \dots .</math> Despite the infinite number of possible outcomes, the total probability mass is 1/2 + 1/4 + 1/8 + ... = 1, satisfying the unit total probability requirement for a probability distribution.
 
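The convergence of the total probability mass to 1 can be checked numerically (a quick sketch; the cutoff of 50 terms is arbitrary):

```python
# Pr(X = i) = 1/2^i for i = 1, 2, 3, ...
def pmf(i):
    return 0.5 ** i

# Partial sums 1/2 + 1/4 + ... + 1/2^50 approach the required total mass of 1;
# after 50 terms the remaining mass is exactly 2^-50.
total = sum(pmf(i) for i in range(1, 51))
print(total)  # very close to 1
```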
 
==Multivariate case==