Probability mass function

In probability theory, a probability mass function (abbreviated pmf) gives the probability that a discrete random variable is exactly equal to some value. The probability mass function differs from the probability density function in that the latter, defined only for continuous random variables, does not describe an actual probability but rather the rate of change of the cumulative distribution function.

Mathematical description

Suppose that X is a discrete random variable, taking values on some countable sample space S ⊆ R. Then the probability mass function  f_X(x)  for X is given by

 f_X(x) = \begin{cases} \Pr(X = x), & x \in S, \\ 0, & x \in \mathbb{R} \setminus S. \end{cases}

Note that this explicitly defines  f_X(x)  for all real numbers, including all values in R that X could never take; indeed, it assigns such values a probability of zero. (Alternatively, think of  Pr(X = x)  as 0 when  x ∈ R \ S.)
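
As a concrete illustration of this definition, here is a minimal sketch in Python (the language and the helper make_pmf are illustrative assumptions, not part of the article): a pmf stored as a table on the support S, returning zero for every other real number.

    def make_pmf(support):
        """support: a dict mapping each x in S to Pr(X = x)."""
        def f_X(x):
            # f_X(x) = Pr(X = x) for x in S, and 0 for x in R \ S.
            return support.get(x, 0.0)
        return f_X

    # Example: a fair six-sided die (illustrative values).
    die = make_pmf({k: 1/6 for k in range(1, 7)})
    assert abs(sum(die(k) for k in range(1, 7)) - 1.0) < 1e-12
    assert die(2.5) == 0.0  # 2.5 lies in R \ S, so it gets probability zero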

The discontinuity of probability mass functions reflects the fact that the cumulative distribution function of a discrete random variable is also discontinuous. Where it is differentiable (i.e. where  x ∈ R \ S) the derivative is zero, just as the probability mass function is zero at all such points.
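
This connection can also be sketched in code: summing the pmf over the points of S up to x gives the cumulative distribution function, a step function that jumps exactly at the points of S and is flat (derivative zero) everywhere in between. A hedged sketch; make_cdf is a hypothetical helper, not from the article.

    def make_cdf(support):
        """support: a dict mapping each x in S to Pr(X = x)."""
        def F_X(x):
            # F_X(x) = sum of f_X(s) over s in S with s <= x.
            return sum(p for s, p in support.items() if s <= x)
        return F_X

    F = make_cdf({0: 0.5, 1: 0.5})
    assert F(-1.0) == 0.0   # below the support: no jumps yet
    assert F(0.5) == 0.5    # flat between the jumps at 0 and 1
    assert F(2.0) == 1.0    # above the support: all mass accumulated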

Examples

A simple example of a probability mass function is the following. Suppose that X is the outcome of a single coin toss, assigning 0 to tails and 1 to heads. The probability that X = x is just 0.5 on the state space {0, 1} (this is a Bernoulli random variable), and hence the probability mass function is

 f_X(x) = \begin{cases} \tfrac{1}{2}, & x \in \{0, 1\}, \\ 0, & x \in \mathbb{R} \setminus \{0, 1\}. \end{cases}
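
Written out directly (a sketch; the function name is illustrative), the coin-toss pmf above is:

    def coin_pmf(x):
        # 1/2 on the state space {0, 1}, zero on R \ {0, 1}.
        return 0.5 if x in (0, 1) else 0.0

    assert coin_pmf(0) + coin_pmf(1) == 1.0  # the mass sums to one
    assert coin_pmf(7) == 0.0                # any other value has probability zero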

Probability mass functions exist for any discrete random variable, including constant, binomial (and in particular Bernoulli), negative binomial, Poisson, geometric and hypergeometric random variables.
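
For the standard families just listed, ready-made pmfs are available in scientific libraries. A brief sketch, assuming SciPy is installed (the article itself names no library):

    from scipy import stats

    print(stats.binom.pmf(3, n=10, p=0.5))          # binomial: Pr(X = 3) in 10 fair trials
    print(stats.poisson.pmf(2, mu=4.0))             # Poisson with mean 4
    print(stats.geom.pmf(5, p=0.2))                 # geometric: first success on trial 5
    print(stats.nbinom.pmf(4, n=3, p=0.5))          # negative binomial: 4 failures before 3 successes
    print(stats.hypergeom.pmf(2, M=20, n=7, N=12))  # hypergeometric: 2 successes in 12 draws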