In probability theory, a probability mass function (abbreviated pmf) gives the probability that a discrete random variable is exactly equal to some value. The probability mass function differs from the probability density function in that the latter, defined only for continuous random variables, does not describe an actual probability but rather a rate of change in the cumulative distribution function.
Mathematical description
Suppose that X is a discrete random variable, that is, one that takes values in some countable set S. We may assume that S ⊂ R (this identification suffices here; strictly speaking, X is a function from a sample space into R, and S is the countable set of values it can take). Then the probability mass function fX(x) for X is given by

    fX(x) = Pr(X = x).
Note that this explicitly defines fX(x) for all real numbers, including all values in R that X can never take; indeed, it assigns such values a probability of zero. (In other words, Pr(X = x) = 0 whenever x ∈ R\S.)
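As a concrete sketch of this definition (not part of the formal statement), a pmf with finite support can be modelled in Python as a function that returns the stored probability on S and zero everywhere else; make_pmf and the support values below are hypothetical:

    from fractions import Fraction

    def make_pmf(table):
        # table maps each support value in S to its probability;
        # any x outside S is assigned probability zero.
        if sum(table.values()) != 1:
            raise ValueError("probabilities must sum to 1")
        return lambda x: table.get(x, 0)

    # Hypothetical discrete random variable supported on {1, 2, 3}.
    f = make_pmf({1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)})
    print(f(2))    # 1/3
    print(f(2.5))  # 0 -- a value X can never take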
The discontinuity of probability mass functions reflects the fact that the cumulative distribution function of a discrete random variable is also discontinuous: it is a step function that jumps at each point of S. Where the cumulative distribution function is differentiable (i.e. at points x ∈ R\S) its derivative is zero, just as the probability mass function is zero at all such points.
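To see the step behaviour concretely, here is a minimal sketch under a hypothetical pmf: it evaluates F(t) = Pr(X ≤ t) by summing the pmf over support points at most t, and F is constant on the gaps between support points.

    # Hypothetical pmf supported on {1, 2, 4}.
    pmf = {1: 0.25, 2: 0.5, 4: 0.25}

    def cdf(t):
        # F(t) = Pr(X <= t): sum the pmf over support points <= t.
        return sum(p for x, p in pmf.items() if x <= t)

    print(cdf(0))  # 0.0  (below the support)
    print(cdf(3))  # 0.75 (constant on the gap between 2 and 4)
    print(cdf(4))  # 1.0  (after the final jump)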
Examples
A simple example of a probability mass function is the following. Suppose that X is the outcome of a single fair coin toss, assigning 0 to tails and 1 to heads. The probability that X = x is 0.5 on the state space {0, 1} (this is a Bernoulli random variable), and hence the probability mass function is

    fX(x) = 0.5 if x ∈ {0, 1},
    fX(x) = 0 if x ∉ {0, 1}.
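In code, this pmf is just a piecewise function; a minimal sketch (the name coin_pmf is ours):

    def coin_pmf(x):
        # Fair coin: Pr(X = 0) = Pr(X = 1) = 0.5, zero elsewhere.
        return 0.5 if x in (0, 1) else 0.0

    assert coin_pmf(0) + coin_pmf(1) == 1.0  # probabilities sum to 1
    print(coin_pmf(1))    # 0.5
    print(coin_pmf(0.3))  # 0.0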
A probability mass function exists for every discrete random variable, including constant, binomial (of which Bernoulli is the special case n = 1), negative binomial, Poisson, geometric and hypergeometric random variables.
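For instance, assuming SciPy is available, scipy.stats provides a pmf method for each of these families; the parameter values below are arbitrary illustrations:

    from scipy import stats

    print(stats.binom.pmf(3, n=10, p=0.5))         # binomial: Pr(X = 3) in 10 fair trials
    print(stats.nbinom.pmf(2, n=5, p=0.4))         # negative binomial
    print(stats.poisson.pmf(2, mu=3.0))            # Poisson with mean 3
    print(stats.geom.pmf(4, p=0.2))                # geometric
    print(stats.hypergeom.pmf(1, M=20, n=7, N=5))  # hypergeometric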