Variance function




In statistics, the variance function is a function that relates the variance of a random quantity to the conditional mean of that quantity. The variance function is a main ingredient of the generalized linear model framework and also plays a role in non-parametric regression and functional data analysis. It should not be confused with the variance of a function. In parametric modelling, variance functions explicitly describe the relationship between the variance and the conditional mean of a random variable. For many well-known distributions, the variance function represents the complete variance of a random variable under that distribution, but these are special cases.
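As a brief illustration (standard facts, stated here in the common notation <math>V(\mu)</math> for the variance written as a function of the mean <math>\mu</math>, which is not defined elsewhere on this page), several familiar distributions have variances that are simple functions of their means:

<math>\text{normal: } V(\mu) = 1, \qquad \text{Poisson: } V(\mu) = \mu, \qquad \text{Bernoulli: } V(\mu) = \mu(1 - \mu), \qquad \text{gamma: } V(\mu) = \mu^2.</math>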

Intuition

Overview

Types

The variance function and its applications come up in many areas of statistical analysis. A very important use of this function is in the framework of generalized linear models and non-parametric regression.

Generalized Linear Model

Derivation

The generalized linear model (GLM) is a generalization of ordinary regression analysis that extends to any member of the exponential family. It is particularly useful when the response variable is categorical, binary, or subject to a constraint (e.g. only positive responses make sense). The components of a GLM are summarized briefly on this page; for more details and information see the page on generalized linear models.
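As a short sketch of where the variance function enters (using the usual GLM notation for the link function <math>g</math>, covariates <math>x_i</math>, coefficients <math>\beta</math> and dispersion parameter <math>\phi</math>, none of which is defined elsewhere on this page), a GLM ties the conditional mean of the response to a linear predictor, while the variance function describes how the conditional variance depends on that mean:

<math>g(\mu_i) = x_i^{\mathsf T}\beta, \qquad \operatorname{Var}(y_i \mid x_i) = \phi\, V(\mu_i).</math>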

Any random variable <math>y</math> in the exponential family has a probability density function of the form

<math>f_y(y; \theta, \phi) = \exp\!\left(\frac{y\theta - b(\theta)}{\phi} + c(y, \phi)\right)</math>

with log-likelihood

<math>\ell(\theta; y, \phi) = \log f_y(y; \theta, \phi) = \frac{y\theta - b(\theta)}{\phi} + c(y, \phi).</math>

Here, <math>\theta</math> is the canonical parameter and the parameter of interest, and <math>\phi</math> is a nuisance parameter which plays a role in the variance. We use Bartlett's identities to derive a general expression for the variance function. The first and second Bartlett results ensure that, under suitable regularity conditions, for a density function dependent on <math>\theta</math>, <math>f_y(y; \theta)</math>,

<math>\operatorname{E}_\theta\!\left[\frac{\partial}{\partial\theta}\ell(\theta \mid y)\right] = 0</math>

<math>\operatorname{E}_\theta\!\left[\frac{\partial^2}{\partial\theta^2}\ell(\theta \mid y)\right] + \operatorname{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}\ell(\theta \mid y)\right)^{\!2}\right] = 0.</math>
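For intuition (a standard argument, added here as a sketch rather than taken from the text above), the first identity follows by differentiating the normalization condition of the density with respect to <math>\theta</math> and exchanging differentiation and integration; differentiating once more in the same way yields the second identity:

<math>0 = \frac{\partial}{\partial\theta}\int f_y(y; \theta)\,dy = \int \frac{\partial}{\partial\theta}\ell(\theta \mid y)\, f_y(y; \theta)\,dy = \operatorname{E}_\theta\!\left[\frac{\partial}{\partial\theta}\ell(\theta \mid y)\right].</math>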

These identities lead to simple calculations of the expected value and variance of any random variable <math>y</math> in the exponential family.

Expected Value of y

Taking the first derivative with respect to <math>\theta</math> of the log of the density in the exponential family form described above, we have

<math>\frac{\partial}{\partial\theta}\ell(\theta; y, \phi) = \frac{\partial}{\partial\theta}\!\left(\frac{y\theta - b(\theta)}{\phi} + c(y, \phi)\right) = \frac{y - b'(\theta)}{\phi}.</math>

Then taking the expected value and setting it equal to zero leads to,

<math>\operatorname{E}\!\left[\frac{y - b'(\theta)}{\phi}\right] = 0</math>

<math>\operatorname{E}[y] = b'(\theta).</math>
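As a quick check (an example added here, not part of the original derivation), the Poisson distribution with mean <math>\lambda</math> can be written in the exponential family form above with <math>\theta = \log\lambda</math>, <math>b(\theta) = e^{\theta}</math>, <math>\phi = 1</math> and <math>c(y, \phi) = -\log(y!)</math>, so that

<math>\operatorname{E}[y] = b'(\theta) = e^{\theta} = \lambda,</math>

in agreement with the known mean of the Poisson distribution.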

Variance of y

To compute the variance we use the second Bartlett identity,

<math>\operatorname{E}_\theta\!\left[\frac{\partial^2}{\partial\theta^2}\ell(\theta \mid y)\right] + \operatorname{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}\ell(\theta \mid y)\right)^{\!2}\right] = 0</math>

<math>\operatorname{E}\!\left[-\frac{b''(\theta)}{\phi}\right] + \operatorname{E}\!\left[\left(\frac{y - b'(\theta)}{\phi}\right)^{\!2}\right] = 0</math>

<math>\operatorname{Var}(y) = \phi\, b''(\theta).</math>

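Writing <math>\mu = b'(\theta)</math> for the mean, the variance function in this framework is <math>V(\mu) = b''(\theta(\mu))</math>, so that <math>\operatorname{Var}(y) = \phi\, V(\mu)</math> (a standard consequence of the result above, stated here for completeness). Continuing the Poisson illustration added earlier, <math>b''(\theta) = e^{\theta} = \mu</math> and <math>\phi = 1</math>, which recovers the familiar fact that the Poisson variance equals its mean:

<math>\operatorname{Var}(y) = \phi\, b''(\theta) = \mu.</math>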

Examples

Normal
Binomial
Poisson
Gamma

Application

Maximum Likelihood Estimation
Quasi Likelihood

Non-Parametric Regression Analysis

See Also

References

Generalized linear models
Quasi-likelihood
Non-parametric regression