In [[statistics]], an '''additive model''' ('''AM''') is a [[nonparametric regression]] method. It was suggested by Jerome H. Friedman and Werner Stuetzle (1981)<ref>Friedman, J.H. and Stuetzle, W. (1981). "Projection Pursuit Regression", ''Journal of the American Statistical Association'' 76:817–823</ref> and is an essential part of the [[Alternating conditional expectation model|ACE]] algorithm. The ''AM'' uses a one-dimensional [[Smoothing|smoother]] to build a restricted class of nonparametric regression models. Because of this, it is less affected by the [[curse of dimensionality]] than, for example, a full ''p''-dimensional smoother. Furthermore, the ''AM'' is more flexible than a [[linear regression|standard linear model]] and more interpretable than a general regression surface, at the cost of approximation error. Problems with the ''AM'' include [[model selection]], [[overfitting]], and [[multicollinearity]].
==Description==
The model can be written as
: <math>Y= \beta_0+\sum_{j=1}^p f_j(X_{j})+\varepsilon </math>
where <math>E[\varepsilon] = 0</math>, <math>\operatorname{Var}(\varepsilon) = \sigma^2</math> and <math>E[f_j(X_{j})] = 0</math>. The functions <math>f_j(X_{j})</math> are unknown [[Smooth function|smooth functions]] fitted from the data. Fitting the ''AM'' (i.e. the functions <math>f_j(X_{j})</math>) can be done using the [[backfitting algorithm]] proposed by Andreas Buja, [[Trevor Hastie]] and [[Robert Tibshirani]] (1989).<ref>Buja, A., Hastie, T., and Tibshirani, R. (1989). "Linear Smoothers and Additive Models", ''The Annals of Statistics'' 17(2):453–555.</ref>
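The backfitting idea can be sketched in a few lines: estimate <math>\beta_0</math> by the sample mean of <math>Y</math>, then cycle over the predictors, smoothing the partial residuals against each <math>X_j</math> in turn and re-centering so that each fitted <math>f_j</math> has mean zero. The following minimal illustration uses a crude running-mean smoother; the smoother choice, variable names, and stopping rule are assumptions for this sketch, not the formulation of Buja, Hastie and Tibshirani.

```python
import numpy as np

def running_mean_smoother(x, r, window=15):
    # Crude nearest-neighbour running-mean smoother (illustrative only;
    # any one-dimensional smoother could be substituted here).
    order = np.argsort(x)
    n = len(x)
    half = window // 2
    fitted = np.empty_like(r)
    for rank, idx in enumerate(order):
        lo, hi = max(0, rank - half), min(n, rank + half + 1)
        fitted[idx] = r[order[lo:hi]].mean()
    return fitted

def backfit(X, y, n_iter=20):
    # Backfitting for Y = beta_0 + sum_j f_j(X_j) + eps.
    n, p = X.shape
    beta0 = y.mean()          # intercept: sample mean of y
    f = np.zeros((n, p))      # fitted component functions, initialized to 0
    for _ in range(n_iter):   # fixed iteration count stands in for a convergence test
        for j in range(p):
            # Partial residual: remove all components except the j-th.
            partial = y - beta0 - f.sum(axis=1) + f[:, j]
            f[:, j] = running_mean_smoother(X[:, j], partial)
            f[:, j] -= f[:, j].mean()   # enforce E[f_j(X_j)] = 0
    return beta0, f

# Simulated example with additive truth f_1(x) = sin(x), f_2(x) = x^2.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(400, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(0.0, 0.2, size=400)
beta0, f = backfit(X, y)
fitted = beta0 + f.sum(axis=1)
```

Because each smoothing step only ever sees one predictor at a time, the procedure sidesteps the curse of dimensionality that a joint <math>p</math>-dimensional smoother would face.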
==See also==
==References==
{{reflist}}
==Further reading==
*Breiman, L. and Friedman, J.H. (1985). "Estimating Optimal Transformations for Multiple Regression and Correlation", ''[[Journal of the American Statistical Association]]'' 80:580–598.
[[Category:Regression analysis]]