{{Short description|Statistical regression model}}
{{About|the statistical method|additive color models|Additive color}}
In [[statistics]], an '''additive model''' ('''AM''') is a [[nonparametric regression]] method. It was suggested by [[Jerome H. Friedman]] and Werner Stuetzle (1981)<ref>[[Friedman, J.H.]] and Stuetzle, W. (1981). "Projection Pursuit Regression", ''Journal of the American Statistical Association'' 76:817–823. {{doi|10.1080/01621459.1981.10477729}}</ref> and is an essential part of the [[Alternating conditional expectations|ACE]] algorithm. The ''AM'' uses a one-dimensional [[Smoothing|smoother]] to build a restricted class of nonparametric regression models. Because of this, it is less affected by the [[curse of dimensionality]] than a ''p''-dimensional smoother. Furthermore, the ''AM'' is more flexible than a [[linear regression|standard linear model]], while being more interpretable than a general regression surface, at the cost of approximation error. Like many other machine-learning methods, the ''AM'' is subject to problems such as [[model selection]], [[overfitting]], and [[multicollinearity]].
==Description==
Given a [[data]] set <math>\{y_i,\, x_{i1}, \ldots, x_{ip}\}_{i=1}^n</math> of ''n'' [[statistical unit]]s, where <math>\{x_{i1}, \ldots, x_{ip}\}_{i=1}^n</math> represent predictors and <math>y_i</math> is the outcome, the ''additive model'' takes the form
: <math>\mathrm{E}[y_i|x_{i1}, \ldots, x_{ip}] = \beta_0+\sum_{j=1}^p f_j(x_{ij}) </math>
or
: <math>Y= \beta_0+\sum_{j=1}^p f_j(X_{j})+\varepsilon </math>
where <math>\mathrm{E}[ \varepsilon ] = 0</math>, <math>\mathrm{Var}(\varepsilon) = \sigma^2</math> and <math>\mathrm{E}[ f_j(X_{j}) ] = 0</math>. The functions <math>f_j(x_{ij})</math> are unknown [[smooth function]]s fitted from the data. Fitting the ''AM'' (i.e. the functions <math>f_j(x_{ij})</math>) can be done using the [[backfitting algorithm]] proposed by Andreas Buja, [[Trevor Hastie]] and [[Robert Tibshirani]] (1989).<ref>Buja, A., Hastie, T., and Tibshirani, R. (1989). "Linear Smoothers and Additive Models", ''The Annals of Statistics'' 17(2):453–555. {{JSTOR|2241560}}</ref>
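The backfitting idea above can be sketched in a few lines: each component function is repeatedly re-estimated by smoothing the partial residuals against its own predictor. The sketch below is a minimal illustration, not Buja, Hastie and Tibshirani's implementation; the crude moving-average smoother stands in for whatever one-dimensional smoother (splines, loess, kernel smoothers) one would use in practice, and the function names are hypothetical.

```python
import numpy as np

def moving_average_smoother(x, y, window=11):
    """Crude 1-D smoother: running mean of y over x-sorted order.
    A stand-in for any one-dimensional smoother (splines, loess, ...)."""
    order = np.argsort(x)
    smoothed_sorted = np.convolve(y[order], np.ones(window) / window, mode="same")
    smoothed = np.empty_like(smoothed_sorted)
    smoothed[order] = smoothed_sorted  # restore original observation order
    return smoothed

def backfit(X, y, n_iter=20):
    """Fit E[y|x] = beta0 + sum_j f_j(x_j) by backfitting."""
    n, p = X.shape
    beta0 = y.mean()               # intercept absorbs the overall level
    f = np.zeros((n, p))           # fitted values of each component f_j
    for _ in range(n_iter):
        for j in range(p):
            # partial residuals: subtract the intercept and all other components
            r = y - beta0 - f.sum(axis=1) + f[:, j]
            f[:, j] = moving_average_smoother(X[:, j], r)
            f[:, j] -= f[:, j].mean()  # identifiability: enforce E[f_j] = 0
    return beta0, f

# Simulated additive data: y = sin(x1) + x2^2 + noise
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(0.0, 0.1, size=500)
beta0, f = backfit(X, y)
residuals = y - beta0 - f.sum(axis=1)
```

Because each inner step only requires a one-dimensional smooth of residuals against a single predictor, the procedure avoids smoothing in ''p'' dimensions, which is the source of the AM's resistance to the curse of dimensionality.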
==See also==
*[[Backfitting algorithm]]
*[[Projection pursuit regression]]
*[[Generalized additive model for ___location, scale, and shape]] (GAMLSS)
*[[Median polish]]
*[[Projection pursuit]]
==References==
{{reflist}}
==Further reading==
*Breiman, L. and [[Friedman, J.H.]] (1985). "Estimating Optimal Transformations for Multiple Regression and Correlation", ''[[Journal of the American Statistical Association]]'' 80:580–598. {{doi|10.1080/01621459.1985.10478157}}
[[Category:Nonparametric regression]]
[[Category:Regression models]]