{{Short description|Statistical regression model}}
{{About|the statistical method|additive color models|Additive color}}
In [[statistics]], an '''additive model''' ('''AM''') is a [[nonparametric regression]] method. It was suggested by [[Jerome H. Friedman]] and Werner Stuetzle (1981)<ref>Friedman, J.H. and Stuetzle, W. (1981). "Projection Pursuit Regression", ''Journal of the American Statistical Association'' 76:817–823. {{doi|10.1080/01621459.1981.10477729}}</ref> and is an essential part of the [[Alternating conditional expectation model|ACE]] algorithm. The ''AM'' uses a one-dimensional [[Smoothing|smoother]] to build a restricted class of nonparametric regression models. Because of this, it is less affected by the [[curse of dimensionality]] than, for example, a ''p''-dimensional smoother. Furthermore, the ''AM'' is more flexible than a [[linear regression|standard linear model]], while being more interpretable than a general regression surface, at the cost of approximation errors. Problems with ''AM'' include [[model selection]], [[overfitting]], and [[multicollinearity]].
==Description==
Given a [[data]] set <math>\{y_i,\, x_{i1}, \ldots, x_{ip}\}_{i=1}^n</math> of ''n'' [[statistical unit]]s, where <math>\{x_{i1}, \ldots, x_{ip}\}_{i=1}^n</math> represent predictors and <math>y_i</math> is the outcome, the ''additive model'' takes the form
: <math>\mathrm{E}[y_i|x_{i1}, \ldots, x_{ip}] = \beta_0+\sum_{j=1}^p f_j(x_{ij}) </math>
or
: <math>Y= \beta_0+\sum_{j=1}^p f_j(X_{j})+\varepsilon </math>
where <math>\mathrm{E}[\varepsilon] = 0</math>, <math>\mathrm{Var}(\varepsilon) = \sigma^2</math> and <math>\mathrm{E}[f_j(X_{j})] = 0</math>. The functions <math>f_j</math> are unknown [[smooth function]]s fit from the data. Fitting the ''AM'' (i.e. the functions <math>f_j</math>) can be done using the [[backfitting algorithm]].
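The fit can be sketched in code. Below is a minimal, illustrative implementation (not part of the original article) of backfitting with a Gaussian kernel smoother standing in for the one-dimensional smoother; the function names, toy data, and bandwidth are assumptions chosen for the example:

```python
import numpy as np

def smooth(x, r, bandwidth=0.3):
    """Nadaraya-Watson kernel smoother: fitted values of r at each x."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ r) / w.sum(axis=1)

def backfit(X, y, n_iter=20, bandwidth=0.3):
    """Fit y ~ beta0 + sum_j f_j(X_j) by cycling over predictors."""
    n, p = X.shape
    beta0 = y.mean()
    f = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: remove the current fit of all other f_k
            partial = y - beta0 - f.sum(axis=1) + f[:, j]
            fj = smooth(X[:, j], partial, bandwidth)
            f[:, j] = fj - fj.mean()  # centre so that E[f_j] = 0
    return beta0, f

# toy additive data: y = x1^2 + sin(x2) + noise
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(300, 2))
y = X[:, 0] ** 2 + np.sin(X[:, 1]) + rng.normal(0, 0.1, 300)

beta0, f = backfit(X, y)
resid = y - beta0 - f.sum(axis=1)
```

Each pass smooths the partial residuals of one predictor while holding the other component functions fixed; centring each <math>f_j</math> keeps the intercept identifiable, matching the constraint <math>\mathrm{E}[f_j(X_j)] = 0</math> above.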
==See also==
*[[Generalized additive model]]
*[[Backfitting algorithm]]
*[[Projection pursuit regression]]
*[[Generalized additive model for ___location, scale, and shape]] (GAMLSS)
*[[Median polish]]
*[[Projection pursuit]]
==References==
{{reflist}}
==Further reading==
*Breiman, L. and Friedman, J.H. (1985). "Estimating Optimal Transformations for Multiple Regression and Correlation", ''[[Journal of the American Statistical Association]]'' 80:580–598. {{doi|10.1080/01621459.1985.10478157}}
[[Category:Regression analysis]]
[[Category:Nonparametric regression]]