Additive model

{{Short description|Statistical regression model}}
{{About|the statistical method|additive color models|Additive color}}
In [[statistics]], an '''additive model''' ('''AM''') is a [[nonparametric regression]] method. It was suggested by [[Jerome H. Friedman]] and Werner Stuetzle (1981)<ref>[[Friedman, J.H.]] and Stuetzle, W. (1981). "Projection Pursuit Regression", ''Journal of the American Statistical Association'' 76:817&ndash;823. {{doi|10.1080/01621459.1981.10477729}}</ref> and is an essential part of the [[Alternating conditional expectations|ACE]] algorithm. The ''AM'' uses a one-dimensional [[Smoothing|smoother]] to build a restricted class of nonparametric regression models. Because of this, it is less affected by the [[curse of dimensionality]] than e.g. a ''p''-dimensional smoother. Furthermore, the ''AM'' is more flexible than a [[linear regression|standard linear model]], while being more interpretable than a general regression surface, at the cost of approximation errors. Problems with the ''AM'', as with many other machine learning methods, include [[model selection]], [[overfitting]], and [[multicollinearity]].
 
==Description==
The model assumes that the conditional expectation of a response <math>Y</math> given predictors <math>X_1, \ldots, X_p</math> is additive:
: <math>\mathrm{E}[Y \mid X_1, \ldots, X_p]= \beta_0+\sum_{j=1}^p f_j(X_{j})</math>
or
: <math>Y= \beta_0+\sum_{j=1}^p f_j(X_{j})+\varepsilon </math>
where <math>\mathrm{E}[ \varepsilon ] = 0</math>, <math>\mathrm{Var}(\varepsilon) = \sigma^2</math> and <math>\mathrm{E}[ f_j(X_{j}) ] = 0</math>. The functions <math>f_j(x_{ij})</math> are unknown [[smooth function]]s fitted from the data. Fitting the ''AM'' (i.e. the functions <math>f_j(x_{ij})</math>) can be done using the [[backfitting algorithm]] proposed by Andreas Buja, [[Trevor Hastie]] and [[Robert Tibshirani]] (1989).<ref>Buja, A., Hastie, T., and Tibshirani, R. (1989). "Linear Smoothers and Additive Models", ''The Annals of Statistics'' 17(2):453&ndash;555. {{JSTOR|2241560}}</ref>
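The backfitting procedure can be sketched in a few lines: each component <math>f_j</math> is repeatedly re-estimated by smoothing the partial residuals against <math>X_j</math>, then centered so that <math>\mathrm{E}[f_j(X_j)] = 0</math>. The following is a minimal illustration, not a reference implementation; the running-mean smoother and all function names are illustrative stand-ins for a production one-dimensional smoother such as loess or a smoothing spline.

```python
import numpy as np

def running_mean_smoother(x, y, frac=0.1):
    """Nearest-neighbour running-mean smoother: a crude stand-in
    for any one-dimensional smoother (loess, smoothing spline, ...)."""
    n = len(x)
    k = max(2, int(frac * n))          # window size: a fraction of the data
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    fitted = np.empty(n)
    for i in range(n):
        lo = max(0, i - k // 2)
        hi = min(n, lo + k)
        fitted[i] = ys[lo:hi].mean()   # local average around the i-th point
    out = np.empty(n)
    out[order] = fitted                # undo the sort
    return out

def backfit_additive_model(X, y, n_iter=20):
    """Fit y ~ beta0 + sum_j f_j(X[:, j]) by backfitting."""
    n, p = X.shape
    beta0 = y.mean()                   # intercept; f_j are kept centered
    f = np.zeros((n, p))               # f[:, j] holds f_j evaluated at the data
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: everything except the current component
            partial = y - beta0 - f.sum(axis=1) + f[:, j]
            fj = running_mean_smoother(X[:, j], partial)
            f[:, j] = fj - fj.mean()   # enforce E[f_j(X_j)] = 0
    return beta0, f

# toy example with a truly additive signal: y = sin(3*x1) + x2^2 + noise
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0.0, 0.3, 500)
beta0, f = backfit_additive_model(X, y)
resid = y - beta0 - f.sum(axis=1)
```

On this toy data the fitted components recover the sine and quadratic shapes up to smoothing bias, and the residual variance shrinks toward the noise variance, illustrating why a set of one-dimensional smoothers can fit a ''p''-dimensional additive surface without facing the full curse of dimensionality.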
 
==See also==
*[[Generalized additive model for ___location, scale, and shape]] (GAMLSS)
*[[Median polish]]
*[[Projection pursuit]]
 
==References==
{{Reflist}}