Local regression: Difference between revisions

An important extension of local regression is local likelihood estimation, formulated by [[Robert Tibshirani]] and [[Trevor Hastie]] (1987).<ref>{{citeQ|Q132187702}}</ref> This replaces the local least-squares criterion with a likelihood-based criterion, thereby extending the local regression method to the [[Generalized linear model]] setting; for example, binary, count, or censored data.
 
Practical implementations of local regression began appearing in statistical software in the 1980s. Cleveland (1981)<ref>{{citeQ|Q29541549}}</ref> introduced the LOWESS routines, intended for smoothing scatterplots. These implemented local linear fitting with a single predictor variable, and also introduced robustness downweighting to make the procedure resistant to outliers. An entirely new implementation, LOESS, is described in Cleveland and [[Susan J. Devlin]] (1988).<ref name="clevedev">{{citeQ|Q29393395}}</ref> LOESS is a multivariate smoother, able to handle spatial data with two (or more) predictor variables, and uses (by default) local quadratic fitting. Both LOWESS and LOESS are implemented in the [[S (programming language)|S]] and [[R (programming language)|R]] programming languages. See also Cleveland's Local Fitting Software.<ref>{{cite web |last=Cleveland |first=William |title=Local Fitting Software |url=https://web.archive.org/web/20050912090738/http://www.stat.purdue.edu/~wsc/localfitsoft.html}}</ref>
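The core computation in these implementations, local linear fitting with tricube weights, can be sketched as follows. This is an illustrative simplification, not the LOWESS or LOESS source code: it omits the robustness iterations and computational shortcuts those routines use, and the function name, default span, and test data are chosen here for the example.

```python
import numpy as np

def local_linear_fit(x, y, x0, span=0.3):
    """Estimate the regression function at x0 by weighted least squares
    over the nearest span-fraction of the data (the basic local linear
    step of LOWESS, without the robustness downweighting iterations)."""
    n = len(x)
    k = int(np.ceil(span * n))           # number of points in the local window
    d = np.abs(x - x0)
    idx = np.argsort(d)[:k]              # k nearest neighbours of x0
    h = d[idx].max()                     # local bandwidth: distance to farthest neighbour
    w = (1 - (d[idx] / h) ** 3) ** 3     # tricube weight function
    # Weighted least-squares fit of a local line y = b0 + b1 * (x - x0)
    X = np.column_stack([np.ones(k), x[idx] - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y[idx])
    return beta[0]                       # intercept = fitted value at x0

# Example: smooth a noisy sine curve (synthetic data for illustration)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)
y = np.sin(x) + rng.normal(scale=0.2, size=100)
y_hat = np.array([local_linear_fit(x, y, x0) for x0 in x])
```

Evaluating the fit at every point of interest, as in the last line, yields the smoothed curve; production implementations instead fit at a small set of points and interpolate.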
 
While the terms local regression, LOWESS, and LOESS are sometimes used interchangeably, this usage is imprecise: local regression is a general term for the fitting procedure, whereas LOWESS and LOESS are two distinct implementations of it.