==History==
Local regression and closely related procedures have a long and rich history, having been discovered and rediscovered in different fields on multiple occasions. An early work by [[Robert Henderson (mathematician)|Robert Henderson]],<ref>Henderson, R. "Note on Graduation by Adjusted Average". ''Actuarial Society of America Transactions'' 17, 43–48. [https://archive.org/details/transactions17actuuoft archive.org]</ref> studying the problem of graduation (a term for smoothing used in the actuarial literature), introduced local regression using cubic polynomials and showed how earlier graduation methods could be interpreted as local polynomial fitting. [[William S. Cleveland]] and [[Catherine Loader]] (1995)<ref>{{citeQ|Q132138257}}</ref> discuss more of the historical work on graduation.
The [[Savitzky–Golay filter]], introduced by [[Abraham Savitzky]] and [[Marcel J. E. Golay]] (1964),<ref>{{citeQ|Q56769732}}</ref> significantly expanded the method. Like the earlier graduation work, the focus was on data with an equally spaced predictor variable, where (excluding boundary effects) local regression can be represented as a [[convolution]]. Savitzky and Golay published extensive sets of convolution coefficients for different orders of polynomial and smoothing window widths.
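The convolution representation can be illustrated with a minimal sketch in Python (assuming NumPy; the quadratic polynomial and five-point window below are illustrative choices, not a particular table from Savitzky and Golay's paper):

<syntaxhighlight lang="python">
import numpy as np

# Design matrix for fitting a quadratic polynomial over a
# five-point window of equally spaced offsets -2, ..., 2.
offsets = np.arange(-2, 3)
A = np.vander(offsets, 3, increasing=True)  # columns: 1, t, t^2

# The fitted value at the window centre (t = 0) is a fixed linear
# combination of the five observations; its weights are the first
# row of the pseudoinverse (A^T A)^{-1} A^T.
weights = np.linalg.pinv(A)[0]  # equals [-3, 12, 17, 12, -3] / 35

# Away from the boundaries, smoothing is then just a convolution
# of the data with these fixed weights.
rng = np.random.default_rng(0)
y = np.sin(np.linspace(0, 3, 50)) + 0.1 * rng.normal(size=50)
smoothed = np.convolve(y, weights[::-1], mode="valid")
</syntaxhighlight>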
Local regression methods started to appear extensively in the statistics literature in the 1970s; an early example is [[Charles Joel Stone|Charles J. Stone]] (1977).<ref>{{citeQ|Q56533608}}</ref>
Extensive theoretical work continued to appear throughout the 1990s. Important contributions include [[Jianqing Fan]] and [[Irène Gijbels]] (1992)<ref>{{citeQ|Q132202273}}</ref> studying efficiency properties, and [[David Ruppert]] and [[Matthew P. Wand]] (1994)<ref>{{citeQ|Q132202598}}</ref> developing an asymptotic distribution theory for multivariate local regression.
An important extension of local regression is local likelihood estimation, formulated by [[Robert Tibshirani]] and [[Trevor Hastie]] (1987).<ref name="tib-hast-lle">{{citeQ|Q132187702}}</ref> This replaces the local least-squares criterion with a likelihood-based criterion, thereby extending the local regression method to the [[generalized linear model]] setting; for example, binary data, count data, and censored data.
Practical implementations of local regression began appearing in statistical software in the 1980s. Cleveland (1981)<ref>{{citeQ|Q29541549}}</ref> introduced the LOWESS routines, intended for smoothing scatterplots. These implement local linear fitting with a single predictor variable, and introduce robustness downweighting to make the procedure resistant to outliers. An entirely new implementation, LOESS, is described in Cleveland and [[Susan J. Devlin]] (1988).<ref name="clevedev">{{citeQ|Q29393395}}</ref>
Although the terms local regression, LOWESS, and LOESS are sometimes used interchangeably, this usage is incorrect: local regression is a general term for the fitting procedure, while LOWESS and LOESS are two distinct implementations.
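As a usage sketch, a LOWESS fit can be computed with the implementation in the Python statsmodels package (one of several modern implementations of Cleveland's method; the settings below are illustrative):

<syntaxhighlight lang="python">
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = np.sin(x) + rng.normal(scale=0.3, size=200)

# frac sets the smoothing span (the fraction of the data used in
# each local fit); it sets the number of robustness iterations
# that downweight outliers, as in Cleveland's original LOWESS.
fitted = lowess(y, x, frac=0.3, it=3)  # columns: sorted x, fitted values
</syntaxhighlight>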
====Local Likelihood Estimation====
In local likelihood estimation, developed in Tibshirani and Hastie (1987),<ref name="tib-hast-lle" /> the observations are modeled as
<math display="block">
Y_i \sim f(y,\theta(x_i)),
</math>
where <math>f(y,\theta)</math> is a known parametric family of probability densities, and <math>\theta(x)</math> is a smooth function of the predictor to be estimated. Rather than a locally weighted sum of squares, a locally weighted log-likelihood is maximized to estimate <math>\theta(x)</math> at each point of interest.
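A minimal sketch of the idea for binary responses, assuming NumPy (the Gaussian kernel, the bandwidth <code>h</code>, and the helper name <code>local_logistic</code> are illustrative choices, not part of Tibshirani and Hastie's formulation): the parameter is modeled as a low-degree polynomial on the logit scale, and the kernel-weighted Bernoulli log-likelihood is maximized by Newton's method.

<syntaxhighlight lang="python">
import numpy as np

def local_logistic(x, y, x0, h, degree=1, iters=25):
    """Local likelihood estimate of P(Y=1 | X=x0) for binary y.

    Maximizes the kernel-weighted Bernoulli log-likelihood
    sum_i w_i * [y_i * eta_i - log(1 + exp(eta_i))], where eta_i
    is a degree-`degree` polynomial in (x_i - x0), via Newton's method.
    """
    t = x - x0
    X = np.vander(t, degree + 1, increasing=True)      # columns: 1, t, ...
    w = np.exp(-0.5 * (t / h) ** 2)                    # Gaussian kernel weights
    beta = np.zeros(degree + 1)
    for _ in range(iters):
        eta = X @ beta
        p = 1.0 / (1.0 + np.exp(-eta))
        grad = X.T @ (w * (y - p))                     # weighted score
        hess = (X * (w * p * (1 - p))[:, None]).T @ X  # weighted information
        beta += np.linalg.solve(hess, grad)
    # beta[0] estimates theta(x0) on the logit scale; map back to a probability.
    return 1.0 / (1.0 + np.exp(-beta[0]))

# Example: estimate a smoothly varying success probability on a grid.
rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 300)
y = rng.binomial(1, 1 / (1 + np.exp(-3 * np.sin(2 * np.pi * x))), size=300)
grid = np.linspace(0.05, 0.95, 19)
p_hat = [local_logistic(x, y, x0, h=0.15) for x0 in grid]
</syntaxhighlight>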