Local regression: Difference between revisions

Zaqrfv (talk | contribs)
Start "further reading" section
Zaqrfv (talk | contribs)
m fix cites
Line 34:
The [[Savitzky–Golay filter]], introduced by [[Abraham Savitzky]] and [[Marcel J. E. Golay]] (1964),<ref>{{cite Q|Q56769732}}</ref> significantly expanded the method. Like the earlier graduation work, their focus was data with an equally spaced predictor variable, where (excluding boundary effects) local regression can be represented as a [[convolution]]. Savitzky and Golay published extensive tables of convolution coefficients for different polynomial orders and smoothing window widths.
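
As an illustration, the sketch below (a minimal example, not taken from the 1964 paper; the five-point quadratic window is an arbitrary choice) shows how, on an equally spaced grid, the local polynomial fit at interior points reduces to convolution with a fixed set of coefficients:

<syntaxhighlight lang="python">
# Sketch: on an equally spaced grid, local polynomial smoothing at interior
# points is a convolution with fixed coefficients.  The 5-point quadratic
# window is an illustrative choice, not taken from the 1964 paper.
import numpy as np

window, degree = 5, 2
offsets = np.arange(window) - window // 2            # positions relative to centre
A = np.vander(offsets, degree + 1, increasing=True)  # local design matrix
coeffs = np.linalg.pinv(A)[0]                        # weights for the fitted value
print(coeffs)                                        # (-3, 12, 17, 12, -3) / 35,
                                                     # matching the tabulated values

y = np.sin(np.linspace(0, 3, 50)) + 0.1 * np.random.default_rng(0).normal(size=50)
smoothed = np.convolve(y, coeffs, mode="valid")      # symmetric kernel, interior only
</syntaxhighlight>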
 
Local regression methods started to appear extensively in the statistics literature in the 1970s; for example, [[Charles Joel Stone|Charles J. Stone]] (1977),<ref>{{cite Q|Q56533608}}</ref> [[Vladimir Katkovnik]] (1979)<ref>{{citation |first=Vladimir|last=Katkovnik|title=Linear and nonlinear methods of nonparametric regression analysis|journal=Soviet Automatic Control|date=1979|volume=12|issue=5|pages=25–34}}</ref> and [[William S. Cleveland]] (1979).<ref name="cleve79">{{cite Q|Q30052922}}</ref> Katkovnik (1985)<ref name="katbook">{{cite Q|Q132129931}}</ref> is the earliest book devoted primarily to local regression methods.
 
Theoretical work continued to appear throughout the 1990s. Important contributions include [[Jianqing Fan]] and [[Irène Gijbels]] (1992)<ref>{{cite Q|Q132202273}}</ref> studying efficiency properties, and [[David Ruppert]] and [[Matthew P. Wand]] (1994)<ref>{{cite Q|Q132202598}}</ref> developing an asymptotic distribution theory for multivariate local regression.
Line 189:
</math>
 
An asymptotic theory for local likelihood estimation is developed in J. Fan, [[Nancy E. Heckman]] and M. P. Wand (1995);<ref>{{cite Q|Q132508409}}</ref> the book by Loader (1999)<ref name="loabook">{{cite Q|Q59410587}}</ref> discusses many more applications of local likelihood.
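
As a concrete illustration, the following is a minimal sketch of a local likelihood fit, assuming a logistic log-likelihood, a local linear fit, tricube weights and a fixed bandwidth <code>h</code>; none of these specific choices comes from the cited references:

<syntaxhighlight lang="python">
# Sketch of local likelihood estimation: a locally weighted linear logistic
# fit at x0, maximised by Newton-Raphson.  The logistic likelihood, tricube
# weights and bandwidth h are illustrative assumptions.
import numpy as np

def local_logistic(x0, x, y, h, iters=25):
    u = np.abs(x - x0) / h
    w = np.where(u < 1, (1 - u**3) ** 3, 0.0)       # tricube kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local linear basis
    a = np.zeros(2)
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ a))                # fitted probabilities
        grad = X.T @ (w * (y - p))                  # weighted score vector
        hess = (X * (w * p * (1 - p))[:, None]).T @ X  # weighted information
        a += np.linalg.solve(hess, grad)            # Newton-Raphson update
    return 1 / (1 + np.exp(-a[0]))                  # estimated P(Y=1 | x = x0)
</syntaxhighlight>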
 
====Robust local regression====
Line 199:
\right )
</math>
where <math>\rho(\cdot)</math> is a robustness function and <math>s</math> is a scale parameter. Discussion of the merits of different choices of robustness function is best left to the [[robust regression]] literature. The scale parameter <math>s</math> must also be estimated. References for local M-estimation include Katkovnik (1985)<ref name="katbook">{{cite Q|Q132129931}}</ref> and [[Alexandre Tsybakov]] (1986).<ref>{{citation |first=Alexandre B.|last=Tsybakov|title=Robust reconstruction of functions by the local-approximation method|journal=Problems of Information Transmission|date=1986|volume=22|pages=133–146}}</ref>
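
A hypothetical sketch of how such a criterion can be minimised by iteratively reweighted least squares follows; the Huber <math>\rho</math>, tricube kernel and MAD-based scale estimate are illustrative choices, not taken from the cited references:

<syntaxhighlight lang="python">
# Sketch of local M-estimation by iteratively reweighted least squares.
# Huber rho (c = 1.345), tricube kernel and a MAD scale estimate are
# illustrative choices, not taken from the cited references.
import numpy as np

def local_m_fit(x0, x, y, h, c=1.345, iters=10):
    u = np.abs(x - x0) / h
    w = np.where(u < 1, (1 - u**3) ** 3, 0.0)       # tricube kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local linear basis
    rw = np.ones_like(y)                            # robustness weights
    for _ in range(iters):
        sw = np.sqrt(w * rw)                        # combined weights for WLS
        a, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
        r = y - X @ a                               # residuals
        s = np.median(np.abs(r[w > 0])) / 0.6745    # MAD estimate of scale
        z = np.abs(r) / (c * s)
        rw = np.where(z <= 1, 1.0, 1.0 / z)         # Huber psi(r)/r weights
    return a[0]                                     # robust fitted value at x0
</syntaxhighlight>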
 
The robustness iterations in LOWESS and LOESS correspond to the robustness function defined by