As described above, local regression uses a locally weighted least squares criterion to estimate the regression parameters. This inherits many of the advantages (ease of implementation and interpretation; good properties when errors are normally distributed) and disadvantages (sensitivity to extreme values and outliers; inefficiency when errors have unequal variance or are not normally distributed) usually associated with least squares regression.
These disadvantages can be addressed by replacing the local least-squares estimation by something else. Two such ideas are presented here:
====Local likelihood estimation====
In local likelihood estimation, developed in Tibshirani and Hastie (1987),<ref name="tib-hast-lle" /> the observations <math>Y_i</math> are assumed to come from a parametric family of distributions, with a known probability density function (or mass function, for discrete data) whose parameters vary with <math>x</math>. The parameters are then estimated at each fitting point by maximizing a locally weighted log-likelihood, rather than by minimizing a locally weighted sum of squares.
An asymptotic theory for local likelihood estimation is developed in J. Fan, [[Nancy E. Heckman]] and M. P. Wand (1995);<ref>{{cite Q|Q132508409}}</ref> the book by Loader (1999)<ref>{{cite Q|Q59410587}}</ref> discusses many more applications of local likelihood.
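As an illustrative sketch (not from the sources cited above), the following Python code fits a local likelihood model for binary data: a local logistic regression, in which a locally weighted Bernoulli log-likelihood is maximized at a fitting point by Newton's method. The tricube weight function, bandwidth, and polynomial degree are conventional choices, not prescribed by the method itself.

```python
import numpy as np

def local_logistic(x0, x, y, h, degree=1, n_iter=25):
    """Local likelihood fit of a logistic model at the point x0.

    Maximizes the tricube-weighted Bernoulli log-likelihood
        sum_i w_i(x0) [ y_i * eta_i - log(1 + exp(eta_i)) ],
    where eta_i is a polynomial in (x_i - x0), using Newton's method.
    Returns the fitted probability P(Y = 1 | X = x0).
    """
    t = (x - x0) / h
    w = np.where(np.abs(t) < 1, (1 - np.abs(t) ** 3) ** 3, 0.0)  # tricube weights
    X = np.vander(x - x0, degree + 1, increasing=True)           # local polynomial basis
    beta = np.zeros(degree + 1)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        grad = X.T @ (w * (y - p))                    # weighted score vector
        H = (X * (w * p * (1 - p))[:, None]).T @ X    # weighted information matrix
        beta += np.linalg.solve(H, grad)
    return 1.0 / (1.0 + np.exp(-beta[0]))

# Simulated binary responses with a smoothly varying success probability.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 400))
p_true = 1.0 / (1.0 + np.exp(-(4 * x - 2)))
y = (rng.uniform(size=400) < p_true).astype(float)
print(local_logistic(0.5, x, y, h=0.3))  # true probability at x0 = 0.5 is 0.5
```

Replacing the Bernoulli likelihood with a Poisson or other exponential-family likelihood gives local likelihood fits for counts and other non-Gaussian responses in the same way.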
====Robust local regression====
To address the sensitivity to outliers, techniques from [[robust regression]] can be employed. In local [[M-estimator|M-estimation]], the local least-squares criterion is replaced by a locally weighted M-estimation criterion, in which the squared residuals are replaced by a robust loss function <math>\rho(\cdot)</math> that grows less rapidly than the square for large residuals, thereby bounding the influence of outlying observations.
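A minimal Python sketch of local M-estimation (not from the source): a local polynomial is fitted at a point by minimizing the locally weighted Huber loss, solved via iteratively reweighted least squares (IRLS). The tricube weights, the Huber tuning constant <math>c = 1.345</math>, and the IRLS solver are assumed choices for illustration.

```python
import numpy as np

def local_huber(x0, x, y, h, c=1.345, degree=1, n_iter=50):
    """Local M-estimation at x0: minimize
        sum_i w_i(x0) * rho(y_i - poly(x_i - x0))
    with the Huber loss rho, by iteratively reweighted least squares.
    Returns the fitted value of the regression function at x0.
    """
    t = (x - x0) / h
    w = np.where(np.abs(t) < 1, (1 - np.abs(t) ** 3) ** 3, 0.0)  # tricube weights
    X = np.vander(x - x0, degree + 1, increasing=True)           # local polynomial basis
    # Start from the ordinary locally weighted least-squares fit.
    sw = np.sqrt(w)
    beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        # Huber weight psi(r)/r: 1 inside [-c, c], c/|r| outside.
        u = np.where(np.abs(r) <= c, 1.0, c / np.maximum(np.abs(r), 1e-12))
        wr = w * u
        beta = np.linalg.solve((X * wr[:, None]).T @ X, X.T @ (wr * y))
    return beta[0]

# Smooth signal with small Gaussian noise plus occasional gross outliers.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 300))
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=300)
y[::25] += 5.0  # inject outliers
print(local_huber(0.25, x, y, h=0.2))  # true value sin(pi/2) = 1
```

Because the Huber loss is quadratic near zero and linear in the tails, the IRLS weights leave typical residuals untouched while downweighting the injected outliers, so the fit stays close to the underlying curve.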