Local regression

The trade-off for these features is increased computation. Because it is so computationally intensive, LOESS would have been practically impossible to use in the era when least squares regression was being developed. Most other modern methods for process modeling are similar to LOESS in this respect. These methods have been consciously designed to use our current computational ability to the fullest possible advantage to achieve goals not easily achieved by traditional approaches.
 
A smooth curve through a set of data points obtained with this statistical technique is called a '''loess curve''', particularly when each smoothed value is given by a weighted quadratic least squares regression over the span of values of the ''y''-axis [[scattergram]] criterion variable. When each smoothed value is given by a weighted linear least squares regression over the span, this is known as a '''lowess curve'''; however, some authorities treat '''lowess''' and loess as synonyms.<ref>Kristen Pavlik, US Environmental Protection Agency, ''[https://19january2021snapshot.epa.gov/sites/static/files/2016-07/documents/loess-lowess.pdf Loess (or Lowess)]'', '''Nutrient Steps''', July 2016.</ref><ref name="NIST"/>
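
The local fit behind these curves can be illustrated with a short sketch. The following Python code is an illustration only; the tricube weight function and the nearest-neighbour span are conventional LOESS choices assumed here, not specified in this passage. It evaluates a weighted polynomial least squares fit at one target point: <code>degree=1</code> corresponds to a lowess-style local linear fit and <code>degree=2</code> to a loess-style local quadratic fit.

<syntaxhighlight lang="python">
import numpy as np

def local_fit_point(x0, x, y, frac=0.5, degree=1):
    """Weighted local polynomial fit evaluated at the target point x0.

    degree=1 gives a lowess-style local linear fit, degree=2 a
    loess-style local quadratic fit.  Tricube weights over the k
    nearest neighbours are an assumed (conventional) choice.
    """
    n = len(x)
    k = max(int(np.ceil(frac * n)), degree + 2)   # number of points in the span
    dist = np.abs(x - x0)
    idx = np.argsort(dist)[:k]                    # k nearest neighbours of x0
    h = dist[idx].max()                           # local bandwidth
    w = (1.0 - (dist[idx] / h) ** 3) ** 3         # tricube weights
    # Weighted least squares fit of a polynomial in (x - x0)
    X = np.vander(x[idx] - x0, degree + 1, increasing=True)
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y[idx])
    return beta[0]                                # fitted value at x0 (the intercept)

# Smooth a noisy sine curve at every observed x
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 200))
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)
lowess_curve = np.array([local_fit_point(t, x, y, frac=0.3, degree=1) for t in x])
loess_curve = np.array([local_fit_point(t, x, y, frac=0.3, degree=2) for t in x])
</syntaxhighlight>

Established implementations, such as the <code>loess</code> and <code>lowess</code> functions in R linked below, add refinements such as robustness iterations and are preferable to a sketch like this in practice.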
 
==History==
==Local likelihood==
When <math>f(y,\theta(x))</math> is the normal distribution and <math>\theta(x)</math> is the mean function, the local likelihood method reduces to the standard local least-squares regression. For other likelihood families, there is (usually) no closed-form solution for the local likelihood estimate, and iterative procedures such as [[iteratively reweighted least squares]] must be used to compute the estimate.
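
For the Gaussian case the reduction can be seen directly. Writing <math>w_i(x)</math> for the smoothing weights (notation assumed here for illustration), and taking <math>f</math> to be the normal density with mean <math>\theta(x_i)</math> and constant variance <math>\sigma^2</math>, the locally weighted log-likelihood is
<math display="block">
\sum_i w_i(x) \log f(y_i, \theta(x_i)) = \text{const} - \frac{1}{2\sigma^2} \sum_i w_i(x) \left ( y_i - \theta(x_i) \right )^2 ,
</math>
so maximizing the local likelihood is equivalent to minimizing the locally weighted sum of squares.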
 
'''Example''' (local logistic regression). All response observations are 0 or 1, and the mean function is the "success" probability, <math>\mu(x_i) = \Pr (Y_i=1 | x_i)</math>. Since <math>\mu(x_i)</math> must be between 0 and 1, a local polynomial model should not be used for <math>\mu(x)</math> directly. Instead, the logistic transformation
<math display="block">
\theta(x) = \log \left ( \frac{\mu(x)}{1-\mu(x)} \right )
</math>
is modeled with a local polynomial, and the fit is computed by maximizing the local likelihood.
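
As an illustration of the iterative computation (not code from any of the references; the tricube neighbourhood weights and the local linear model for <math>\theta(x)</math> are assumptions made for the sketch), the following Python function estimates <math>\mu(x_0)</math> at a single target point by Newton iterations on the locally weighted Bernoulli log-likelihood, which for this model are the iteratively reweighted least squares updates.

<syntaxhighlight lang="python">
import numpy as np

def local_logit_point(x0, x, y, frac=0.5, iters=20):
    """Local linear logistic regression evaluated at the target point x0.

    Maximizes a locally weighted Bernoulli log-likelihood with a linear
    model for theta(x) = logit(mu(x)) near x0, using Newton iterations
    (iteratively reweighted least squares).  The tricube neighbourhood
    weights are an assumption borrowed from standard LOESS.
    """
    n = len(x)
    k = max(int(np.ceil(frac * n)), 3)
    dist = np.abs(x - x0)
    idx = np.argsort(dist)[:k]                       # k nearest neighbours of x0
    h = dist[idx].max()
    w = (1.0 - (dist[idx] / h) ** 3) ** 3            # tricube smoothing weights
    X = np.column_stack([np.ones(k), x[idx] - x0])   # local linear design matrix
    yk = y[idx].astype(float)

    beta = np.zeros(2)                               # start at theta = 0, i.e. mu = 1/2
    for _ in range(iters):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))              # current success probabilities
        # Newton / IRLS step for the weighted Bernoulli log-likelihood
        grad = X.T @ (w * (yk - mu))
        hess = X.T @ (X * (w * mu * (1.0 - mu))[:, None])
        step = np.linalg.solve(hess, grad)
        beta += step
        if np.max(np.abs(step)) < 1e-8:
            break
    theta0 = beta[0]                                 # estimate of theta(x0)
    return 1.0 / (1.0 + np.exp(-theta0))             # back-transform to mu(x0)

# Binary responses whose success probability rises with x
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-3.0, 3.0, 300))
p_true = 1.0 / (1.0 + np.exp(-2.0 * x))
y = rng.binomial(1, p_true)
p_hat = np.array([local_logit_point(t, x, y, frac=0.4) for t in x])
</syntaxhighlight>

Repeating the computation over a grid of target points traces out the estimated probability curve.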
==External links==
*[https://stat.ethz.ch/R-manual/R-devel/library/stats/html/loess.html R: Local Polynomial Regression Fitting] The loess function in [[R (programming language)|R]]
*[https://stat.ethz.ch/R-manual/R-devel/library/stats/html/lowess.html R: Scatter Plot Smoothing] The Lowess function in [[R (programming language)|R]]
*[https://stat.ethz.ch/R-manual/R-devel/library/stats/html/supsmu.html The supsmu function (Friedman's SuperSmoother) in R]
*[http://www.r-statistics.com/2010/04/quantile-loess-combining-a-moving-quantile-window-with-loess-r-function/ Quantile LOESS] – A method to perform Local regression on a '''Quantile''' moving window (with R code)
*[http://fivethirtyeight.blogs.nytimes.com/2013/03/26/how-opinion-on-same-sex-marriage-is-changing-and-what-it-means/?hp Nate Silver, How Opinion on Same-Sex Marriage Is Changing, and What It Means] – sample of LOESS versus linear regression