Nonlinear regression

where <math>\beta_1</math> is the parameter <math>V_\max</math>, <math>\beta_2</math> is the parameter <math>K_m</math>, and [''S''] is the independent variable, ''x''. This function is nonlinear because it cannot be expressed as a [[linear combination]] of the two ''<math>\beta</math>''s.
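
For illustration, the model can be written as a short function; a minimal sketch in Python (the function and variable names are chosen here for readability and are not part of the article):

<syntaxhighlight lang="python">
# Michaelis–Menten rate law: v = Vmax * [S] / (Km + [S]).
# The model is nonlinear in the parameters Vmax (beta_1) and Km (beta_2):
# no linear combination of the two betas reproduces this dependence on [S].
def michaelis_menten(s, v_max, k_m):
    """Reaction rate as a function of the substrate concentration s = [S]."""
    return v_max * s / (k_m + s)
</syntaxhighlight>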
 
Other examples of nonlinear functions include [[exponential function]]s, [[Logarithmic growth|logarithmic functions]], [[trigonometric functions]], [[Exponentiation|power functions]], [[Gaussian function]]s, and [[Lorenz curve]]s. Some functions, such as the exponential or logarithmic functions, can be transformed so that they are linear. When so transformed, standard linear regression can be performed but must be applied with caution. See [[#LinearizationTransformation|Linearization § Transformation]], below, for more details.
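
As a sketch of such a transformation (the exponential model, synthetic data, and library calls below are illustrative assumptions, not taken from the article):

<syntaxhighlight lang="python">
import numpy as np

# Example: exponential model y = a * exp(b * x).
# Taking logarithms gives ln(y) = ln(a) + b*x, which is linear in b and ln(a),
# so ordinary least squares can be applied to the pairs (x, ln(y)).
rng = np.random.default_rng(0)
x = np.linspace(0.1, 5.0, 50)
y = 2.0 * np.exp(0.7 * x) * rng.lognormal(sigma=0.05, size=x.size)  # synthetic data

b_hat, log_a_hat = np.polyfit(x, np.log(y), 1)  # slope and intercept of the linearized fit
a_hat = np.exp(log_a_hat)

# Caution: the transformation also transforms the errors, so this fit minimizes
# squared residuals in ln(y) rather than in y, and in general gives different
# estimates than a direct nonlinear least-squares fit of the untransformed model.
</syntaxhighlight>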
 
In general, there is no closed-form expression for the best-fitting parameters, as there is in [[linear regression]]. Usually, numerical [[Optimization (mathematics)|optimization]] algorithms are applied to determine the best-fitting parameters. Again in contrast to linear regression, there may be many [[local minimum|local minima]] of the function to be optimized, and even the global minimum may produce a [[Bias of an estimator|biased]] estimate. In practice, [[guess value|estimated values]] of the parameters are used, in conjunction with the optimization algorithm, to attempt to find the global minimum of a sum of squares.
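
For example, a nonlinear least-squares fit with an iterative optimizer and explicit starting (guess) values might look like the following sketch (the data and starting values are invented for illustration; SciPy's <code>curve_fit</code> is just one of many possible optimizers):

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, v_max, k_m):
    return v_max * s / (k_m + s)

# Synthetic substrate concentrations and observed rates, for illustration only.
s = np.array([0.02, 0.06, 0.11, 0.22, 0.56, 1.10])
rate = np.array([0.05, 0.12, 0.17, 0.25, 0.32, 0.35])

# Starting (guess) values for Vmax and Km.  The optimizer iterates from this
# point, so a poor initial guess can converge to a local minimum of the sum of
# squares rather than the global one.
p0 = [0.4, 0.2]
popt, pcov = curve_fit(michaelis_menten, s, rate, p0=p0)
v_max_hat, k_m_hat = popt
</syntaxhighlight>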