==General==
In nonlinear regression, a [[statistical model]] of the form,
:<math> y \sim f(\mathbf{x}, \boldsymbol\beta)</math>
relates a vector of [[independent variables]], '''x''', and their associated observed dependent variables, '''y'''. The function ''f'' is nonlinear in the components of the vector of parameters ''β'', but otherwise arbitrary. For example, the [[Michaelis–Menten]] model for enzyme kinetics has two parameters and one independent variable, related by ''f'':{{efn|This model can also be expressed in the conventional biological notation:
:<math> v = \frac{V_\max\ [\mbox{S}]}{K_m + [\mbox{S}]} </math>
}}
:<math> f(x,\boldsymbol\beta)= \frac{\beta_1 x}{\beta_2 + x} </math>
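For a concrete numerical illustration (not part of the article's example), the two Michaelis–Menten parameters can be estimated by nonlinear least squares. The following minimal sketch assumes SciPy's <code>curve_fit</code>; the data values are synthetic and purely illustrative.

<syntaxhighlight lang="python">
# Minimal sketch: fit f(x, beta) = beta1*x / (beta2 + x) by nonlinear least squares.
# The substrate concentrations and rates below are made-up, illustrative numbers.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(x, beta1, beta2):
    # beta1 plays the role of V_max, beta2 the role of K_m
    return beta1 * x / (beta2 + x)

x = np.array([0.02, 0.06, 0.11, 0.22, 0.56, 1.10, 2.20, 3.40])
y = np.array([0.05, 0.13, 0.16, 0.25, 0.33, 0.38, 0.42, 0.43])

# p0 gives rough starting values for the iterative least-squares solver
beta_hat, cov = curve_fit(michaelis_menten, x, y, p0=[0.5, 0.5])
print("estimated beta:", beta_hat)
</syntaxhighlight>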
[[Systematic error]] may be present in the independent variables, but its treatment is outside the scope of regression analysis. If the independent variables are measured with error, this is an [[errors-in-variables model]], which is also outside this scope.
Other examples of nonlinear functions include [[exponential function]]s, [[Logarithmic growth|logarithmic functions]], [[trigonometric functions]], [[Exponentiation|power functions]], [[Gaussian function]]s, and [[Lorenz curve]]s. Some functions, such as the exponential or logarithmic functions, can be transformed so that they are linear. When so transformed, standard linear regression can be performed but must be applied with caution. See [[#Transformation|Linearization§Transformation]], below, for more details.
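As a rough sketch of that caution (the exponential model, noise structure, and parameter values below are assumptions for illustration, not taken from the article), log-transforming ''y'' = ''a''e<sup>''bx''</sup> turns the fit into ordinary linear regression but implicitly assumes multiplicative errors, whereas direct nonlinear least squares fits on the original scale:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
x = np.linspace(0.1, 5.0, 40)
# Synthetic data: y = a*exp(b*x) with multiplicative (log-normal) noise,
# so all values stay positive and the log transform is well defined.
y = 2.0 * np.exp(0.7 * x) * rng.lognormal(sigma=0.1, size=x.size)

# Linearization: log y = log a + b*x, fitted by ordinary least squares.
# This implicitly assumes multiplicative errors and reweights the observations.
b_lin, log_a = np.polyfit(x, np.log(y), 1)

# Direct nonlinear least squares on the original scale.
(a_nl, b_nl), _ = curve_fit(lambda t, a, b: a * np.exp(b * t), x, y, p0=[1.0, 0.5])

print("linearized fit: a =", np.exp(log_a), " b =", b_lin)
print("nonlinear fit:  a =", a_nl, " b =", b_nl)
</syntaxhighlight>

With the multiplicative noise used here the two fits agree closely; with additive noise on the original scale, the log-transformed fit can be noticeably biased, which is the reason for the caution above.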