The algorithm was first published in 1944 by [[Kenneth Levenberg]],<ref name="Levenberg"/> while working at the [[Frankford Arsenal|Frankford Army Arsenal]]. It was rediscovered in 1963 by [[Donald Marquardt]],<ref name="Marquardt"/> who worked as a [[statistician]] at [[DuPont]], and independently by Girard,<ref name="Girard"/> Wynne<ref name="Wynne"/> and Morrison.<ref name="Morrison"/>
The LMA is used in many software applications for solving generic curve-fitting problems. By using the [[Gauss–Newton algorithm]] it often converges faster than first-order methods.<ref>{{cite journal|title=Improved Computation for Levenberg–Marquardt Training|last1=Wilamowski|first1=Bogdan|last2=Yu|first2=Hao|journal=IEEE Transactions on Neural Networks and Learning Systems|volume=21|issue=6|date=June 2010|url=https://www.eng.auburn.edu/~wilambm/pap/2010/Improved%20Computation%20for%20LM%20Training.pdf}}</ref> However, like other iterative optimization algorithms, the LMA finds only a [[local minimum]], which is not necessarily the [[global minimum]].
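For instance, SciPy exposes a MINPACK implementation of the algorithm through <code>scipy.optimize.curve_fit</code>; the exponential model and the synthetic data in this sketch are made-up illustrations, not drawn from this article:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model: exponential decay with amplitude a and rate b.
def model(x, a, b):
    return a * np.exp(-b * x)

# Synthetic noisy observations (illustrative only).
rng = np.random.default_rng(0)
xdata = np.linspace(0.0, 4.0, 50)
ydata = model(xdata, 2.5, 1.3) + 0.05 * rng.normal(size=xdata.size)

# method="lm" selects the Levenberg–Marquardt solver from MINPACK.
popt, pcov = curve_fit(model, xdata, ydata, p0=[1.0, 1.0], method="lm")
</syntaxhighlight>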
== The problem ==
== The solution ==
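Setting the gradient of the linearized sum of squared residuals with respect to the increment {{tmath|\boldsymbol\delta}} to zero yields

<math>\left (\mathbf J^{\mathsf T} \mathbf J\right ) \boldsymbol\delta = \mathbf J^{\mathsf T} \left [\mathbf y - \mathbf f\left (\boldsymbol\beta\right )\right ],</math>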
where <math>\mathbf J</math> is the [[Jacobian matrix and determinant|Jacobian matrix]], whose {{tmath|i}}-th row equals <math>\mathbf J_i</math>, and where <math>\mathbf f\left (\boldsymbol\beta\right )</math> and <math>\mathbf y</math> are vectors with {{tmath|i}}-th component
<math>f\left (x_i, \boldsymbol\beta\right )</math> and <math>y_i</math> respectively. The above expression obtained for {{tmath|\boldsymbol\beta}} comes under the [[Gauss–Newton algorithm|Gauss–Newton method]].
Levenberg's contribution is to replace this equation by a "damped version":
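<math>\left (\mathbf J^{\mathsf T} \mathbf J + \lambda \mathbf I\right ) \boldsymbol\delta = \mathbf J^{\mathsf T} \left [\mathbf y - \mathbf f\left (\boldsymbol\beta\right )\right ],</math>

where <math>\mathbf I</math> is the identity matrix and {{tmath|\lambda}} is a (non-negative) damping factor adjusted at each iteration: smaller values give steps closer to the Gauss–Newton direction, while larger values push the update towards [[gradient descent]].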
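As a sketch, the damped update can be written in a few lines of Python with NumPy; the function names, calling convention, and fixed damping factor below are illustrative assumptions rather than a reference implementation:

<syntaxhighlight lang="python">
import numpy as np

def lm_step(f, jac, beta, y, lam):
    """One damped step: solve (JᵀJ + λI) δ = Jᵀ (y − f(β)) for δ."""
    r = y - f(beta)          # residual vector y − f(β)
    J = jac(beta)            # Jacobian of the model at β
    A = J.T @ J + lam * np.eye(beta.size)
    delta = np.linalg.solve(A, J.T @ r)
    return beta + delta
</syntaxhighlight>

A complete implementation would also adapt {{tmath|\lambda}} between iterations, increasing it after a step that fails to reduce the sum of squared residuals and decreasing it after a successful step.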