{{short description|Algorithm used to solve non-linear least squares problems}}
In [[mathematics]] and computing, the '''Levenberg–Marquardt algorithm''' ('''LMA''' or just '''LM'''), also known as the '''damped least-squares''' ('''DLS''') method, is used to solve [[non-linear least squares]] problems. These minimization problems arise especially in [[least squares]] [[curve fitting]]. When applied to the training of [[neural network|artificial neural networks]], the Levenberg–Marquardt algorithm often converges faster than first-order [[backpropagation]] methods.
The LMA is used in many software applications for solving generic curve-fitting problems. However, as with many fitting algorithms, the LMA finds only a [[local minimum]], which is not necessarily the [[global minimum]]. The LMA interpolates between the [[Gauss–Newton algorithm]] (GNA) and the method of [[gradient descent]]. The LMA is more [[Robustness (computer science)|robust]] than the GNA, meaning that in many cases it finds a solution even if it starts very far from the final minimum. For well-behaved functions and reasonable starting parameters, however, the LMA tends to be slower than the GNA. The LMA can also be viewed as [[Gauss–Newton]] with a [[trust region]] approach.
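The interpolation between the two methods is controlled by a damping parameter ''λ'': large values push each update toward gradient descent, small values toward Gauss–Newton. A minimal sketch in [[Python (programming language)|Python]] may make this concrete; the factor-of-ten schedule for ''λ'', the use of Marquardt's diagonal scaling of <code>JᵀJ</code>, and the model in the usage example are illustrative assumptions, not a fixed specification.

<syntaxhighlight lang="python">
import numpy as np

def levenberg_marquardt(residual, jac, beta0, lam=1e-3, max_iter=200, tol=1e-10):
    """Sketch of LM: return beta minimizing 0.5 * ||residual(beta)||^2.

    residual(beta) -> r, shape (m,);  jac(beta) -> J, shape (m, n).
    lam is the damping parameter: large lam ~ gradient descent,
    small lam ~ Gauss-Newton.  (Heuristic update schedule assumed.)
    """
    beta = np.asarray(beta0, dtype=float)
    cost = 0.5 * residual(beta) @ residual(beta)
    for _ in range(max_iter):
        r, J = residual(beta), jac(beta)
        A, g = J.T @ J, J.T @ r
        # Damped normal equations, with Marquardt's diagonal scaling:
        # (J^T J + lam * diag(J^T J)) step = -J^T r
        step = np.linalg.solve(A + lam * np.diag(np.diag(A)), -g)
        trial = beta + step
        trial_cost = 0.5 * residual(trial) @ residual(trial)
        if trial_cost < cost:
            beta, cost = trial, trial_cost
            lam *= 0.1          # step accepted: behave more like Gauss-Newton
            if np.linalg.norm(step) < tol:
                break
        else:
            lam *= 10.0         # step rejected: behave more like gradient descent
    return beta

# Illustrative usage: fit y = b0 * exp(b1 * x) to synthetic data.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(1.5 * x)
residual = lambda b: b[0] * np.exp(b[1] * x) - y
jac = lambda b: np.column_stack([np.exp(b[1] * x),
                                 b[0] * x * np.exp(b[1] * x)])
print(levenberg_marquardt(residual, jac, [1.0, 1.0]))  # approx. [2.0, 1.5]
</syntaxhighlight>

In practice, mature implementations such as MINPACK's routines (exposed in SciPy via <code>scipy.optimize.least_squares(fun, x0, method="lm")</code>) add more careful step-acceptance and scaling logic and are generally preferable to a hand-rolled loop like the one above.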