Levenberg–Marquardt algorithm: Difference between revisions

== The solution ==
 
Like other numerical minimization algorithms, the Levenberg–Marquardt algorithm is an [[iteration|iterative]] procedure. To start a minimization, the user has to provide an initial guess for the parameter vector '''p'''. In many cases, an uninformed standard guess like '''p'''<sup>T</sup>=(1,1,...,1) will work fine; in other cases, the algorithm converges only if the initial guess is already somewhat close to the final solution.

In each iteration step, the parameter vector '''p''' is replaced by a new estimate '''p'''+'''q'''. To determine '''q''', the functions ''f''<sub>''i''</sub>('''p'''+'''q''') are approximated by their linearizations, which in vector notation read
:<math>\vec{f}(\vec{p}+\vec{q}) \approx \vec{f}(\vec{p}) + J\vec{q},</math>
where <math>J</math> is the [[Jacobian]] of <math>\vec{f}</math> at <math>\vec{p}</math>.
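
The following sketch illustrates how the residual vector <math>\vec{f}</math> and its Jacobian <math>J</math> might be evaluated numerically; the exponential model, the data arrays <code>x</code> and <code>y</code>, and the finite-difference step size are illustrative assumptions, not part of the algorithm itself.

<syntaxhighlight lang="python">
import numpy as np

def residuals(p, x, y):
    """Residual vector f(p); an exponential model is assumed purely for illustration."""
    return y - p[0] * np.exp(p[1] * x)

def jacobian(p, x, y, h=1e-7):
    """Forward-difference approximation of the Jacobian J of f at p (one column per parameter)."""
    f0 = residuals(p, x, y)
    J = np.empty((f0.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = h
        J[:, j] = (residuals(p + dp, x, y) - f0) / h
    return J
</syntaxhighlight>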
 
The sum of squares ''S'' becomes minimal when its gradient with respect to '''q''' is zero. Substituting the above linearization into ''S'' and setting the derivative to zero leads to the equation
:<math>(J^{T}J)\vec{q} = -J^{T}\vec{f},</math>
from which '''q''' can be obtained by inverting <math>J^{T}J</math>. The LMA instead solves a damped version of this equation,
:<math>(J^{T}J + \lambda I)\vec{q} = -J^{T}\vec{f},</math>
where <math>I</math> is the [[identity matrix]] and <math>\lambda</math> is the damping factor, which is redetermined at every iteration. The solution <math>\vec{q}</math> is the step taken in that [[iteration]].
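
Putting the pieces together, one damped iteration of the update above might look as follows; the helper functions are the hypothetical ones from the previous sketch, and the rule of multiplying or dividing <math>\lambda</math> by 10 is only one common heuristic for redetermining the damping factor.

<syntaxhighlight lang="python">
def lm_step(p, x, y, lam):
    """One Levenberg-Marquardt step: solve (J^T J + lam*I) q = -J^T f, keep q only if S decreases."""
    f = residuals(p, x, y)
    J = jacobian(p, x, y)
    A = J.T @ J + lam * np.eye(p.size)   # damped normal-equation matrix
    q = np.linalg.solve(A, -J.T @ f)     # step q for this iteration
    p_new = p + q
    if np.sum(residuals(p_new, x, y) ** 2) < np.sum(f ** 2):
        return p_new, lam / 10.0         # step reduced S: keep it and soften the damping
    return p, lam * 10.0                 # step failed: discard it and increase the damping
</syntaxhighlight>

Iterating <code>lm_step</code> from an uninformed initial guess such as <code>p = np.ones(2)</code>, with some starting value for <code>lam</code>, follows the procedure described in this section; accepting or rejecting the step based on whether ''S'' decreases is one of several possible ways to adjust the damping factor.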
 
== External links ==