The '''Frank–Wolfe algorithm''' is a simple [[iterative method|iterative]] [[First-order approximation|first-order]] [[Mathematical optimization|optimization]] [[algorithm]] for [[constrained optimization|constrained]] [[convex optimization]]. Also known as the '''conditional gradient method''',<ref>{{Cite doi|10.1016/0041-5553(66)90114-5|noedit}}</ref> the '''reduced gradient algorithm''' and the '''convex combination algorithm''', the method was originally proposed by [[Marguerite Frank]] and [[Philip Wolfe (mathematician)|Philip Wolfe]] in 1956.<ref>{{cite doi|10.1002/nav.3800030109|noedit}}</ref> In each iteration, the Frank–Wolfe algorithm considers a [[linear approximation]] of the objective function, and moves slightly towards a minimizer of this linear function (taken over the same ___domain).
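A minimal sketch of such an iteration is shown below. It is illustrative only and not drawn from the cited sources: the feasible region is assumed to be the probability simplex, the objective a simple quadratic, and the step size the common 2/(''k''+2) rule; the names <code>frank_wolfe</code> and <code>simplex_lmo</code> are hypothetical.

<syntaxhighlight lang="python">
# Illustrative sketch of a conditional-gradient (Frank-Wolfe) loop.
# Assumptions (not from the article): feasible region = probability simplex,
# objective f(x) = 0.5 * ||x - b||^2, step size gamma_k = 2 / (k + 2).
import numpy as np

def frank_wolfe(grad, linear_minimizer, x0, iterations=100):
    """Generic conditional-gradient loop.

    grad             -- callable returning the gradient of f at x
    linear_minimizer -- callable solving the linear subproblem
                        argmin_{s in D} <grad, s> over the feasible region D
    x0               -- feasible starting point
    """
    x = x0
    for k in range(iterations):
        g = grad(x)
        s = linear_minimizer(g)      # minimizer of the linear approximation over D
        gamma = 2.0 / (k + 2.0)      # standard diminishing step size
        x = x + gamma * (s - x)      # move slightly towards that minimizer
    return x

# Example: minimize 0.5*||x - b||^2 over the probability simplex.
b = np.array([0.2, 0.5, 0.9])
grad = lambda x: x - b

def simplex_lmo(g):
    # Over the simplex, the linear subproblem is solved at a vertex:
    # put all mass on the coordinate with the smallest gradient entry.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x_star = frank_wolfe(grad, simplex_lmo, x0=np.array([1.0, 0.0, 0.0]))
print(x_star)
</syntaxhighlight>

Note how the linear subproblem is solved over the original feasible region rather than a local ball, which is what distinguishes the method from projected gradient descent.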
==Problem statement==