In [[optimization (mathematics)|optimization]], a '''gradient method''' is an [[algorithm]] to solve problems of the form
:<math>\min_{x\in\mathbb R^n}\; f(x)</math>
with the search directions defined by the [[gradient]] of the function at the current point. Examples of gradient methods are the [[gradient descent]] and the [[conjugate gradient method|conjugate gradient]].
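A minimal sketch of the simplest such method, gradient descent with a fixed step size, is given below. The step size, tolerance, iteration limit, and example function are illustrative assumptions, not part of any particular standard formulation:

<syntaxhighlight lang="python">
import numpy as np

def gradient_descent(grad_f, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Illustrative sketch: minimize f by stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)                # search direction is the negative gradient
        if np.linalg.norm(g) < tol:  # stop once the gradient is nearly zero
            break
        x = x - step * g             # move downhill with a fixed step size
    return x

# Example: minimize f(x) = ||x||^2, whose gradient is 2x; the minimizer is 0.
x_min = gradient_descent(lambda x: 2 * x, x0=[3.0, -4.0])
</syntaxhighlight>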
==See also==
{{div col}}
* [[Gradient descent method]]
* [[Coordinate descent]]
* [[Frank–Wolfe algorithm]]
* [[Landweber iteration]]
* [[Random coordinate descent]]
* [[Conjugate gradient method]]
* [[Derivation of the conjugate gradient method]]
* [[Nonlinear conjugate gradient method]]
* [[Biconjugate gradient method]]
* [[Biconjugate gradient stabilized method]]
{{div col end}}
==References==
* {{cite book |author=Elijah Polak |year=1997 |title=Optimization: Algorithms and Consistent Approximations |publisher=Springer-Verlag |isbn=0-387-94971-2}}
{{Optimization algorithms}}
[[Category:Gradient methods| ]]
{{linear-algebra-stub}}