Gradient method

In [[optimization (mathematics)|optimization]], a '''gradient method''' is an [[algorithm]] to solve problems of the form
 
:<math>\min_{x\in\mathbb R^n}\; f(x)</math>
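The iteration underlying the simplest gradient method can be sketched as follows. This is an illustrative implementation, not taken from the article: the function name, the quadratic test objective, and the fixed step size are all assumptions chosen for the example.

```python
import numpy as np

def gradient_method(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Minimize f over R^n by stepping along the negative gradient.

    grad: callable returning the gradient of f at a point.
    x0:   starting point.
    A fixed step size is assumed here; practical methods often
    choose it by a line search instead.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stationary point reached
            break
        x = x - step * g  # move in the direction of steepest descent
    return x

# Example: f(x) = (x_0 - 3)^2 + (x_1 + 1)^2 has gradient 2*(x - [3, -1])
# and unique minimizer [3, -1].
x_min = gradient_method(lambda x: 2 * (x - np.array([3.0, -1.0])), [0.0, 0.0])
```

For this strongly convex quadratic the iterates contract toward the minimizer at a fixed rate, so the loop terminates well before `max_iter`.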
 
==See also==
{{div col|colwidth=22em}}
 
* [[Gradient descent]]
* [[Stochastic gradient descent]]
* [[Coordinate descent]]
* [[Frank–Wolfe algorithm]]
* [[Landweber iteration]]
* [[Random coordinate descent]]
* [[Conjugate gradient method]]
* [[Derivation of the conjugate gradient method]]
* [[Nonlinear conjugate gradient method]]
* [[Biconjugate gradient method]]
* [[Biconjugate gradient stabilized method]]
{{div col end}}
 
==References==
* {{cite book | year=1997 | title=Optimization : Algorithms and Consistent Approximations
| publisher=Springer-Verlag | isbn=0-387-94971-2 |author=Elijah Polak}}
 
{{Optimization algorithms}}
 
{{DEFAULTSORT:Gradient Method}}
[[Category:First order methods]]
[[Category:Optimization algorithms and methods]]
[[Category:Numerical linear algebra]]
[[Category:Gradient methods| ]]
 
{{linear-algebra-stub}}
[[fr:Algorithme du gradient]]
[[ja:勾配法]]