Gradient method

In [[optimization (mathematics)|optimization]], a '''gradient method''' is an [[algorithm]] to solve problems of the form
 
:<math>\min_{x\in\mathbb R^n}\; f(x)</math>
 
with the search directions defined by the [[gradient]] of the function at the current point. Examples of gradient methods are [[gradient descent]] and the [[conjugate gradient]] method.
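
For a differentiable <math>f</math>, such methods generate iterates

:<math>x^{(k+1)} = x^{(k)} + \gamma_k\, d^{(k)},</math>

where <math>\gamma_k > 0</math> is a step size and the search direction <math>d^{(k)}</math> is constructed from the gradient <math>\nabla f(x^{(k)})</math>; the simplest choice, <math>d^{(k)} = -\nabla f(x^{(k)})</math>, yields gradient descent. The Python sketch below illustrates this simplest variant with a fixed step size; the example function, step size, and tolerance are illustrative choices only, and practical implementations typically select the step size by a [[line search]].

<syntaxhighlight lang="python">
import numpy as np

def gradient_method(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Iterate x <- x - step * grad(x) until the gradient is small.

    grad -- callable returning the gradient of f at a point
    x0   -- starting point
    step -- fixed step size (an illustrative choice, not a general one)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient (nearly) zero: stationary point
            break
        x = x - step * g             # step along the steepest-descent direction
    return x

# Example: f(x) = x1**2 + 4*x2**2 has gradient (2*x1, 8*x2) and minimizer (0, 0).
x_star = gradient_method(lambda x: np.array([2.0 * x[0], 8.0 * x[1]]), [3.0, -2.0])
</syntaxhighlight>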
 
==See also==
{{col-begin}}
{{col-break}}
* [[Gradient descent]]
* [[Conjugate gradient]]
{{col-end}}
 
==References==
* {{cite book |author=Elijah Polak |title=Optimization: Algorithms and Consistent Approximations |publisher=Springer-Verlag |year=1997 |isbn=0-387-94971-2}}
 
{{Optimization algorithms}}
 
{{DEFAULTSORT:Gradient Method}}
[[Category:First order methods]]
[[Category:Optimization methods]]
[[Category:Gradient methods]]
 
[[fr:Algorithme du gradient]]
[[ja:勾配法]]