Let <math>f:\mathbb{R}^n \to \mathbb{R}</math> be a [[convex function]] with domain <math>\mathbb{R}^n</math>. A classical subgradient method iterates
:<math>x^{(k+1)} = x^{(k)} - \alpha_k g^{(k)} \ </math>
where <math>g^{(k)}</math> denotes ''any'' [[subgradient]] of <math>f \ </math> at <math>x^{(k)} \ </math>, and <math>\alpha_k > 0 \ </math> is the step size of the <math>k</math>-th iteration.
It may happen that <math>-g^{(k)}</math> is not a descent direction for <math>f \ </math> at <math>x^{(k)}</math>. We therefore keep a running record <math>f_{\rm{best}} \ </math> of the lowest objective function value found so far, i.e.
:<math>f_{\rm{best}}^{(k)} = \min\{f_{\rm{best}}^{(k-1)} , f(x^{(k)}) \}.</math>
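The following is a minimal sketch of this iteration, assuming the illustrative objective <math>f(x) = \|x\|_1</math> (whose sign vector is a valid subgradient), a diminishing step size <math>\alpha_k = 1/(k+1)</math>, and an arbitrary starting point; these particular choices are not specified above and serve only to show the update and the running best value.
<syntaxhighlight lang="python">
import numpy as np

def f(x):
    return np.sum(np.abs(x))       # convex, non-differentiable at 0

def subgradient(x):
    return np.sign(x)              # one valid subgradient of ||x||_1

x = np.array([2.0, -3.0, 1.5])     # arbitrary starting point x^(0)
f_best = f(x)                      # lowest objective value found so far

for k in range(1000):
    g = subgradient(x)             # g^(k): any subgradient of f at x^(k)
    alpha = 1.0 / (k + 1)          # step size alpha_k (diminishing rule, assumed)
    x = x - alpha * g              # x^(k+1) = x^(k) - alpha_k * g^(k)
    f_best = min(f_best, f(x))     # f_best^(k) = min(f_best^(k-1), f(x^(k)))

print(f_best)                      # approaches the optimal value f(0) = 0
</syntaxhighlight>
Because a subgradient step need not decrease the objective, the reported solution is the best iterate rather than the last one.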