Subgradient method: Difference between revisions

===Convergence results===
 
For constant step size and constant step length, the subgradient algorithm is guaranteed to converge to within some range of the optimal value, provided the subgradients of <math>f</math> are bounded in norm by a constant <math>G</math> (equivalently, <math>f</math> is Lipschitz continuous with constant <math>G</math>), i.e.,{{cn}}
:<math>\lim_{k\to\infty} f_{\text{best}}^{(k)} - f^* \le \epsilon,</math>
where <math>\epsilon = G^2\alpha/2</math> for constant step size <math>\alpha</math>, and <math>\epsilon = G\gamma/2</math> for constant step length <math>\gamma</math>. The method need not converge to the optimal value itself: minimizing <math>f(x) = |x|</math> from <math>x^{(0)} = 0.5</math> with step size <math>\alpha = 1</math> produces iterates that oscillate between <math>0.5</math> and <math>-0.5</math> forever, so the gap <math>f_{\text{best}}^{(k)} - f^*</math> stays at exactly <math>\epsilon = G^2\alpha/2 = 1/2</math>.
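The behavior above can be sketched in a few lines of Python. This is an illustrative toy, not from the article: the function name, starting point, and step sizes are chosen here for demonstration, and <math>f(x)=|x|</math> has subgradient <math>\sgn(x)</math> (taking <math>1</math> at <math>x=0</math>).

```python
def subgradient_best(x0, alpha, iters):
    """Subgradient method with constant step size alpha on f(x) = |x|.

    Returns the best (smallest) objective value seen, f_best^(k).
    Illustrative sketch; names and parameters are this example's choices.
    """
    x = x0
    f_best = abs(x)
    for _ in range(iters):
        g = 1.0 if x >= 0 else -1.0   # a subgradient of |x| at x
        x = x - alpha * g             # constant-step-size update
        f_best = min(f_best, abs(x))  # track the best iterate so far
    return f_best

# Step size 1 from x0 = 0.5: iterates oscillate between 0.5 and -0.5,
# so f_best stays at 0.5 = G^2*alpha/2 (here G = 1), never reaching f* = 0.
print(subgradient_best(0.5, 1.0, 100))   # 0.5
# A smaller constant step shrinks the range G^2*alpha/2:
print(subgradient_best(0.5, 0.1, 100))   # well below 0.05
```

Shrinking <math>\alpha</math> tightens the range but slows progress, which is why diminishing step-size rules are used when convergence to <math>f^*</math> itself is required.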