For constant step-length and scaled subgradients having [[Euclidean norm]] equal to one, the subgradient method converges to an arbitrarily close approximation to the minimum value, that is
:<math>\lim_{k\to\infty} f_{\rm{best}}^{(k)} - f^* <\epsilon</math>
for any <math>\epsilon > 0</math>, provided the constant step-length is chosen sufficiently small, by a result of Shor.<ref>
The approximate convergence of the constant step-size (scaled) subgradient method is stated as Exercise 6.3.14(a) in [[Dimitri P. Bertsekas|Bertsekas]] (page 636): {{cite book
| last = Bertsekas
| first = Dimitri P.
| title = Nonlinear Programming
| edition = Second
| publisher = Athena Scientific
| year = 1999
| ___location = Cambridge, MA.
| isbn = 1-886529-00-0
}} On page 636, Bertsekas attributes this result to [[Naum Z. Shor|Shor]]: {{cite book
| last = Shor
| first = Naum Z.
|