===Convergence results===
For constant step-size and scaled subgradients having Euclidean norm equal to one, the subgradient method converges to an arbitrarily close approximation to the minimum value, that is
:<math>\lim_{k\to\infty} f_{\rm{best}}^{(k)} - f^* < \epsilon</math>
by a result of Shor.<ref>
The approximate convergence of the constant step-size (scaled) subgradient method is stated as Exercise 6.3.14(a) in Bertsekas (page 636): {{cite book
| last = Bertsekas
| first = Dimitri P.
| authorlink = Dimitri P. Bertsekas
| title = Nonlinear Programming
| edition = Second
| publisher = Athena Scientific
| date = 1999
 | location = Cambridge, MA.
| isbn = 1-886529-00-0
}} On page 636, Bertsekas attributes this result to Shor: {{cite book
| last = Shor
| first = Naum Z.
| authorlink = Naum Z. Shor
| title = Minimization Methods for Non-differentiable Functions
| publisher = [[Springer-Verlag]]
| isbn = 0-387-12763-1
| date = 1985
}}
</ref>
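The approximate (rather than exact) nature of this convergence can be illustrated numerically. The following sketch, not taken from the cited sources, applies the constant step-size subgradient method to <math>f(x) = |x|</math>: with a large step the iterates oscillate and never reach the minimum, while with a smaller step the best value found falls below a step-dependent threshold but still only reaches an <math>\epsilon</math>-neighborhood of <math>f^* = 0</math>.

```python
def subgradient_method(x0, step, iters):
    """Constant step-size subgradient method for f(x) = |x|.

    The subgradient sign(x) already has unit Euclidean norm, so this
    is the scaled subgradient method with step length `step`.
    Returns the best objective value f_best^(k) found.
    """
    x = x0
    f_best = abs(x)
    for _ in range(iters):
        g = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)  # a subgradient of |x|
        x = x - step * g
        f_best = min(f_best, abs(x))
    return f_best

# Step too large: iterates oscillate between 0.5 and -0.5,
# so the best value never improves on the starting point.
print(subgradient_method(0.5, 1.0, 100))   # 0.5

# Smaller step: f_best drops below step/2 = 0.15 but not to 0 --
# convergence only to an epsilon-neighborhood of the minimum.
print(subgradient_method(0.5, 0.3, 100))
```

The first call shows non-convergence with a constant step (minimizing <math>|x|</math> from 0.5 with step 1 oscillates forever); shrinking the step shrinks the achievable <math>\epsilon</math>, at the cost of slower progress.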
==Constrained optimization==