*Square summable but not summable, i.e. any step sizes satisfying
:<math>\alpha_k\geq0,\qquad\sum_{k=1}^\infty \alpha_k^2 < \infty,\qquad \sum_{k=1}^\infty \alpha_k = \infty</math>.
*Nonsummable diminishing, i.e. any step sizes satisfying
:<math>\alpha_k\geq0,\qquad \lim_{k\to\infty} \alpha_k = 0,\qquad \sum_{k=1}^\infty \alpha_k = \infty</math>.
*Nonsummable diminishing step lengths, i.e. <math>\alpha_k = \gamma_k/\lVert g^{(k)} \rVert_2</math>, where
:<math>\gamma_k\geq0,\qquad \lim_{k\to\infty} \gamma_k = 0,\qquad \sum_{k=1}^\infty \gamma_k = \infty</math>.
Notice that the step sizes listed above are determined before the algorithm is run and do not depend on any data computed during the algorithm. This is very different from the step size rules found in standard descent methods, which depend on the current point and search direction.
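As a concrete illustration, such schedules can be generated in full before the first iterate is ever computed; the specific constants below (0.05, <math>1/k</math>, <math>0.1/\sqrt{k}</math>) are hypothetical choices for the sketch, not values from the article:

```python
# Step-size schedules computed entirely in advance -- none depends on
# data produced while the algorithm runs.  Constants are illustrative.
K = 1000

# Constant step size.
constant_step = [0.05] * K

# Square summable but not summable: sum a_k^2 < inf, sum a_k = inf.
square_summable = [1.0 / k for k in range(1, K + 1)]

# Nonsummable diminishing: a_k -> 0 and sum a_k = inf.
diminishing = [0.1 / k ** 0.5 for k in range(1, K + 1)]
```

A descent method with a line search, by contrast, cannot tabulate its step sizes this way, since each one depends on the current point and search direction.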
===Convergence results===
For constant step size and constant step length, the subgradient algorithm is guaranteed to converge to within some range of the optimal value, i.e.
:<math>\lim_{k\to\infty} f_{\rm best}^{(k)} - f^* < \epsilon,</math>
where <math>f_{\rm best}^{(k)}</math> denotes the best objective value found in the first <math>k</math> iterations.
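This behavior can be seen numerically on the nondifferentiable function <math>f(x) = |x|</math> (with <math>f^* = 0</math>): a constant step size stalls at objective values on the order of the step size, while a diminishing rule such as <math>\alpha_k = 1/k</math> keeps driving the best value toward 0. The starting point and schedules below are illustrative assumptions, not taken from the article.

```python
def subgrad_abs(steps, x0=2.037):
    """Run the subgradient method on f(x) = |x| with the given
    step-size schedule and return the best objective value seen
    (the method is not a descent method, so we track the best point)."""
    x, f_best = x0, abs(x0)
    for a in steps:
        g = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)  # a subgradient of |x|
        x -= a * g
        f_best = min(f_best, abs(x))
    return f_best

K = 5000
const_best = subgrad_abs([0.1] * K)                           # stalls near 0.1/2
dimin_best = subgrad_abs([1.0 / k for k in range(1, K + 1)])  # keeps improving
```

With the constant step the iterates eventually just oscillate across the optimum with amplitude 0.1, so `const_best` settles at a strictly positive value; `dimin_best` is orders of magnitude smaller.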