Subgradient method
'''Subgradient methods''' are algorithms for solving convex optimization problems that can be used with a non-differentiable objective function or...
 
For a constant step size and for a constant step length, the subgradient algorithm is guaranteed to converge to within some range of the optimal value; that is,
:<math>\lim_{k\to\infty} f_{\mathrm{best}}^{(k)} - f^* < \epsilon,</math>
where <math>\epsilon</math> depends on the step size.
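The iteration behind this guarantee can be sketched in a few lines. The following is a minimal illustration (not from the article) of a constant-step-size subgradient method applied to the non-differentiable convex function <math>f(x)=|x|</math>; the function names, the step size <code>alpha</code>, and the iteration count are arbitrary choices for the example:

```python
def subgradient_method(f, subgrad, x0, alpha=0.01, iters=1000):
    """Constant-step-size subgradient method (illustrative sketch)."""
    x = x0
    f_best = f(x)
    for _ in range(iters):
        g = subgrad(x)              # any subgradient of f at x
        x = x - alpha * g           # constant step size alpha
        f_best = min(f_best, f(x))  # track the best value found so far
    return f_best

# Example: f(x) = |x|, whose subgradient is sign(x) away from 0
# (any value in [-1, 1] is a valid subgradient at x = 0).
f = abs
subgrad = lambda x: 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
print(subgradient_method(f, subgrad, x0=5.0))
```

Because the step size is constant, the iterates eventually oscillate around the minimizer rather than converging to it exactly, so only <math>f_{\mathrm{best}}^{(k)}</math> is guaranteed to come within <math>\epsilon</math> of <math>f^*</math>.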
 
[[Category: Mathematical stubs]]