'''Subgradient methods''' are [[iterative method]]s for solving [[convex optimization|convex minimization]] problems. Originally developed by [[Naum Z. Shor]] and others in the 1960s and 1970s, subgradient methods are convergent even when applied to a non-differentiable objective function. When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of [[gradient descent|steepest descent]].
Subgradient methods are slower than Newton's method when applied to minimize twice continuously differentiable convex functions. However, Newton's method fails to converge on problems that have non-differentiable kinks.
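A minimal sketch of a subgradient iteration, here applied to the non-differentiable convex function f(x) = |x − 3| with a diminishing step size α<sub>k</sub> = 1/k; the test function, starting point, and step-size rule are illustrative assumptions, not prescribed by the method itself:

<syntaxhighlight lang="python">
# Sketch of the subgradient method on f(x) = |x - 3| (illustrative example).
# Iteration: x_{k+1} = x_k - alpha_k * g_k, where g_k is any subgradient of f
# at x_k and alpha_k is a diminishing step size.

def f(x):
    return abs(x - 3.0)

def subgradient(x):
    # An element of the subdifferential of |x - 3|: the sign away from the
    # kink at x = 3, and 0 (a valid choice) exactly at the kink.
    if x > 3.0:
        return 1.0
    elif x < 3.0:
        return -1.0
    return 0.0

x = 0.0                      # starting point (assumed)
best_x, best_f = x, f(x)     # subgradient steps need not decrease f,
                             # so track the best iterate found so far
for k in range(1, 1001):
    alpha = 1.0 / k          # diminishing, non-summable step size
    x = x - alpha * subgradient(x)
    if f(x) < best_f:
        best_x, best_f = x, f(x)

print(best_x, best_f)        # best_x approaches the minimizer x = 3
</syntaxhighlight>

Because the negative subgradient is not necessarily a descent direction, the sketch keeps the best objective value seen so far rather than relying on the final iterate.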