'''Subgradient methods''' are [[iterative method]]s for solving [[convex optimization|convex minimization]] problems. Originally developed by [[Naum Z. Shor]] and others in the 1960s and 1970s, subgradient methods converge even when applied to a non-differentiable objective function.
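In its simplest form, the method steps along an arbitrary subgradient of the objective: starting from a point <math>x^{(0)}</math>, it iterates

:<math>x^{(k+1)} = x^{(k)} - \alpha_k g^{(k)},</math>

where <math>g^{(k)} \in \partial f(x^{(k)})</math> is any [[subgradient]] of the objective <math>f</math> at <math>x^{(k)}</math> and <math>\alpha_k > 0</math> is a step size, often chosen to diminish as <math>k</math> grows.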
Subgradient methods are slower than Newton's method when applied to minimize twice continuously differentiable convex functions. However, Newton's method fails to converge on problems that have non-differentiable kinks.
In recent years, some [[interior-point methods]] have been suggested for convex minimization problems, but subgradient projection methods and related bundle methods of descent remain competitive. For convex minimization problems with a very large number of dimensions, subgradient-projection methods are suitable because they require little storage.
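The following is a minimal sketch of a subgradient-projection method, not drawn from the article's sources: it minimizes the non-differentiable convex function <math>f(x) = \|Ax - b\|_1</math> over the nonnegative orthant, with the diminishing step size <math>\alpha_k = 1/\sqrt{k}</math> chosen purely for illustration. The function name and problem instance are assumptions of this example.

<syntaxhighlight lang="python">
# Illustrative sketch (assumed example, not the article's algorithm):
# minimize f(x) = ||Ax - b||_1 subject to x >= 0
# by the projected subgradient method with step size 1/sqrt(k).
import numpy as np

def projected_subgradient(A, b, steps=5000):
    """Return the best iterate found and its objective value.

    The subgradient method is not a descent method, so we track the
    best point seen rather than returning the final iterate.
    """
    x = np.zeros(A.shape[1])
    best_x, best_f = x.copy(), np.sum(np.abs(A @ x - b))
    for k in range(1, steps + 1):
        r = A @ x - b
        g = A.T @ np.sign(r)       # a subgradient of ||Ax - b||_1 at x
        x = x - g / np.sqrt(k)     # subgradient step, diminishing step size
        x = np.maximum(x, 0.0)     # Euclidean projection onto {x : x >= 0}
        f = np.sum(np.abs(A @ x - b))
        if f < best_f:
            best_x, best_f = x.copy(), f
    return best_x, best_f

# Usage on a small random instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
x_hat, f_hat = projected_subgradient(A, b)
print(f_hat)
</syntaxhighlight>

Only the current iterate, a subgradient, and the best point found need to be stored, which is why such methods suit problems with very many dimensions.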