Subgradient method: Difference between revisions

Subgradient methods are slower than Newton's method when applied to minimize twice continuously differentiable convex functions. However, Newton's method fails to converge on problems that have non-differentiable kinks.
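As an illustration of this point, here is a minimal sketch (a hypothetical toy objective, not from the article) of the subgradient iteration on f(x) = |x − 2|: the function has a non-differentiable kink at its minimizer, where Newton's method is not even defined, yet the subgradient iteration with diminishing step sizes still converges.

```python
def subgrad(x):
    # Any element of the subdifferential of f(x) = |x - 2|.
    if x > 2.0:
        return 1.0
    if x < 2.0:
        return -1.0
    return 0.0  # at the kink, 0 is a valid subgradient

x = 10.0
best = x
for k in range(1, 5001):
    x -= (1.0 / k) * subgrad(x)      # diminishing step size 1/k
    if abs(x - 2.0) < abs(best - 2.0):
        best = x                      # the iteration is not a descent method,
                                      # so keep the best iterate seen so far
print(best)
```

Because the step sizes 1/k are square-summable-style diminishing steps, the iterates oscillate around the kink with shrinking amplitude rather than converging monotonically.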
 
In recent years, some [[interior-point methods]] have been suggested for convex minimization problems, but subgradient projection methods and related bundle methods of descent remain competitive. For convex minimization problems of enormous dimension, subgradient-projection methods are suitable because they require little storage.
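A hedged sketch of a projected subgradient method on a toy problem (the objective and data here are illustrative assumptions, not from the article): minimize the non-differentiable f(x) = ‖x − a‖₁ over the box 0 ≤ x ≤ 1. The state is just two length-n vectors, which is the small storage footprint the text refers to.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
a = rng.uniform(-0.5, 1.5, size=n)   # target vector; the minimizer is clip(a, 0, 1)

x = np.zeros(n)
for k in range(1, 2001):
    g = np.sign(x - a)               # a subgradient of ||x - a||_1
    x = x - (1.0 / k) * g            # diminishing step size 1/k
    x = np.clip(x, 0.0, 1.0)         # Euclidean projection onto the box [0, 1]^n
```

The projection step (here a simple clip) keeps every iterate feasible, and only x, a, and the current subgradient ever need to be stored, regardless of how large n becomes.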
 
Subgradient projection methods are often applied to large-scale problems using decomposition techniques. Such decomposition methods often permit a simple distributed method for a problem.
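A hedged sketch of the decomposition idea on a toy separable objective f(x) = Σᵢ |x − aᵢ| (the data are illustrative assumptions, not from the article): each term can be handled by a separate "worker" that computes only its local subgradient, and the central update needs only their sum.

```python
data = [1.0, 2.0, 2.0, 3.0, 10.0]    # illustrative data; the minimizer is a median

def local_subgrad(x, a):
    # Subgradient of |x - a|, computable by the worker holding the single datum a.
    return 1.0 if x > a else (-1.0 if x < a else 0.0)

x = 0.0
for k in range(1, 10001):
    # In a distributed setting each term below would be evaluated on a
    # different machine; only the scalar sum must be communicated.
    g = sum(local_subgrad(x, a) for a in data)
    x -= (0.5 / k) * g               # diminishing step size
print(x)
```

Because the objective is a sum of simple terms, the subgradient of the whole problem decomposes into independent per-term subgradients, which is what makes the distributed implementation straightforward.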