Subgradient method

{{Short description|Concept in convex optimization}}
'''Subgradient methods''' are [[convex optimization]] methods which use [[Subderivative|subderivatives]]. Originally developed by [[Naum Z. Shor]] and others in the 1960s and 1970s, subgradient methods converge even when applied to a non-differentiable objective function. When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of [[gradient descent]].
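
When the objective need not be differentiable, the basic update replaces the gradient in gradient descent with an arbitrary subgradient. The sketch below illustrates this on the non-differentiable convex function <math>f(x) = |x_1| + |x_2|</math> with a diminishing step size <math>\alpha_k = 1/k</math>; the objective, starting point, step-size rule, and iteration count are illustrative choices, not part of the method itself.

<syntaxhighlight lang="python">
import numpy as np

def f(x):
    # Non-differentiable convex objective: f(x) = |x_1| + |x_2| (illustrative choice).
    return np.sum(np.abs(x))

def a_subgradient(x):
    # np.sign(x) is one valid subgradient of f at x; at 0 it returns 0, which lies in [-1, 1].
    return np.sign(x)

x = np.array([3.0, -2.0])        # arbitrary starting point
f_best = f(x)
for k in range(1, 1001):
    g = a_subgradient(x)
    alpha = 1.0 / k              # diminishing, non-summable step sizes
    x = x - alpha * g
    f_best = min(f_best, f(x))   # the objective value need not decrease monotonically

print(f_best)                    # approaches the minimum value f(0, 0) = 0
</syntaxhighlight>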
 
Subgradient methods are slower than Newton's method when applied to minimize twice continuously differentiable convex functions. However, Newton's method fails to converge on problems that have non-differentiable kinks.
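
As an illustrative sketch of this failure mode (the function and starting point are arbitrary choices), consider the convex function <math>f(x) = |x| + x^2/2</math>, which has a kink at its minimizer <math>x = 0</math>. Away from the kink the Newton step is <math>x - f'(x)/f''(x) = x - (\operatorname{sgn}(x) + x) = -\operatorname{sgn}(x)</math>, so the iterates oscillate between <math>+1</math> and <math>-1</math> and never converge, whereas a subgradient method with diminishing step sizes does converge.

<syntaxhighlight lang="python">
import numpy as np

def newton_step(x):
    # Newton update for f(x) = |x| + x**2 / 2, valid wherever x != 0.
    grad = np.sign(x) + x   # f'(x) for x != 0
    hess = 1.0              # f''(x) for x != 0
    return x - grad / hess

x = 2.0                      # arbitrary starting point
for _ in range(6):
    x = newton_step(x)
    print(x)                 # prints -1.0, 1.0, -1.0, 1.0, ...: the iterates cycle
</syntaxhighlight>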