{{Short description|Concept in convex optimization mathematics}}
'''Subgradient methods''' are [[convex optimization]] methods which use [[Subderivative|subderivatives]]. Originally developed by [[Naum Z. Shor]] and others in the 1960s and 1970s, subgradient methods converge even when applied to a non-differentiable objective function. When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of [[gradient descent|steepest descent]].
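For a convex function <math>f : \mathbb{R}^n \to \mathbb{R}</math>, the classical subgradient iteration steps along a subgradient rather than a gradient:

:<math>x^{(k+1)} = x^{(k)} - \alpha_k g^{(k)},</math>

where <math>g^{(k)} \in \partial f\left(x^{(k)}\right)</math> is any [[Subderivative|subgradient]] of <math>f</math> at <math>x^{(k)}</math> and <math>\alpha_k > 0</math> is a step size. When <math>f</math> is differentiable, the only subgradient is the gradient itself, and the iteration reduces to gradient descent.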