'''Subgradient methods''' are [[algorithm|algorithms]] for solving [[convex optimization]] problems. Originally developed by [[Naum Z. Shor]] and others in the 1960s and 1970s, subgradient methods can be used with a non-differentiable objective function. When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of [[gradient descent|steepest descent]].
Although subgradient methods can be much slower than [[interior-point methods]] and [[Newton's method in optimization|Newton's method]] in practice, they can be immediately applied to a far wider variety of problems and require much less memory. Moreover, by combining the subgradient method with primal or dual decomposition techniques, it is sometimes possible to develop a simple distributed algorithm for a problem.
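One standard form of the iteration described above is x<sub>k+1</sub> = x<sub>k</sub> − α<sub>k</sub> g<sub>k</sub>, where g<sub>k</sub> is any subgradient of the objective at x<sub>k</sub> and α<sub>k</sub> is a diminishing step size. The following is a minimal Python sketch of that rule (not from the article; the function names, the step rule α<sub>k</sub> = 1/(k+1), and the example problem are illustrative assumptions):

```python
def subgradient_method(f, subgrad, x0, steps=1000):
    """Minimize a convex, possibly non-differentiable f.

    Uses diminishing step sizes a_k = 1/(k+1).  Because the subgradient
    method is not a descent method, we track the best iterate seen so far.
    """
    x = x0
    best_x, best_f = x0, f(x0)
    for k in range(steps):
        g = subgrad(x)              # any subgradient of f at x
        x = x - g / (k + 1)         # x_{k+1} = x_k - a_k * g_k
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

# Example: f(x) = |x| is non-differentiable at 0; sign(x) is a subgradient.
x_best, f_best = subgradient_method(abs, lambda x: (x > 0) - (x < 0), x0=5.0)
```

With the harmonic step sizes above, the iterates overshoot the minimizer and oscillate around it with shrinking amplitude, which is why the best iterate, rather than the last one, is returned.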
* {{cite book
| last = Bertsekas
 | first = Dimitri P.
| authorlink = Dimitri P. Bertsekas
| title = Nonlinear Programming
| publisher = Athena Scientific
| date = 1999
 | location = Cambridge, MA
| isbn = 1-886529-00-0
}}
* {{cite book
| last = Shor
 | first = Naum Z.
| authorlink = Naum Z. Shor
| title = Minimization Methods for Non-differentiable Functions
| publisher = [[Springer-Verlag]]
| isbn = 0-387-12763-1
| date = 1985
}}