==Algorithms==
Unconstrained convex optimization problems can be solved efficiently with [[gradient descent]] (a special case of [[Method of steepest descent|steepest descent]]) or [[Newton's method in optimization|Newton's method]], combined with a [[line search]] for an appropriate step size; both methods can be proven to converge quickly, especially Newton's method.<ref name=":2">{{Cite book|last1=Boyd|first1=Stephen|url=https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf|title=Convex Optimization|last2=Vandenberghe|first2=Lieven|publisher=[[Cambridge University Press]]|year=2004|isbn=978-0-521-83378-3|access-date=12 Apr 2021}}</ref>
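The sketch below is an illustration added here, not taken from the cited text: it shows gradient descent and a damped Newton step, each combined with a backtracking line search, on a simple smooth convex function. The test function, tolerance, and line-search constants are assumptions chosen only for the example.

<syntaxhighlight lang="python">
# Minimal sketch: gradient descent and a damped Newton step, each with a
# backtracking (Armijo) line search, on the smooth convex test function
# f(x) = log(exp(x1) + exp(-x1)) + x2^2, whose unique minimizer is (0, 0).
# The test function and all constants are illustrative assumptions.
import numpy as np

def f(x):
    return np.log(np.exp(x[0]) + np.exp(-x[0])) + x[1] ** 2

def grad_f(x):
    return np.array([np.tanh(x[0]), 2.0 * x[1]])

def hess_f(x):
    return np.diag([1.0 - np.tanh(x[0]) ** 2, 2.0])

def backtracking(x, d, g, alpha=0.3, beta=0.5):
    """Shrink the step length until the Armijo sufficient-decrease condition holds."""
    t = 1.0
    while f(x + t * d) > f(x) + alpha * t * (g @ d):
        t *= beta
    return t

def minimize(x0, newton=False, tol=1e-8, max_iter=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:   # a stationary point of a convex f is a global minimizer
            break
        # Newton direction -H^{-1} g, or the steepest-descent direction -g.
        d = -np.linalg.solve(hess_f(x), g) if newton else -g
        x = x + backtracking(x, d, g) * d
    return x

print(minimize([2.0, -1.5]))                # gradient descent
print(minimize([2.0, -1.5], newton=True))   # Newton's method, typically far fewer iterations
</syntaxhighlight>

On smooth convex problems of this kind, the Newton variant typically reaches the tolerance in far fewer iterations than plain gradient descent, which is the fast convergence referred to above.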
Convex optimization problems can also be solved by the following contemporary methods:<ref>For methods for convex minimization, see the volumes by Hiriart-Urruty and Lemaréchal (bundle) and the textbooks by [[Andrzej Piotr Ruszczyński|Ruszczyński]], [[Dimitri Bertsekas|Bertsekas]], and Boyd and Vandenberghe (interior point).</ref>
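As a concrete, hedged illustration of such iterative methods for constrained problems, the sketch below applies a projected subgradient scheme to a small nonsmooth convex problem; the choice of method, the problem data, the step-size rule, and the iteration count are assumptions made for this example and are not taken from the sources above.

<syntaxhighlight lang="python">
# Minimal sketch of a projected subgradient method for the nonsmooth
# constrained convex problem  minimize ||x - c||_1  subject to ||x||_2 <= 1.
# The problem data and step-size rule are illustrative assumptions.
import numpy as np

def project_unit_ball(x):
    """Euclidean projection onto the feasible set {x : ||x||_2 <= 1}."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def projected_subgradient(c, iters=5000):
    x = np.zeros_like(c)
    best_x, best_val = x.copy(), float(np.sum(np.abs(x - c)))
    for k in range(iters):
        g = np.sign(x - c)                                # a subgradient of ||x - c||_1 at x
        x = project_unit_ball(x - g / np.sqrt(k + 1.0))   # diminishing step size 1/sqrt(k+1)
        val = float(np.sum(np.abs(x - c)))
        if val < best_val:                    # subgradient steps need not decrease the objective,
            best_x, best_val = x.copy(), val  # so keep the best iterate seen so far
    return best_x, best_val

x_best, f_best = projected_subgradient(np.array([2.0, 0.5]))
print(x_best, f_best)
</syntaxhighlight>

Because individual subgradient steps need not decrease the objective, the sketch tracks the best iterate found so far; with a step size that tends to zero while its partial sums diverge, the best objective value converges to the optimum of the constrained problem.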
*{{Cite book|last1=Borwein|first1=Jonathan|url=https://carma.newcastle.edu.au/resources/jon/Preprints/Books/CaNo2/cano2f.pdf|title=Convex Analysis and Nonlinear Optimization: Theory and Examples, Second Edition|last2=Lewis|first2=Adrian|publisher=Springer|year=2000|access-date=12 Apr 2021}}