{{short description|Subfield of mathematical optimization}}
'''Convex optimization''' is a subfield of [[mathematical optimization]] that studies the problem of minimizing [[convex function]]s over [[convex set]]s (or, equivalently, maximizing [[concave functions]] over convex sets). Many classes of convex optimization problems admit polynomial-time algorithms,<ref name="Nesterov 1994">{{harvnb|Nesterov|Nemirovskii|1994}}</ref> whereas mathematical optimization is in general [[NP-hard]].<ref>
{{cite journal | last1 = Murty | first1 = Katta | last2 = Kabadi | first2 = Santosh | title = Some NP-complete problems in quadratic and nonlinear programming | journal = Mathematical Programming | volume = 39 | issue = 2 | pages = 117–129 | year = 1987 | doi = 10.1007/BF02592948| hdl = 2027.42/6740 | s2cid = 30500771 | hdl-access = free}}</ref><ref>{{cite journal | last = Sahni | first = Sartaj | title = Computationally related problems | journal = SIAM Journal on Computing | volume = 3 | pages = 262–279 | year = 1974}}</ref><ref>{{cite journal | url=https://link.springer.com/article/10.1007/BF00120662 | doi=10.1007/BF00120662 | title=Quadratic programming with one negative eigenvalue is NP-hard | date=1991 | last1=Pardalos | first1=Panos M. | last2=Vavasis | first2=Stephen A. | journal=Journal of Global Optimization | volume=1 | pages=15–22 | url-access=subscription }}</ref>
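The defining property above has a practical consequence: for a convex objective, any local minimum is a global minimum, so simple descent schemes suffice. As an illustrative sketch (not from the article; the function, step size, and iteration count are chosen for the example), gradient descent on the convex quadratic f(x) = (x − 2)² + 1 converges to the global minimizer x = 2:

```python
# Minimize the convex function f(x) = (x - 2)**2 + 1 by gradient descent.
# Convexity guarantees the local minimum found here is the global one.
def gradient_descent(grad, x0, step=0.1, iters=200):
    x = x0
    for _ in range(iters):
        x -= step * grad(x)  # move against the gradient
    return x

# Gradient of f is f'(x) = 2*(x - 2); the minimizer is x = 2.
x_star = gradient_descent(lambda x: 2 * (x - 2), x0=10.0)
print(x_star)  # close to 2.0
```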
 
==Definition==
*[[Subgradient method]]
*[[Drift plus penalty|Dual subgradients and the drift-plus-penalty method]]
Subgradient methods are simple to implement and are therefore widely used.<ref>{{cite book |last1=Nocedal |first1=Jorge |last2=Wright |first2=Stephen J. |date=2006 |title=Numerical Optimization |series=Springer Series in Operations Research and Financial Engineering |publisher=Springer |url=https://link.springer.com/book/10.1007/978-0-387-40065-5 |doi=10.1007/978-0-387-40065-5 |isbn=978-0-387-30303-1 |url-access=subscription }}</ref> Dual subgradient methods are subgradient methods applied to a [[Duality (optimization)|dual problem]]. The [[Drift plus penalty|drift-plus-penalty]] method is similar to the dual subgradient method, but takes a time average of the primal variables.{{Citation needed|date=April 2021}}
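As a minimal sketch of the basic (primal) subgradient method (the objective, step rule, and iteration count here are illustrative choices, not from the article), consider minimizing the nondifferentiable convex function f(x) = |x − 3|. A subgradient at any x ≠ 3 is sign(x − 3); with diminishing step sizes the best iterate converges to the minimizer, even though individual steps need not decrease f:

```python
import math

# Subgradient method for the nondifferentiable convex f(x) = |x - 3|.
# A valid subgradient at x is sign(x - 3) (and 0 at x = 3).
# Diminishing steps 1/sqrt(k+1) give convergence of the best iterate;
# unlike gradient descent, the objective need not decrease monotonically.
def subgradient_method(x0, iters=5000):
    x, best = x0, x0
    for k in range(iters):
        g = 0.0 if x == 3 else math.copysign(1.0, x - 3)
        x -= g / math.sqrt(k + 1)          # diminishing step size
        if abs(x - 3) < abs(best - 3):     # track the best iterate so far
            best = x
    return best

x_best = subgradient_method(0.0)  # approaches the minimizer x = 3
```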
 
== Lagrange multipliers ==