{{short description|Subfield of mathematical optimization}}
'''Convex optimization''' is a subfield of [[mathematical optimization]] that studies the problem of minimizing [[convex function]]s over [[convex set]]s (or, equivalently, maximizing [[concave functions]] over convex sets). Many classes of convex optimization problems admit polynomial-time algorithms,<ref name="Nesterov 1994">{{harvnb|Nesterov|Nemirovskii|1994}}</ref> whereas mathematical optimization is in general [[NP-hard]].<ref>
{{cite journal | last1 = Murty | first1 = Katta | last2 = Kabadi | first2 = Santosh | title = Some NP-complete problems in quadratic and nonlinear programming | journal = Mathematical Programming | volume = 39 | issue = 2 | pages = 117–129 | year = 1987 | doi = 10.1007/BF02592948| hdl = 2027.42/6740 | s2cid = 30500771 | hdl-access = free}}</ref><ref>Sahni, S. "Computationally related problems", ''SIAM Journal on Computing'', 3, 262–279, 1974.</ref><ref>[https://link.springer.com/article/10.1007/BF00120662 Quadratic programming with one negative eigenvalue is NP-hard], Panos M. Pardalos and Stephen A. Vavasis, ''Journal of Global Optimization'', Volume 1, Number 1, 1991, pp. 15–22.</ref> With recent advancements in computing and [[Mathematical optimization#Computational optimization techniques|optimization algorithms]], convex programming is nearly as straightforward as [[linear programming]].<ref>{{harvnb|Boyd|Vandenberghe|2004|p=8}}</ref>
==Definition==
A convex optimization problem is in ''standard form'' if it is written as

:<math>\begin{align}
&\underset{\mathbf{x}}{\operatorname{minimize}}& & f(\mathbf{x}) \\
&\operatorname{subject\ to}
& &g_i(\mathbf{x}) \leq 0, \quad i = 1, \dots, m \\
&&&h_i(\mathbf{x}) = 0, \quad i = 1, \dots, p,
\end{align}</math>

where <math>\mathbf{x} \in \mathbb{R}^n</math> is the optimization variable, the objective function <math>f</math> and the inequality constraint functions <math>g_1, \ldots, g_m</math> are [[convex function|convex]], and the equality constraint functions <math>h_1, \ldots, h_p</math> are [[affine function|affine]].
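For illustration, a problem in the standard form above can be stated directly in a convex-optimization modeling system. The following is a minimal sketch, assuming the Python library CVXPY is available; the particular functions <math>f</math>, <math>g_1</math>, and <math>h_1</math> are arbitrary convex and affine choices made only for this example.

<syntaxhighlight lang="python">
# Minimal sketch of a convex problem in standard form, using CVXPY (https://www.cvxpy.org).
# The specific f, g_1, h_1 below are arbitrary convex/affine choices made for illustration.
import cvxpy as cp
import numpy as np

n = 3
x = cp.Variable(n)                   # optimization variable x in R^n

f = cp.sum_squares(x - np.ones(n))   # convex objective  f(x) = ||x - 1||^2
g1 = cp.norm(x, 2) - 2.0             # convex inequality g_1(x) = ||x||_2 - 2 <= 0
h1 = cp.sum(x) - 1.0                 # affine equality   h_1(x) = 1^T x - 1 = 0

problem = cp.Problem(cp.Minimize(f), [g1 <= 0, h1 == 0])
problem.solve()                      # CVXPY checks convexity (DCP rules) and calls a solver

print(problem.status, problem.value) # e.g. "optimal" and the minimum value of f
print(x.value)                       # an optimal point
</syntaxhighlight>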
The following problem classes are all convex optimization problems, or can be reduced to convex optimization problems via simple transformations (a short modeling sketch of one such reduction follows the list below):<ref name="bv4" /><ref name="rewriting">{{cite journal | last1 = Agrawal | first1 = Akshay | last2 = Verschueren | first2 = Robin | last3 = Diamond | first3 = Steven | last4 = Boyd | first4 = Stephen | title = A rewriting system for convex optimization problems | journal = Journal of Control and Decision | volume = 5 | issue = 1 | pages = 42–60 | year = 2018}}</ref>
[[File:Hierarchy compact convex.png|thumb|A hierarchy of convex optimization problems. (LP: linear program, QP: quadratic program, SOCP: second-order cone program, SDP: semidefinite program, CP: cone program.)]]
*[[Semidefinite programming]]
*[[Entropy maximization]] with appropriate constraints
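As a small illustration of such a reduction (an editorial sketch with placeholder data, assuming the Python library CVXPY), the non-smooth convex problem of minimizing <math>\|Ax - b\|_1</math> can be rewritten as a linear program by introducing auxiliary variables:

<syntaxhighlight lang="python">
# Sketch: minimizing ||Ax - b||_1 reduces to a linear program via auxiliary variables t:
#     minimize 1^T t   subject to   -t <= Ax - b <= t.
# The data A, b are random placeholders used only for illustration.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

x = cp.Variable(5)
t = cp.Variable(20)
lp = cp.Problem(cp.Minimize(cp.sum(t)),
                [A @ x - b <= t, -(A @ x - b) <= t])
lp.solve()

# The LP attains the same optimal value as the original l1-norm problem.
direct = cp.Problem(cp.Minimize(cp.norm(A @ x - b, 1)))
print(lp.value, direct.solve())
</syntaxhighlight>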
==Properties==
The following are useful properties of convex optimization problems:<ref name="rockafellar93">{{cite journal | author = Rockafellar, R. Tyrrell | title = Lagrange multipliers and optimality | journal = SIAM Review | volume = 35 | issue = 2 | year = 1993 | pages = 183–238 |url = http://web.williams.edu/Mathematics/sjmiller/public_html/105Sp10/handouts/Rockafellar_LagrangeMultAndOptimality.pdf | doi=10.1137/1035044| citeseerx = 10.1.1.161.7209}}</ref><ref name="bv4"/>
* every [[local minimum]] is a [[global minimum]];
* the optimal set is convex;
* if the objective function is ''strictly'' convex, then the problem has at most one optimal point.

These results are used by the theory of convex minimization along with geometric notions from [[functional analysis]] (in Hilbert spaces) such as the [[Hilbert projection theorem]], the [[separating hyperplane theorem]], and [[Farkas' lemma]].{{citation needed|date=July 2022}}
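These properties can be illustrated numerically. The sketch below (an editorial example with arbitrary data, using only the Python library NumPy) runs gradient descent on a strictly convex quadratic from several starting points; every run reaches the same point, consistent with each local minimum being the unique global minimum.

<syntaxhighlight lang="python">
# Sketch: for the strictly convex quadratic f(x) = 1/2 x^T Q x - c^T x with Q positive
# definite, gradient descent from very different starting points converges to the same
# minimizer, illustrating that every local minimum is the (unique) global minimum.
import numpy as np

Q = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite -> strictly convex
c = np.array([1.0, -1.0])

def grad(x):
    return Q @ x - c                     # gradient of f

for x0 in [np.array([10.0, 10.0]), np.array([-50.0, 3.0]), np.zeros(2)]:
    x = x0.copy()
    for _ in range(10_000):
        x -= 0.1 * grad(x)               # fixed step size, small enough for this Q
    print(x)                             # each run prints (approximately) the same point

print(np.linalg.solve(Q, c))             # the unique minimizer Q^{-1} c, for comparison
</syntaxhighlight>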
== Practical applications ==
Convex optimization has applications in a wide range of disciplines, such as automatic [[control systems]], estimation and [[signal processing]], communications and networks, electronic [[circuit design]],<ref>{{harvnb|Boyd|Vandenberghe|2004|p=17}}</ref> data analysis and modeling, [[finance]], [[statistics]] ([[optimal design|optimal experimental design]]),<ref>Christensen/Klarbring, chpt. 4.</ref> and [[structural optimization]], where the approximation concept has proven to be efficient.<ref>{{harvnb|Boyd|Vandenberghe|2004}}</ref><ref>Schmit, L.A.; Fleury, C. 1980: ''Structural synthesis by combining approximation concepts and dual methods''. J. Amer. Inst. Aeronaut. Astronaut 18, 1252–1260.</ref> Convex optimization has practical applications for the following fields:
* [[Portfolio optimization]].<ref name=":0">{{Cite web|last1=Boyd|first1=Stephen|last2=Diamond|first2=Stephen|last3=Zhang|first3=Junzi|last4=Agrawal|first4=Akshay|title=Convex Optimization Applications|url=https://web.stanford.edu/~boyd/papers/pdf/cvx_applications.pdf|url-status=live|access-date=12 Apr 2021|archive-url=https://web.archive.org/web/20151001185038/http://web.stanford.edu/~boyd/papers/pdf/cvx_applications.pdf |archive-date=2015-10-01 }}</ref>
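A minimal modeling sketch of a basic long-only mean-variance (Markowitz) portfolio problem, assuming the Python library CVXPY and using made-up expected-return and covariance data:

<syntaxhighlight lang="python">
# Sketch of a basic Markowitz portfolio problem: maximize expected return minus a risk
# penalty, with non-negative weights that sum to one.  mu and Sigma are made-up data.
import cvxpy as cp
import numpy as np

mu = np.array([0.08, 0.10, 0.12])            # expected returns (placeholder data)
Sigma = np.array([[0.09, 0.01, 0.00],
                  [0.01, 0.16, 0.02],
                  [0.00, 0.02, 0.25]])       # return covariance (placeholder data)
gamma = 5.0                                  # risk-aversion parameter

w = cp.Variable(3)                           # portfolio weights
objective = cp.Maximize(mu @ w - gamma * cp.quad_form(w, Sigma))
constraints = [cp.sum(w) == 1, w >= 0]       # fully invested, long-only

cp.Problem(objective, constraints).solve()
print(w.value)                               # optimal weights
</syntaxhighlight>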