=== Abstract form ===
A convex optimization problem is defined by two ingredients:<ref>{{cite book |last1=Hiriart-Urruty |first1=Jean-Baptiste |url=https://books.google.com/books?id=Gdl4Jc3RVjcC&q=lemarechal+convex+analysis+and+minimization |title=Convex analysis and minimization algorithms: Fundamentals |last2=Lemaréchal |first2=Claude |year=1996 |isbn=9783540568506 |page=291|publisher=Springer }}</ref><ref>{{cite book |last1=Ben-Tal |first1=Aharon |url=https://books.google.com/books?id=M3MqpEJ3jzQC&q=Lectures+on+Modern+Convex+Optimization:+Analysis,+Algorithms, |title=Lectures on modern convex optimization: analysis, algorithms, and engineering applications |last2=Nemirovskiĭ |first2=Arkadiĭ Semenovich |year=2001 |isbn=9780898714913 |pages=335–336}}</ref>
* The ''objective function'', which is a real-valued [[convex function]] of ''n'' variables, <math>f :\mathcal D \subseteq \mathbb{R}^n \to \mathbb{R}</math>;
* The ''feasible set'', which is a [[convex set|convex subset]] <math>C \subseteq \mathbb{R}^n</math>.
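For example (an illustrative sketch only; the CVXPY modelling library and the particular instance are assumptions for illustration, not part of the definition), minimizing the convex function <math>f(x)=\|x\|_2^2</math> over the convex set <math>C=\{x \in \mathbb{R}^3 : \mathbf{1}^{\mathsf T}x = 1,\ x \ge 0\}</math> is a convex optimization problem in this abstract form:
<syntaxhighlight lang="python">
# Illustrative sketch only (CVXPY and this instance are assumed for illustration):
# minimize the convex objective f(x) = ||x||_2^2 over the convex feasible set
# C = {x : sum(x) = 1, x >= 0} (the probability simplex).
import cvxpy as cp

x = cp.Variable(3)                           # n = 3 variables
objective = cp.Minimize(cp.sum_squares(x))   # convex objective f
constraints = [cp.sum(x) == 1, x >= 0]       # convex feasible set C
problem = cp.Problem(objective, constraints)
problem.solve()

print(problem.value)  # optimal value f(x*) = 1/3
print(x.value)        # optimal point x* = (1/3, 1/3, 1/3)
</syntaxhighlight>
Any other convex objective and convex feasible set could be substituted; only the convexity of the two ingredients matters.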
*[[Subgradient method]]
*[[Drift plus penalty|Dual subgradients and the drift-plus-penalty method]]
Subgradient methods are simple to implement, so they are widely used.<ref>{{cite book |last1=Nocedal |first1=Jorge |last2=Wright |first2=Stephen J. |title=Numerical Optimization |series=Springer Series in Operations Research and Financial Engineering |publisher=Springer |year=2006 |url=https://link.springer.com/book/10.1007/978-0-387-40065-5 |doi=10.1007/978-0-387-40065-5 |isbn=978-0-387-30303-1}}</ref> Dual subgradient methods are subgradient methods applied to a [[Duality (optimization)|dual problem]]. The [[Drift plus penalty|drift-plus-penalty]] method is similar to the dual subgradient method, but takes a time average of the primal variables.{{Citation needed|date=April 2021}}
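The following is a minimal illustrative sketch of the plain subgradient iteration <math>x^{(k+1)} = x^{(k)} - \alpha_k g^{(k)}</math>, where <math>g^{(k)}</math> is any subgradient of the objective at <math>x^{(k)}</math> and <math>\alpha_k</math> is a diminishing step size; the NumPy code and the particular <math>\ell_1</math> instance are assumptions chosen only for illustration:
<syntaxhighlight lang="python">
# Illustrative sketch only: plain subgradient method with step sizes 1/k for the
# nondifferentiable convex problem  minimize f(x) = ||A x - b||_1
# (the problem data here are arbitrary).
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def f(x):
    return np.abs(A @ x - b).sum()

def subgradient(x):
    # A^T sign(A x - b) is a subgradient of f at x
    return A.T @ np.sign(A @ x - b)

x = np.zeros(5)
best_x, best_f = x, f(x)
for k in range(1, 1001):
    x = x - (1.0 / k) * subgradient(x)  # diminishing step size 1/k
    fx = f(x)
    if fx < best_f:                     # a subgradient step need not decrease f,
        best_x, best_f = x, fx          # so keep the best iterate seen so far
print(best_f, best_x)
</syntaxhighlight>
Because an individual subgradient step need not decrease the objective, the sketch keeps the best iterate found so far; dual subgradient methods apply the same kind of iteration to a dual problem instead.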
== Lagrange multipliers ==