Constrained optimization

{{Short description|Optimizing objective functions that have constrained variables}}
In [[mathematical optimization]], '''constrained optimization''' (in some contexts called '''constraint optimization''') is the process of optimizing an objective function with respect to some [[variable (mathematics)|variables]] in the presence of [[Constraint (mathematics)|constraints]] on those variables. The objective function is either a [[Loss function|cost function]] or [[energy function]], which is to be [[Maxima and minima|minimized]], or a [[reward function]] or [[utility function]], which is to be [[maximize]]d. Constraints can be either '''hard constraints''', which set conditions for the variables that are required to be satisfied, or '''soft constraints''', under which some variable values are penalized in the objective function if, and to the extent that, the conditions on the variables are not satisfied.
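
For illustration, the following sketch contrasts a hard constraint, which simply excludes infeasible values, with a soft version of the same condition implemented as a penalty term added to the objective. The objective <math>f(x)=(x-3)^2</math>, the condition <math>x \le 2</math>, and the penalty weight are arbitrary choices made for this example, not taken from any standard formulation.

<syntaxhighlight lang="python">
# Toy example: minimize f(x) = (x - 3)^2 over a coarse grid,
# treating the condition x <= 2 first as a hard constraint and
# then as a soft constraint (a quadratic penalty on violations).

def f(x):
    return (x - 3.0) ** 2

def f_soft(x, weight=10.0):
    # Soft constraint: x > 2 is allowed, but penalized in proportion
    # to the squared amount by which the condition is violated.
    violation = max(0.0, x - 2.0)
    return f(x) + weight * violation ** 2

candidates = [i / 100.0 for i in range(-500, 501)]  # grid on [-5, 5]

# Hard constraint: infeasible points are excluded outright.
x_hard = min((x for x in candidates if x <= 2.0), key=f)

# Soft constraint: every point is admissible, but violations cost extra.
x_soft = min(candidates, key=f_soft)

print(x_hard, x_soft)  # x_hard == 2.0; x_soft lies slightly above 2.0
</syntaxhighlight>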
 
== Relation to constraint-satisfaction problems ==
 
The constrained-optimization problem (COP) is a significant generalization of the classic [[constraint-satisfaction problem]] (CSP) model.<ref>{{Citation|last1=Rossi|first1=Francesca|title=Chapter 1 – Introduction|date=2006-01-01|url=http://www.sciencedirect.com/science/article/pii/S1574652606800052|work=Foundations of Artificial Intelligence|volume=2|pages=3–12|editor-last=Rossi|editor-first=Francesca|series=Handbook of Constraint Programming|publisher=Elsevier|doi=10.1016/s1574-6526(06)80005-2|access-date=2019-10-04|last2=van Beek|first2=Peter|last3=Walsh|first3=Toby|editor2-last=van Beek|editor2-first=Peter|editor3-last=Walsh|editor3-first=Toby|url-access=subscription}}</ref> COP is a CSP that includes an ''objective function'' to be optimized. Many algorithms are used to handle the optimization part.
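
As a sketch of this relationship, the following example (with made-up variables, domains, constraints and objective) enumerates a small finite-domain problem by brute force: the constraints alone form an ordinary CSP, and adding the objective turns the feasible assignments into candidates to be optimized.

<syntaxhighlight lang="python">
from itertools import product

# Toy finite-domain problem (all names and values are made up):
# the constraints alone form a CSP; adding the objective makes it a COP.
domains = {"x": range(5), "y": range(5), "z": range(5)}

constraints = [
    lambda a: a["x"] + a["y"] == 4,   # equality constraint
    lambda a: a["y"] <= a["z"],       # inequality constraint
]

def objective(a):
    return a["x"] ** 2 + a["y"] ** 2 + a["z"] ** 2

# Brute-force search: collect every assignment satisfying all constraints,
# then pick the feasible assignment with the smallest objective value.
names = list(domains)
feasible = []
for values in product(*(domains[n] for n in names)):
    assignment = dict(zip(names, values))
    if all(check(assignment) for check in constraints):
        feasible.append(assignment)

best = min(feasible, key=objective)
print(best, objective(best))  # {'x': 3, 'y': 1, 'z': 1} 11
</syntaxhighlight>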
 
==General form==
 
A general constrained minimization problem may be written as follows:<ref name="edo2021">{{Cite book|url=https://www.researchgate.net/publication/352413464_Engineering_Design_Optimization|title=Engineering Design Optimization|last1=Martins|first1=J. R. R. A.|last2=Ning|first2=A.|date=2021|publisher=Cambridge University Press|isbn=978-1108833417|language=en}}</ref>
 
: <math>
\begin{array}{rcll}
 \min &~& f(\mathbf{x}) & \\
 \mathrm{subject~to} &~& g_i(\mathbf{x}) = c_i &\text{for } i=1,\ldots,n \quad \text{Equality constraints} \\
 &~& h_j(\mathbf{x}) \geq d_j &\text{for } j=1,\ldots,m \quad \text{Inequality constraints}
\end{array}
</math>
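
A small numerical instance of this form can be solved, for example, with SciPy's SLSQP solver. In the sketch below the functions <math>f</math>, <math>g_1</math>, <math>h_1</math> and the constants <math>c_1</math>, <math>d_1</math> are arbitrary illustrative choices; SLSQP expects inequality constraints in the form <math>\text{fun}(\mathbf{x}) \ge 0</math>, so <math>h_1(\mathbf{x}) \ge d_1</math> is passed as <math>h_1(\mathbf{x}) - d_1 \ge 0</math>.

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

# Illustrative instance of the general form above (all functions are made up):
#   minimize    f(x) = x0^2 + x1^2
#   subject to  g1(x) = x0 + x1 = 1       (equality,   c1 = 1)
#               h1(x) = x0     >= 0.25    (inequality, d1 = 0.25)
def f(x):
    return x[0] ** 2 + x[1] ** 2

constraints = [
    # SLSQP conventions: 'eq' means fun(x) == 0, 'ineq' means fun(x) >= 0.
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 1.0},  # g1(x) - c1 = 0
    {"type": "ineq", "fun": lambda x: x[0] - 0.25},         # h1(x) - d1 >= 0
]

result = minimize(f, x0=np.array([0.0, 0.0]), method="SLSQP", constraints=constraints)
print(result.x)  # approximately [0.5, 0.5], where both constraints hold
</syntaxhighlight>

SLSQP is used here only because it accepts both equality and inequality constraints; other constrained solvers take problems in essentially the same general form.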

==Solution methods==

===Equality constraints===
====Lagrange multiplier====
{{main|Lagrange multipliers}}
If the constrained problem has only equality constraints, the method of [[Lagrange multipliers]] can be used to convert it into an unconstrained problem whose number of variables is the original number of variables plus the original number of equality constraints. Alternatively, if the constraints are all equality constraints and are all linear, they can be solved for some of the variables in terms of the others, and the former can be substituted out of the objective function, leaving an unconstrained problem in a smaller number of variables.
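
As a sketch of this idea, the following example (the objective and constraint are arbitrary choices) uses SymPy to form the Lagrangian of a problem in two variables with one equality constraint, giving an unconstrained stationarity condition in 2 + 1 = 3 variables:

<syntaxhighlight lang="python">
from sympy import symbols, diff, solve

# Made-up example: minimize f(x, y) = x^2 + y^2
# subject to the single equality constraint g(x, y) = x + y - 1 = 0.
x, y, lam = symbols("x y lambda", real=True)

f = x**2 + y**2
g = x + y - 1

# Lagrangian: an unconstrained function of the original two variables
# plus one multiplier per equality constraint (three variables in total).
L = f + lam * g

# Setting all partial derivatives of L to zero recovers the constrained minimizer.
stationary = solve([diff(L, v) for v in (x, y, lam)], [x, y, lam], dict=True)
print(stationary)  # [{x: 1/2, y: 1/2, lambda: -1}]
</syntaxhighlight>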
 
===Inequality constraints===

==See also==
* [[Constraint programming]]
* [[Integer programming]]
* [[Metric projection]]
* [[Penalty method]]
* [[Superiorization]]
 
{{Optimization algorithms}}
 
[[Category:Mathematical optimization]]