{{Short description|Optimizing objective functions that have constrained variables}}
In [[mathematical optimization]], '''constrained optimization''' (in some contexts called '''constraint optimization''') is the process of optimizing an objective function with respect to some [[variable (mathematics)|variables]] in the presence of [[Constraint (mathematics)|constraints]] on those variables. The objective function is either a [[Loss function|cost function]] or [[energy function]], which is to be [[Maxima and minima|minimized]], or a [[reward function]] or [[utility function]], which is to be [[maximize]]d. Constraints can be either '''hard constraints''', which set conditions on the variables that are required to be satisfied, or '''soft constraints''', which penalize the objective function when, and to the extent that, the conditions on the variables are not satisfied.
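For example, a hard constraint such as <math>x \ge 0</math> must hold in every acceptable solution, whereas a soft version of the same condition can instead be folded into the objective as a penalty term, as in
: <math>\min_{x} \; f(x) + \lambda \max(0,\, -x)^2 ,</math>
where the weight <math>\lambda > 0</math> is an arbitrary illustrative choice controlling how strongly violations are penalized.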
==Relation to constraint-satisfaction problems==
The constrained-optimization problem (COP) is a significant generalization of the classic [[constraint-satisfaction problem]] (CSP) model.<ref>{{Citation|
==General form==
A general constrained minimization problem may be written as follows:<ref name="edo2021">{{Cite book|url=https://www.researchgate.net/publication/352413464|title=Engineering Design Optimization|last1=Martins|first1=J. R. R. A.|last2=Ning|first2=A.|date=2021|publisher=Cambridge University Press|isbn=978-1108833417|language=en}}</ref>
: <math>
\begin{array}{rcll}
 \min &~& f(\mathbf{x}) & \\
 \mathrm{subject~to} &~& g_i(\mathbf{x}) = c_i &\text{for } i=1,\ldots,n \quad \text{Equality constraints} \\
 &~& h_j(\mathbf{x}) \geq d_j &\text{for } j=1,\ldots,m \quad \text{Inequality constraints}
\end{array}
</math>
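For instance (a small illustrative example, not drawn from a particular application), minimizing <math>f(\mathbf{x}) = x_1^2 + x_2^2</math> subject to the equality constraint <math>g_1(\mathbf{x}) = x_1 + x_2 = 1</math> and the inequality constraint <math>h_1(\mathbf{x}) = x_1 \geq 0</math> is a problem of this form with <math>n = m = 1</math>; its solution is <math>x_1 = x_2 = \tfrac{1}{2}</math>.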
==Solution methods==
Many methods exist for solving constrained optimization problems; which one is appropriate depends on whether the constraints are equalities or inequalities and on the form of the objective function.
===Equality constraints===
====Lagrange multiplier====
{{main|Lagrange multipliers}}
If the constrained problem has only equality constraints, the method of [[Lagrange multipliers]] can be used to convert it into an unconstrained problem whose number of variables is the original number of variables plus the original number of equality constraints. Alternatively, if the constraints are all equality constraints and are all linear, they can be solved for some of the variables in terms of the others, and the former can be substituted out of the objective function, leaving an unconstrained problem in a smaller number of variables.
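As a brief worked example of the method (a standard textbook illustration, not tied to this article's references), minimizing <math>f(x,y) = x^2 + y^2</math> subject to <math>x + y = 1</math> uses the Lagrangian
: <math>\Lambda(x, y, \lambda) = x^2 + y^2 + \lambda\,(x + y - 1).</math>
Setting its partial derivatives to zero gives <math>2x + \lambda = 0</math>, <math>2y + \lambda = 0</math> and <math>x + y = 1</math>, so <math>x = y = \tfrac{1}{2}</math> and <math>\lambda = -1</math>: the constrained minimizer is recovered from an unconstrained stationarity condition on <math>\Lambda</math>.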
When inequality constraints are also present, the [[Karush-Kuhn-Tucker conditions|KKT approach]] to nonlinear programming generalizes the method of Lagrange multipliers. It applies when the objective and constraint functions are differentiable; under suitable convexity assumptions, the KKT conditions are also sufficient for optimality.
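In practice such problems are often handed to a numerical solver. The following is a minimal sketch using SciPy's SLSQP routine, a sequential quadratic programming method built around the KKT conditions; the toy problem and the choice of solver are illustrative assumptions, not prescribed by the methods above.

<syntaxhighlight lang="python">
# Illustrative example: minimize x0^2 + x1^2 subject to one equality and one
# inequality constraint, using SciPy's SLSQP solver.
from scipy.optimize import minimize

objective = lambda x: x[0]**2 + x[1]**2

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 1.0},  # x0 + x1 = 1
    {"type": "ineq", "fun": lambda x: x[0] - 0.2},          # x0 - 0.2 >= 0
]

result = minimize(objective, x0=[0.0, 0.0], method="SLSQP", constraints=constraints)
print(result.x)  # approximately [0.5, 0.5]
</syntaxhighlight>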
====Branch and bound====
On the other hand, this estimated cost cannot be lower than the effective cost that can be obtained by extending the solution, as otherwise the algorithm could discard a partial solution even though one of its extensions is better than the best solution found so far. As a result, the algorithm requires an upper bound on the cost that can be obtained from extending a partial solution, and this upper bound should be as tight (small) as possible.
A variation of this approach called Hansen's method uses [[Interval arithmetic#History|interval methods]].<ref>{{cite book |last=Leader |first=Jeffery J. |title=Numerical Analysis and Scientific Computation |year=2004 |publisher=Addison Wesley}}</ref>
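The pruning rule described above can be sketched as follows (a generic depth-first formulation for maximization over finite domains; the functions <code>objective</code> and <code>upper_bound</code> are hypothetical placeholders supplied by the user, not defined in this article).

<syntaxhighlight lang="python">
# Illustrative branch-and-bound sketch.  upper_bound(assignment) must never
# underestimate the best objective value reachable by extending the partial
# assignment; the tighter it is, the more branches can be pruned.
def branch_and_bound(variables, domains, objective, upper_bound):
    best_value, best_assignment = float("-inf"), None

    def search(assignment):
        nonlocal best_value, best_assignment
        if len(assignment) == len(variables):        # complete assignment: evaluate it
            value = objective(assignment)
            if value > best_value:
                best_value, best_assignment = value, dict(assignment)
            return
        if upper_bound(assignment) <= best_value:    # bound: no extension can improve
            return
        var = variables[len(assignment)]             # branch on the next variable
        for value in domains[var]:
            assignment[var] = value
            search(assignment)
            del assignment[var]

    search({})
    return best_assignment, best_value
</syntaxhighlight>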
====First-choice bounding functions====
=====Russian doll search=====
This method<ref>Verfaillie, Gérard, Michel Lemaître, and Thomas Schiex. "[https://web.archive.org/web/20180616030142/https://pdfs.semanticscholar.org/c83b/19ca9cc73aefb1a9e7b4780ba161b2149a03.pdf Russian doll search for solving constraint optimization problems]." AAAI/IAAI, Vol. 1. 1996.</ref> runs a branch-and-bound algorithm on <math>n</math> problems, where <math>n</math> is the number of variables. Each such problem is the subproblem obtained by dropping a sequence of variables <math>x_1,\ldots,x_i</math> from the original problem, along with the constraints containing them. After the problem on variables <math>x_{i+1},\ldots,x_n</math> is solved, its optimal cost can be used as an upper bound while solving the other problems.
In particular, the cost estimate of a solution having <math>x_{i+1},\ldots,x_n</math> as unassigned variables is added to the cost that derives from the evaluated variables. In effect, this corresponds to ignoring the evaluated variables and solving the problem on the unassigned ones, except that the latter problem has already been solved. More precisely, the cost of soft constraints containing both assigned and unassigned variables is estimated as above (or using another arbitrary method); the cost of soft constraints containing only unassigned variables is instead estimated using the optimal solution of the corresponding problem, which is already known at this point.
There is similarity between the Russian Doll Search method and [[dynamic programming]]. Like dynamic programming, Russian Doll Search solves sub-problems in order to solve the whole problem. However, whereas dynamic programming directly combines the results obtained on sub-problems to get the result of the whole problem, Russian Doll Search only uses them as bounds during its search.
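The overall control structure can be sketched as follows; <code>solve_suffix</code> is a hypothetical placeholder for a branch-and-bound routine restricted to the given variables that consults the previously computed optima as bounds.

<syntaxhighlight lang="python">
# Illustrative Russian doll search: solve the subproblem on the last variable,
# then on the last two, and so on; each optimum informs the bounds used when
# solving the next, larger subproblem.
def russian_doll_search(variables, solve_suffix):
    suffix_optima = {}                      # start index -> optimal cost of that suffix problem
    for i in range(len(variables) - 1, -1, -1):
        suffix_optima[i] = solve_suffix(variables[i:], suffix_optima)
    return suffix_optima[0]                 # optimum of the original, full problem
</syntaxhighlight>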
==See also==
* [[Constrained least squares]]
* [[Distributed constraint optimization]]
* [[Constraint satisfaction problem]]
* [[Constraint programming]]
* [[Integer programming]]
* [[Metric projection]]
* [[Penalty method]]
* [[Superiorization]]
==References==
{{Reflist}}
==Further reading==
*{{cite book |first=Dimitri P. |last=Bertsekas |title=Constrained Optimization and Lagrange Multiplier Methods |publisher=Academic Press |year=1982}}
*{{cite book
 | first=Rina
 | last=Dechter
 | title=Constraint Processing
 | publisher=Morgan Kaufmann
 | year=2003
 | url-access=registration
}}
{{Optimization algorithms}}
[[Category:Mathematical optimization]]