Constrained optimization
=====Russian doll search=====
 
This method<ref>Verfaillie, Gérard, Michel Lemaître, and Thomas Schiex. "[https://web.archive.org/web/20180616030142/https://pdfs.semanticscholar.org/c83b/19ca9cc73aefb1a9e7b4780ba161b2149a03.pdf Russian doll search for solving constraint optimization problems]." AAAI/IAAI, Vol. 1. 1996.</ref> runs a branch-and-bound algorithm on <math>n</math> problems, where <math>n</math> is the number of variables. Each such problem is the subproblem obtained by dropping a sequence of variables <math>x_1,\ldots,x_i</math> from the original problem, along with the constraints containing them. After the problem on variables <math>x_{i+1},\ldots,x_n</math> is solved, its optimal cost can be used as a lower bound on the cost contributed by those variables, which tightens the cost estimates used for pruning while solving the other problems.
 
In particular, the cost estimate of a solution having <math>x_{i+1},\ldots,x_n</math> as unassigned variables is added to the cost that derives from the evaluated variables. In effect, this corresponds to ignoring the evaluated variables and solving the problem on the unassigned ones, except that the latter problem has already been solved. More precisely, the cost of soft constraints containing both assigned and unassigned variables is estimated as above (or by any other method); the cost of soft constraints containing only unassigned variables is instead estimated using the optimal solution of the corresponding subproblem, which is already known at this point.
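The scheme above can be sketched in Python on a hypothetical toy instance (the variable count, domains, and constraint costs below are made up for illustration, not taken from the cited paper). The sketch solves the suffix subproblems on <math>x_i,\ldots,x_n</math> from the smallest outward, and each branch-and-bound run prunes using the cost of the assigned variables plus the already-computed optimal cost of the subproblem on the unassigned ones; constraints joining assigned and unassigned variables are estimated at zero, a valid lower bound when costs are non-negative.

```python
from math import inf

# Hypothetical toy instance: three 0/1 variables with binary soft
# constraints given as cost functions on index pairs (a, b), a < b.
n = 3
domain = [0, 1]
constraints = {
    (0, 1): lambda u, v: 1 if u == v else 0,
    (1, 2): lambda u, v: 2 if u == v else 0,
    (0, 2): lambda u, v: 1 if u != v else 0,
}

def russian_doll_search(n, domain, constraints):
    # opt[i] holds the optimal cost of the subproblem on x_i, ..., x_{n-1},
    # i.e. the original problem with x_0, ..., x_{i-1} (and the constraints
    # containing them) dropped; opt[n] is the empty problem, cost 0.
    opt = [0] * (n + 1)
    solution = None
    for i in range(n - 1, -1, -1):
        best, best_sol = inf, None

        def branch(assignment, cost):
            nonlocal best, best_sol
            j = i + len(assignment)  # next variable to assign
            # Prune with: cost so far + optimal cost of the already-solved
            # subproblem on the unassigned variables x_j, ..., x_{n-1}.
            # Constraints spanning assigned and unassigned variables are
            # estimated at 0, a valid lower bound for non-negative costs.
            if cost + opt[j] >= best:
                return
            if j == n:
                best, best_sol = cost, list(assignment)
                return
            for value in domain:
                # cost of constraints completed by assigning x_j = value
                extra = sum(f(assignment[a - i], value)
                            for (a, b), f in constraints.items()
                            if b == j and a >= i)
                branch(assignment + [value], cost + extra)

        branch([], 0)
        opt[i] = best          # reused as a bound in later iterations
        solution = best_sol
    return opt[0], solution

cost, sol = russian_doll_search(n, domain, constraints)
print(cost, sol)
```

For this instance the subproblem on <math>x_2</math> alone is trivially solved first, its cost bounds the search on <math>x_1, x_2</math>, and so on; only the final iteration explores the full problem, with the earlier optima pruning most of its branches.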