In applied mathematics, '''test functions''', also known as '''artificial landscapes''', are useful to evaluate characteristics of optimization algorithms, such as [[Rate of convergence|convergence rate]], precision, robustness and general performance.
Some test functions are presented here to give an idea of the different situations that optimization algorithms must handle. In the first part, objective functions for single-objective optimization cases are presented. In the second part, test functions with their respective [[Pareto front|Pareto fronts]] for [[multi-objective optimization]] problems (MOP) are given.
The artificial landscapes presented herein for single-objective optimization problems are taken from Bäck.<ref>{{cite book|last=Bäck|first=Thomas|title=Evolutionary algorithms in theory and practice : evolution strategies, evolutionary programming, genetic algorithms|year=1995|publisher=Oxford University Press|___location=Oxford|isbn=978-0-19-509971-3}}</ref>
The test functions used to evaluate the algorithms for MOP were taken from Deb.<ref name="Deb:2002">Deb, Kalyanmoy (2002) Multiobjective optimization using evolutionary algorithms (Repr. ed.). Chichester [u.a.]: Wiley.</ref>
Only the general form of the equation, a plot of the objective function, the boundaries of the object variables, and the coordinates of the global minima are given herein.
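For illustration, the following is a minimal sketch of how such a landscape is used in practice: the Rastrigin function from the table below is minimized with an off-the-shelf local optimizer. The choice of Python with NumPy and SciPy, the Nelder–Mead method, and the random starting point are assumptions of this example, not part of the benchmark definitions.
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

def rastrigin(x, A=10.0):
    """Rastrigin function: highly multimodal; global minimum f(0, ..., 0) = 0."""
    x = np.asarray(x)
    return A * x.size + np.sum(x**2 - A * np.cos(2.0 * np.pi * x))

# Draw a starting point from the usual search ___domain [-5.12, 5.12]^n.
rng = np.random.default_rng(seed=0)
x0 = rng.uniform(-5.12, 5.12, size=2)

# A purely local method typically stalls in a nearby basin rather than
# reaching the global minimum, which is exactly the behaviour such
# landscapes are designed to expose.
result = minimize(rastrigin, x0, method="Nelder-Mead")
print(result.x, result.fun)
</syntaxhighlight>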
==Test functions for single-objective optimization==
{| class="sortable wikitable"
! Name
! Plot
! Formula
! Global minimum
! Search ___domain
|-
| [[Rastrigin function]]
| [[File:Rastrigin contour plot.svg|200px|Rastrigin function for n=2]]
|<math>f(\mathbf{x}) = A n + \sum_{i=1}^n \left[x_i^2 - A\cos(2 \pi x_i)\right]</math>
<math>\text{where: } A=10</math>
|<math>f(0, \dots, 0) = 0</math>
|<math>-5.12\le x_{i} \le 5.12 </math>
|-
| [[Ackley function]]
|
|<math>f(x,y) = -20\exp\left[-0.2\sqrt{0.5\left(x^{2}+y^{2}\right)}\right]</math>
<math>-\exp\left[0.5\left(\cos 2\pi x + \cos 2\pi y\right)\right] + e + 20</math>
|<math>f(0, 0) = 0</math>
|<math>-5\le x,y \le 5</math>
|-
| Sphere function
|
|<math>f(\boldsymbol{x}) = \sum_{i=1}^{n} x_{i}^{2}</math>
|<math>f(0, \dots, 0) = 0</math>
|<math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Rosenbrock function]]
|
|<math>f(\boldsymbol{x}) = \sum_{i=1}^{n-1} \left[100\left(x_{i+1} - x_{i}^{2}\right)^{2} + \left(1 - x_{i}\right)^{2}\right]</math>
|<math>\text{Min} =
\begin{cases}
n=2 & \rightarrow \quad f(1,1) = 0, \\
n=3 & \rightarrow \quad f(1,1,1) = 0, \\
n>3 & \rightarrow \quad f(\underbrace{1, \dots, 1}_{n \text{ times}}) = 0 \\
\end{cases}
</math>
|<math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Beale function]]
|
|<math>f(x,y) = \left(1.5 - x + xy\right)^{2} + \left(2.25 - x + xy^{2}\right)^{2}</math>
<math>+ \left(2.625 - x + xy^{3}\right)^{2}</math>
|<math>f(3, 0.5) = 0</math>
|<math>-4.5\le x,y \le 4.5</math>
|-
| [[Goldstein–Price function]]
|
|<math>f(x,y) = \left[1 + \left(x + y + 1\right)^{2}\left(19 - 14x + 3x^{2} - 14y + 6xy + 3y^{2}\right)\right]</math>
<math>\left[30 + \left(2x - 3y\right)^{2}\left(18 - 32x + 12x^{2} + 48y - 36xy + 27y^{2}\right)\right]</math>
|<math>f(0, -1) = 3</math>
|<math>-2\le x,y \le 2</math>
|-
| [[Booth function]]
|
|<math>f(x,y) = \left(x + 2y - 7\right)^{2} + \left(2x + y - 5\right)^{2}</math>
|<math>f(1, 3) = 0</math>
|<math>-10\le x,y \le 10</math>
|-
| Bukin function N.6
|
|<math>f(x,y) = 100\sqrt{\left|y - 0.01x^{2}\right|} + 0.01\left|x + 10\right|</math>
|<math>f(-10, 1) = 0</math>
|<math>-15\le x \le -5</math>, <math>-3\le y \le 3</math>
|-
| [[Matyas function]]
|
|<math>f(x,y) = 0.26\left(x^{2} + y^{2}\right) - 0.48xy</math>
|<math>f(0, 0) = 0</math>
|<math>-10\le x,y \le 10</math>
|-
| Lévi function N.13
|
|<math>f(x,y) = \sin^{2} 3\pi x + \left(x-1\right)^{2}\left(1+\sin^{2} 3\pi y\right)</math>
<math>+\left(y-1\right)^{2}\left(1+\sin^{2} 2\pi y\right)</math>
|<math>f(1, 1) = 0</math>
|<math>-10\le x,y \le 10</math>
|-
| [[Griewank function]]
| [[File:Griewank 2D Contour.svg|200px|Griewank's function]]
| <math>f(\boldsymbol{x})= 1+ \frac {1}{4000} \sum _{i=1}^n x_i^2 -\prod _{i=1}^n P_i(x_i)</math>, where <math>P_i(x_i)=\cos \left( \frac {x_i}{\sqrt {i}} \right)</math>
|<math>f(0, \dots, 0) = 0</math>
|<math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Himmelblau's function]]
|[[File:Himmelblau contour plot.svg|200px|Himmelblau's function]]
| <math>f(x, y) = (x^2+y-11)^2 + (x+y^2-7)^2.\quad</math>
| <math>\text{Min} =
\begin{cases}
f\left(3.0, 2.0\right) & = 0.0 \\
f\left(-2.805118, 3.131312\right) & = 0.0 \\
f\left(-3.779310, -3.283186\right) & = 0.0 \\
f\left(3.584428, -1.848126\right) & = 0.0 \\
\end{cases}
</math>
| <math>-5\le x,y \le 5</math>
|-
| Three-hump camel function
|
|<math>f(x,y) = 2x^{2} - 1.05x^{4} + \frac{x^{6}}{6} + xy + y^{2}</math>
|<math>f(0, 0) = 0</math>
|<math>-5\le x,y \le 5</math>
|-
| [[Easom function]]
|
|<math>f(x,y) = -\cos\left(x\right)\cos\left(y\right)\exp\left(-\left(\left(x-\pi\right)^{2} + \left(y-\pi\right)^{2}\right)\right)</math>
|<math>f(\pi, \pi) = -1</math>
|<math>-100\le x,y \le 100</math>
|-
| Cross-in-tray function
|
|<math>f(x,y) = -0.0001\left[\left|\sin x \sin y \exp\left(\left|100 - \frac{\sqrt{x^{2}+y^{2}}}{\pi}\right|\right)\right| + 1\right]^{0.1}</math>
|<math>\text{Min} =
\begin{cases}
f\left(1.34941, -1.34941\right) & = -2.06261 \\
f\left(1.34941, 1.34941\right) & = -2.06261 \\
f\left(-1.34941, 1.34941\right) & = -2.06261 \\
f\left(-1.34941, -1.34941\right) & = -2.06261 \\
\end{cases}
</math>
|<math>-10\le x,y \le 10</math>
|-
| [[Eggholder function]]<ref name="Whitley Rana Dzubera Mathias 1996 pp. 245–276">{{cite journal | last1=Whitley | first1=Darrell | last2=Rana | first2=Soraya | last3=Dzubera | first3=John | last4=Mathias | first4=Keith E. | title=Evaluating evolutionary algorithms | journal=Artificial Intelligence | publisher=Elsevier BV | volume=85 | issue=1–2 | year=1996 | issn=0004-3702 | doi=10.1016/0004-3702(95)00124-7 | pages=264| doi-access=free }}</ref><ref name="vanaret2015hybridation">Vanaret C. (2015) [https://www.researchgate.net/publication/337947149_Hybridization_of_interval_methods_and_evolutionary_algorithms_for_solving_difficult_optimization_problems Hybridization of interval methods and evolutionary algorithms for solving difficult optimization problems.] PhD thesis. Ecole Nationale de l'Aviation Civile. Institut National Polytechnique de Toulouse, France.</ref>
|
|<math>f(x,y) = -\left(y + 47\right)\sin\sqrt{\left|\frac{x}{2} + \left(y + 47\right)\right|} - x\sin\sqrt{\left|x - \left(y + 47\right)\right|}</math>
|<math>f(512, 404.2319) = -959.6407</math>
|<math>-512\le x,y \le 512</math>
|-
| [[Hölder table function]]
|
|<math>f(x,y) = -\left|\sin x \cos y \exp\left(\left|1 - \frac{\sqrt{x^{2}+y^{2}}}{\pi}\right|\right)\right|</math>
|<math>\text{Min} =
\begin{cases}
f\left(8.05502, 9.66459\right) & = -19.2085 \\
f\left(-8.05502, 9.66459\right) & = -19.2085 \\
f\left(8.05502, -9.66459\right) & = -19.2085 \\
f\left(-8.05502, -9.66459\right) & = -19.2085 \\
\end{cases}
</math>
|<math>-10\le x,y \le 10</math>
|-
| [[McCormick function]]
|
|<math>f(x,y) = \sin\left(x + y\right) + \left(x - y\right)^{2} - 1.5x + 2.5y + 1</math>
|<math>f(-0.54719, -1.54719) = -1.9133</math>
|<math>-1.5\le x \le 4</math>, <math>-3\le y \le 4</math>
|-
| Schaffer function N. 2
|
|<math>f(x,y) = 0.5 + \frac{\sin^{2}\left(x^{2} - y^{2}\right) - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right)\right]^{2}}</math>
|<math>f(0, 0) = 0</math>
|<math>-100\le x,y \le 100</math>
|-
| Schaffer function N. 4
|
|<math>f(x,y) = 0.5 + \frac{\cos^{2}\left[\sin\left(\left|x^{2} - y^{2}\right|\right)\right] - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right)\right]^{2}}</math>
|<math>\text{Min} =
\begin{cases}
f\left(0,1.25313\right) & = 0.292579 \\
f\left(0,-1.25313\right) & = 0.292579 \\
f\left(1.25313,0\right) & = 0.292579 \\
f\left(-1.25313,0\right) & = 0.292579
\end{cases}
</math>
| <math>-100\le x,y \le 100</math>
|-
| [[Styblinski–Tang function]]
|
|<math>f(\boldsymbol{x}) = \frac{\sum_{i=1}^{n} x_{i}^{4} - 16x_{i}^{2} + 5x_{i}}{2}</math>
|<math>-39.16617n < f(\underbrace{-2.903534, \ldots, -2.903534}_{n \text{ times}}) < -39.16616n</math>
|<math>-5\le x_{i} \le 5</math>, <math>1\le i \le n</math>
|-
| [[Shekel function]]
|
|<math>f(\boldsymbol{x}) = \sum_{i=1}^{m} \left( c_{i} + \sum_{j=1}^{n} \left(x_{j} - a_{ij}\right)^{2} \right)^{-1}</math>
|
|
|}
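The closed-form entries above translate directly into short programs. As a minimal sketch (Python with NumPy is an assumption of this example, not part of the benchmark definitions), the following implements two of the tabulated functions and spot-checks their listed global minima; the tolerance of 1e-12 is an arbitrary choice for the check.
<syntaxhighlight lang="python">
import numpy as np

def ackley(x, y):
    """Ackley function from the table; global minimum f(0, 0) = 0."""
    return (-20.0 * np.exp(-0.2 * np.sqrt(0.5 * (x**2 + y**2)))
            - np.exp(0.5 * (np.cos(2 * np.pi * x) + np.cos(2 * np.pi * y)))
            + np.e + 20.0)

def himmelblau(x, y):
    """Himmelblau's function; four global minima, all with value 0."""
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# Spot-check the tabulated optima.
assert abs(ackley(0.0, 0.0)) < 1e-12
assert abs(himmelblau(3.0, 2.0)) < 1e-12
</syntaxhighlight>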
==Test functions for constrained optimization==
{| class="wikitable" style="text-align:center"
|-
! Name !! Plot !! Formula !! Global minimum !! Search ___domain
|-
| Rosenbrock function constrained to a disk<ref>{{Cite web|url=https://www.mathworks.com/help/optim/ug/example-nonlinear-constrained-minimization.html?requestedDomain=www.mathworks.com|title=Solve a Constrained Nonlinear Problem - MATLAB & Simulink|website=www.mathworks.com|access-date=2017-08-29}}</ref>
|| [[File:Rosenbrock circle constraint.svg|200px|Rosenbrock function constrained to a disk]]
|| <math>f(x,y) = (1-x)^2 + 100(y-x^2)^2</math>,
subjected to: <math> x^2 + y^2 \le 2 </math>
|| <math>f(1.0,1.0) = 0</math>
|| <math>-1.5\le x \le 1.5</math>, <math>-1.5\le y \le 1.5</math>
|-
| Mishra's Bird function - constrained<ref>{{Cite web|url=http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|title=Bird Problem (Constrained) {{!}} Phoenix Integration|access-date=2017-08-29|url-status=bot: unknown|archive-url=https://web.archive.org/web/20161229032528/http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|archive-date=2016-12-29}}</ref><ref>{{Cite journal|last=Mishra|first=Sudhanshu|date=2006|title=Some new test functions for global optimization and performance of repulsive particle swarm method|url=https://mpra.ub.uni-muenchen.de/2718/|journal=MPRA Paper}}</ref>
|| [[File:Mishra bird contour.svg|200px|Bird function (constrained)]]
|| <math>f(x,y) = \sin(y) e^{\left [(1-\cos x)^2\right]} + \cos(x) e^{\left [(1-\sin y)^2 \right]} + (x-y)^2</math>,
subjected to: <math> (x+5)^2 + (y+5)^2 < 25 </math>
|| <math>f(-3.1302468,-1.5821422) = -106.7645367</math>
|| <math>-10\le x \le 0</math>, <math>-6.5\le y \le 0</math>
|-
| Townsend function (modified)<ref>{{Cite web|url=http://www.chebfun.org/examples/opt/ConstrainedOptimization.html|title=Constrained optimization in Chebfun|last=Townsend|first=Alex|date=January 2014|website=chebfun.org|access-date=2017-08-29}}</ref>
|| [[File:Townsend contour.svg|200px|Heart constrained multimodal function]]
|| <math>f(x,y) = -[\cos((x-0.1)y)]^2 - x \sin(3x+y)</math>,
subjected to:<math>x^2+y^2 < \left[2\cos t - \frac 1 2 \cos 2t - \frac 1 4 \cos 3t - \frac 1 8 \cos 4t\right]^2 + [2\sin t]^2 </math>
where: {{Math|1=''t'' = Atan2(x,y)}}
|| <math>f(2.0052938,1.1944509) = -2.0239884</math>
|| <math>-2.25\le x \le 2.25</math>, <math>-2.5\le y \le 1.75</math>
|-
| '''Keane's bump function'''{{anchor|Keane's bump function}}<ref>{{cite journal |last1=Mishra |first1=Sudhanshu |title=Minimization of Keane’s Bump Function by the Repulsive Particle Swarm and the Differential Evolution Methods |date=5 May 2007 |url=https://econpapers.repec.org/paper/pramprapa/3098.htm |journal=MPRA Paper|publisher=University Library of Munich, Germany}}</ref>
|| [[File:Estimation of Distribution Algorithm animation.gif|200px|Keane's bump function]]
|| <math>f(\boldsymbol{x}) = -\left| \frac{\left[ \sum_{i=1}^m \cos^4 (x_i) - 2 \prod_{i=1}^m \cos^2 (x_i) \right]}{{\left( \sum_{i=1}^m ix^2_i \right)}^{0.5}} \right| </math>,
subjected to: <math> 0.75 - \prod_{i=1}^m x_i < 0 </math>, and
<math> \sum_{i=1}^m x_i - 7.5m < 0 </math>
|| <math>f(1.60025376, 0.468675907) = -0.364979746</math>
|| <math>0 < x_i < 10</math>
|}
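Constrained problems such as those above are typically handed to a solver together with their inequality constraints. The following is a hedged sketch that feeds the disk-constrained Rosenbrock problem from the table to SciPy's COBYLA method; the solver choice and starting point are assumptions of this example, and COBYLA expects inequality constraints in the form ''g''(''v'') ≥ 0.
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

def rosenbrock(v):
    """Unconstrained objective; the disk constraint is handled separately."""
    x, y = v
    return (1.0 - x)**2 + 100.0 * (y - x**2)**2

# x^2 + y^2 <= 2, rewritten as 2 - x^2 - y^2 >= 0 for COBYLA.
disk = {"type": "ineq", "fun": lambda v: 2.0 - v[0]**2 - v[1]**2}

result = minimize(rosenbrock, x0=np.array([0.0, 0.0]),
                  method="COBYLA", constraints=[disk])
print(result.x)  # should approach the tabulated optimum (1, 1)
</syntaxhighlight>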
==Test functions for multi-objective optimization==
In multi-objective optimization, minimizing several objective functions simultaneously means searching for the [[Pareto front]]: the set of trade-off solutions for which no objective can be improved without worsening at least one other objective, as illustrated in the sketch below.
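To make the dominance relation concrete, the following sketch (Python with NumPy; the two sample points are arbitrary choices of this example) evaluates the Binh and Korn objectives from the table below and tests whether either candidate solution Pareto-dominates the other.
<syntaxhighlight lang="python">
import numpy as np

def binh_korn(x, y):
    """The two objectives of the Binh and Korn problem (see table)."""
    f1 = 4 * x**2 + 4 * y**2
    f2 = (x - 5)**2 + (y - 5)**2
    return np.array([f1, f2])

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return bool(np.all(a <= b) and np.any(a < b))

p, q = binh_korn(1.0, 1.0), binh_korn(2.0, 2.0)
# Prints "False False": neither point dominates the other, since each
# improves one objective at the expense of the other.
print(dominates(p, q), dominates(q, p))
</syntaxhighlight>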
{| class="wikitable" style="text-align:center"
|-
! Name !! Plot !! Functions !! Constraints !! Search ___domain
|-
| [[Binh and Korn function]]:<ref name="Binh97"/>
|| [[File:Binh and Korn function.pdf|200px|Binh and Korn function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 4x^{2} + 4y^{2} \\
f_{2}\left(x,y\right) = \left(x - 5\right)^{2} + \left(y - 5\right)^{2} \\
\end{cases}
</math>
||<math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = \left(x - 5\right)^{2} + y^{2} \le 25 \\
g_{2}\left(x,y\right) = \left(x - 8\right)^{2} + \left(y + 3\right)^{2} \ge 7.7 \\
\end{cases}
</math>
|| <math>0\le x \le 5</math>, <math>0\le y \le 3</math>
|-
| [[Chankong and Haimes function]]:<ref>{{cite book |last1=Chankong |first1=Vira |last2=Haimes |first2=Yacov Y. |title=Multiobjective decision making. Theory and methodology. |isbn=0-444-00710-5|year=1983 |publisher=North Holland }}</ref>
|| [[File:Chakong and Haimes function.pdf|200px|Chakong and Haimes function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 2 + \left(x - 2\right)^{2} + \left(y - 1\right)^{2} \\
f_{2}\left(x,y\right) = 9x - \left(y - 1\right)^{2} \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = x^{2} + y^{2} \le 225 \\
g_{2}\left(x,y\right) = x - 3y + 10 \le 0 \\
\end{cases}
</math>
|| <math>-20\le x,y \le 20</math>
|-
| [[Fonseca–Fleming function]]:<ref name="FonzecaFleming:1995">{{cite journal |first1=C. M. |last1=Fonseca |first2=P. J. |last2=Fleming |title=An Overview of Evolutionary Algorithms in Multiobjective Optimization |journal=[[Evolutionary Computation (journal)|Evol Comput]] |volume=3 |issue=1 |pages=1–16 |year=1995 |doi=10.1162/evco.1995.3.1.1 |citeseerx=10.1.1.50.7779 |s2cid=8530790 }}</ref>
|| [[File:Fonseca and Fleming function.pdf|200px|Fonseca and Fleming function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = 1 - \exp\left[-\sum_{i=1}^{n} \left(x_{i} - \frac{1}{\sqrt{n}}\right)^{2}\right] \\
f_{2}\left(\boldsymbol{x}\right) = 1 - \exp\left[-\sum_{i=1}^{n} \left(x_{i} + \frac{1}{\sqrt{n}}\right)^{2}\right] \\
\end{cases}
</math>
||
|| <math>-4\le x_{i} \le 4</math>, <math>1\le i \le n</math>
|-
| Test function 4:<ref name="Binh99"/>
|| [[File:Test function 4 - Binh.pdf|200px|Test function 4.<ref name="Binh99" />]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = x^{2} - y \\
f_{2}\left(x,y\right) = -0.5x - y - 1 \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = 6.5 - \frac{x}{6} - y \ge 0 \\
g_{2}\left(x,y\right) = 7.5 - 0.5x - y \ge 0 \\
g_{3}\left(x,y\right) = 30 - 5x - y \ge 0 \\
\end{cases}
</math>
|| <math>-7\le x,y \le 4</math>
|-
| [[Kursawe function]]:<ref name="Kursawe:1991">F. Kursawe, “[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.47.8050&rep=rep1&type=pdf A variant of evolution strategies for vector optimization],” in [[Parallel Problem Solving from Nature|PPSN]] I, Vol 496 Lect Notes in Comput Sc. Springer-Verlag, 1991, pp. 193–197.</ref>
|| [[File:Kursawe function.pdf|200px|Kursawe function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = \sum_{i=1}^{2} \left[-10 \exp\left(-0.2\sqrt{x_{i}^{2} + x_{i+1}^{2}}\right)\right] \\
f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{3} \left[\left|x_{i}\right|^{0.8} + 5\sin\left(x_{i}^{3}\right)\right] \\
\end{cases}
</math>
||
||<math>-5\le x_{i} \le 5</math>, <math>1\le i \le 3</math>.
|-
| Schaffer function N. 1:<ref name="Schaffer:1984">{{cite book |last=Schaffer |first=J. David |date=1984 |chapter=Multiple Objective Optimization with Vector Evaluated Genetic Algorithms |title=Proceedings of the First International Conference on Genetic Algorithms |editor1=G.J.E Grefensette |editor2=J.J. Lawrence Erlbraum |oclc=20004572 }}</ref>
|| [[File:Schaffer function 1.pdf|200px|Schaffer function N.1]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x\right) = x^{2} \\
f_{2}\left(x\right) = \left(x - 2\right)^{2} \\
\end{cases}
</math>
||
|| <math>-A\le x \le A</math>. Values of <math>A</math> from 10 to 10<sup>5</sup> have been used with success; higher values of <math>A</math> increase the difficulty of the problem.
|-
| Schaffer function N. 2:
||
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x\right) = \begin{cases}
-x, & \text{if } x \le 1 \\
x-2, & \text{if } 1 < x \le 3 \\
4-x, & \text{if } 3 < x \le 4 \\
x-4, & \text{if } x > 4 \\
\end{cases} \\
f_{2}\left(x\right) = \left(x - 5\right)^{2} \\
\end{cases}
</math>
||
|| <math>-5\le x \le 10</math>
|-
| Poloni's two objective function:
||
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = \left[1 + \left(A_{1} - B_{1}\left(x,y\right)\right)^{2} + \left(A_{2} - B_{2}\left(x,y\right)\right)^{2}\right] \\
f_{2}\left(x,y\right) = \left(x + 3\right)^{2} + \left(y + 1\right)^{2} \\
\end{cases}
</math>
<math>\text{where} =
\begin{cases}
A_{1} = 0.5\sin 1 - 2\cos 1 + \sin 2 - 1.5\cos 2 \\
A_{2} = 1.5\sin 1 - \cos 1 + 2\sin 2 - 0.5\cos 2 \\
B_{1}\left(x,y\right) = 0.5\sin x - 2\cos x + \sin y - 1.5\cos y \\
B_{2}\left(x,y\right) = 1.5\sin x - \cos x + 2\sin y - 0.5\cos y \\
\end{cases}
</math>
||
||<math>-\pi\le x,y \le \pi</math>
|-
| Zitzler–Deb–Thiele's function N. 1:<ref name="Debetal2002testpr">{{cite book |last1=Deb |first1=Kalyan |last2=Thiele |first2=L. |last3=Laumanns |first3=Marco |last4=Zitzler |first4=Eckart |title=Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |chapter=Scalable multi-objective optimization test problems |date=2002 |volume=1 |pages=825–830 |doi=10.1109/CEC.2002.1007032|isbn=0-7803-7282-4 |s2cid=61001583 }}</ref>
|| [[File:Zitzler-Deb-Thiele's function 1.pdf|200px|Zitzler-Deb-Thiele's function N.1]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\
h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}} \\
\end{cases}
</math>
||
||<math>0\le x_{i} \le 1</math>, <math>1\le i \le 30</math>.
|-
| Zitzler–Deb–Thiele's function N. 2:<ref name="Debetal2002testpr" />
|| [[File:Zitzler-Deb-Thiele's function 2.pdf|200px|Zitzler-Deb-Thiele's function N.2]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\
h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}\right)^{2} \\
\end{cases}
</math>
||
|| <math>0\le x_{i} \le 1</math>, <math>1\le i \le 30</math>.
|-
| Zitzler–Deb–Thiele's function N. 3:<ref name="Debetal2002testpr" />
|| [[File:Zitzler-Deb-Thiele's function 3.pdf|200px|Zitzler-Deb-Thiele's function N.3]]
||<math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\
h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}} - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}\right)\sin\left(10\pi f_{1}\left(\boldsymbol{x}\right)\right) \\
\end{cases}
</math>
||
||<math>0\le x_{i} \le 1</math>, <math>1\le i \le 30</math>.
|-
| Zitzler–Deb–Thiele's function N. 4:<ref name="Debetal2002testpr" />
|| [[File:Zitzler-Deb-Thiele's function 4.pdf|200px|Zitzler-Deb-Thiele's function N.4]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) = 91 + \sum_{i=2}^{10} \left(x_{i}^{2} - 10\cos\left(4\pi x_{i}\right)\right) \\
h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}} \\
\end{cases}
</math>
||
||<math>0\le x_{1} \le 1</math>, <math>-5\le x_{i} \le 5</math>, <math>2\le i \le 10</math>
|-
| Zitzler–Deb–Thiele's function N. 6:<ref name="Debetal2002testpr" />
|| [[File:Zitzler-Deb-Thiele's function 6.pdf|200px|Zitzler-Deb-Thiele's function N.6]]
||<math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = 1 - \exp\left(-4x_{1}\right)\sin^{6}\left(6\pi x_{1}\right) \\
f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) = 1 + 9\left[\frac{\sum_{i=2}^{10} x_{i}}{9}\right]^{0.25} \\
h\left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}\right)^{2} \\
\end{cases}
</math>
||
||<math>0\le x_{i} \le 1</math>, <math>1\le i \le 10</math>.
|-
| Osyczka and Kundu function:<ref name="OsyczkaKundu1995">{{cite journal |last1=Osyczka |first1=A. |last2=Kundu |first2=S. |title=A new method to solve generalized multicriteria optimization problems using the simple genetic algorithm |journal=Structural Optimization |date=1 October 1995 |volume=10 |issue=2 |pages=94–99 |doi=10.1007/BF01743536 |s2cid=123433499 |issn=1615-1488}}</ref>
|| [[File:Osyczka and Kundu function.pdf|200px|Osyczka and Kundu function]]
||<math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = -25\left(x_{1}-2\right)^{2} - \left(x_{2}-2\right)^{2} - \left(x_{3}-1\right)^{2}
- \left(x_{4}-4\right)^{2} - \left(x_{5}-1\right)^{2} \\
f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{6} x_{i}^{2} \\
\end{cases}
</math>
||<math>\text{s.t.} =
\begin{cases}
g_{1}\left(\boldsymbol{x}\right) = x_{1} + x_{2} - 2 \ge 0 \\
g_{2}\left(\boldsymbol{x}\right) = 6 - x_{1} - x_{2} \ge 0 \\
g_{3}\left(\boldsymbol{x}\right) = 2 - x_{2} + x_{1} \ge 0 \\
g_{4}\left(\boldsymbol{x}\right) = 2 - x_{1} + 3x_{2} \ge 0 \\
g_{5}\left(\boldsymbol{x}\right) = 4 - \left(x_{3} - 3\right)^{2} - x_{4} \ge 0 \\
g_{6}\left(\boldsymbol{x}\right) = \left(x_{5} - 3\right)^{2} + x_{6} - 4 \ge 0 \\
\end{cases}
</math>
|| <math>0\le x_{1},x_{2},x_{6} \le 10</math>, <math>1\le x_{3},x_{5} \le 5</math>, <math>0\le x_{4} \le 6</math>.
|-
| CTP1 function (2 variables):<ref name="Deb:2002"/><ref name="Jimenezetal2002">{{cite book |last1=Jimenez |first1=F. |last2=Gomez-Skarmeta |first2=A. F. |last3=Sanchez |first3=G. |last4=Deb |first4=K. |title=Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |chapter=An evolutionary algorithm for constrained multi-objective optimization |date=May 2002 |volume=2 |pages=1133–1138 |doi=10.1109/CEC.2002.1004402|isbn=0-7803-7282-4 |s2cid=56563996 }}</ref>
|| [[File:CTP1 function (2 variables).pdf|200px|CTP1 function (2 variables).<ref name="Deb:2002" />]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = x \\
f_{2}\left(x,y\right) = \left(1 + y\right)\exp\left(-\frac{x}{1+y}\right) \\
\end{cases}
</math>
||<math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.858\exp\left(-0.541 f_{1}\left(x,y\right)\right)} \ge 1 \\
g_{2}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.728\exp\left(-0.295 f_{1}\left(x,y\right)\right)} \ge 1 \\
\end{cases}
</math>
|| <math>0\le x,y \le 1</math>
|-
| Constr-Ex problem:<ref name="Deb:2002"/>
|| [[File:Constr-Ex problem.pdf|200px|Constr-Ex problem.<ref name="Deb:2002" />]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = x \\
f_{2}\left(x,y\right) = \frac{1 + y}{x} \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = y + 9x \ge 6 \\
g_{2}\left(x,y\right) = -y + 9x \ge 1 \\
\end{cases}
</math>
|| <math>0.1\le x \le 1</math>, <math>0\le y \le 5</math>
|-
| Viennet function:
|| [[File:Viennet function.pdf|200px|Viennet function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 0.5\left(x^{2} + y^{2}\right) + \sin\left(x^{2} + y^{2} \right) \\
f_{2}\left(x,y\right) = \frac{\left(3x - 2y + 4\right)^{2}}{8} + \frac{\left(x - y + 1\right)^{2}}{27} + 15 \\
f_{3}\left(x,y\right) = \frac{1}{x^{2} + y^{2} + 1} - 1.1 \exp \left(- \left(x^{2} + y^{2} \right) \right) \\
\end{cases}
</math>
||
||<math>-3\le x,y \le 3</math>.
|}
==References==
<references/>
== External links ==
* [https://github.com/nathanrooy/landscapes landscapes]
{{DEFAULTSORT:Test functions for optimization}}
[[Category:Constraint programming]]
[[Category:Convex optimization]]
[[Category:Mathematical optimization]]
[[Category:Test items]]