{{Short description|Functions used to evaluate characteristics of optimization algorithms}}
In applied mathematics, '''test functions''', known as '''artificial landscapes''', are useful to evaluate characteristics of optimization algorithms, such as [[Rate of convergence|convergence rate]], precision, robustness and general performance.
 
Here, some test functions are presented with the aim of giving an idea about the different situations that optimization algorithms have to face when coping with these kinds of problems. In the first part, some objective functions for single-objective optimization cases are presented. In the second part, test functions with their respective [[Pareto front|Pareto fronts]] for [[multi-objective optimization]] problems (MOP) are given.
 
The artificial landscapes presented herein for single-objective optimization problems are taken from Bäck,<ref>{{cite book|last=Bäck|first=Thomas|title=Evolutionary algorithms in theory and practice : evolution strategies, evolutionary programming, genetic algorithms|year=1995|publisher=Oxford University Press|___location=Oxford|isbn=978-0-19-509971-3|page=328}}</ref> Haupt et al.<ref>{{cite book|last1=Haupt|first1=Randy L.|last2=Haupt|first2=Sue Ellen|title=Practical genetic algorithms with CD-ROM|year=2004|publisher=J. Wiley|___location=New York|isbn=978-0-471-45565-3|edition=2nd}}</ref> and from Rody Oldenhuis software.<ref>{{cite web|last=Oldenhuis|first=Rody|title=Many test functions for global optimizers|url=http://www.mathworks.com/matlabcentral/fileexchange/23147-many-testfunctions-for-global-optimizers|publisher=Mathworks|access-date=1 November 2012}}</ref> Given the number of problems (55 in total), just a few are presented here. The complete list of test functions is found on the Mathworks website.<ref>{{cite web|last=Ortiz|first=Gilberto A.|title=Evolution Strategies (ES)|url=http://www.mathworks.com/matlabcentral/fileexchange/35801-evolution-strategies-es|publisher=Mathworks|access-date=1 November 2012}}</ref>
 
The test functions used to evaluate the algorithms for MOP were taken from Deb,<ref name="Deb:2002">Deb, Kalyanmoy (2002) Multiobjective optimization using evolutionary algorithms (Repr. ed.). Chichester [u.a.]: Wiley. {{isbn|0-471-87339-X}}.</ref> Binh et al.<ref name="Binh97">Binh T. and Korn U. (1997) [https://web.archive.org/web/20190801183649/https://pdfs.semanticscholar.org/cf68/41a6848ca2023342519b0e0e536b88bdea1d.pdf MOBES: A Multiobjective Evolution Strategy for Constrained Optimization Problems]. In: Proceedings of the Third International Conference on Genetic Algorithms. Czech Republic. pp. 176–182</ref> and Binh.<ref name="Binh99">Binh T. (1999) [https://www.researchgate.net/profile/Thanh_Binh_To/publication/2446107_A_Multiobjective_Evolutionary_Algorithm_The_Study_Cases/links/53eb422f0cf28f342f45251d.pdf A multiobjective evolutionary algorithm. The study cases.] Technical report. Institute for Automation and Communication. Barleben, Germany</ref> The software developed by Deb can be downloaded,<ref name="Deb_nsga">Deb K. (2011) Software for multi-objective NSGA-II code in C. Available at URL: https://www.iitk.ac.in/kangal/codes.shtml. Revision 1.1.6</ref> which implements the NSGA-II procedure with GAs, or the program posted on the Internet,<ref>{{cite web|last=Ortiz|first=Gilberto A.|title=Multi-objective optimization using ES as Evolutionary Algorithm.|url=http://www.mathworks.com/matlabcentral/fileexchange/35824-multi-objective-optimization-using-evolution-strategies-es-as-evolutionary-algorithm-ea|publisher=Mathworks|access-date=1 November 2012}}</ref> which implements the NSGA-II procedure with ES.
 
Only the general form of the equation, a plot of the objective function, the boundaries of the object variables and the coordinates of the global minima are given herein.
 
==Test functions for single-objective optimization problems==
 
{| class="wikitable" style="text-align:center"
{| class="sortable wikitable"
|+Test functions for single-objective optimization problems
! Name
! Plot
! Formula
! Global minimum
! Search ___domain
|-
| [[Rastrigin function]]
| [[File:Rastrigin contour plot.svg|200px|Rastrigin function for n=2]]
|<math>f(\boldsymbol{x}) = A n + \sum_{i=1}^n \left[x_i^2 - A\cos(2 \pi x_i)\right]</math>
<math>\text{where: } A=10</math>
|<math>f(0, \dots, 0) = 0</math>
|<math>-5.12\le x_{i} \le 5.12 </math>
|-
| [[Ackley function]]
| [[File:Ackley contour function.svg|200px|Ackley's function for n=2]]
|<math>f(x,y) = -20\exp\left[-0.2\sqrt{0.5\left(x^{2}+y^{2}\right)}\right]</math>
<math>-\exp\left[0.5\left(\cos 2\pi x + \cos 2\pi y \right)\right] + e + 20</math>
|<math>f(0,0) = 0</math>
|<math>-5\le x,y \le 5</math>
|-
| Sphere function
| [[File:Sphere contour.svg|200px|Sphere function for n=2]]
| <math>f(\boldsymbol{x}) = \sum_{i=1}^{n} x_{i}^{2}</math>
| <math>f(x_{1}, \dots, x_{n}) = f(0, \dots, 0) = 0</math>
| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Rosenbrock function]]
| [[File:Rosenbrock contour.svg|200px|Rosenbrock's function for n=2]]
| <math>f(\boldsymbol{x}) = \sum_{i=1}^{n-1} \left[ 100 \left(x_{i+1} - x_{i}^{2}\right)^{2} + \left(1 - x_{i}\right)^{2}\right]</math>
| <math>f(1, \dots, 1) = 0</math>
| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Beale function]]
| [[File:Beale contour.svg|200px|Beale's function]]
 
| <math>f(x,y) = \left( 1.5 - x + xy \right)^{2} + \left( 2.25 - x + xy^{2}\right)^{2}</math>
<math>+ \left(2.625 - x+ xy^{3}\right)^{2}</math>
| <math>f(3, 0.5) = 0</math>
| <math>-4.5 \le x,y \le 4.5</math>
|-
| [[Goldstein–Price function]]
 
| [[File:Goldstein-Price contour.svg|200px|Goldstein–Price function]]
| <math>f(x,y) = \left[1+\left(x+y+1\right)^{2}\left(19-14x+3x^{2}-14y+6xy+3y^{2}\right)\right]</math>
<math>\left[30+\left(2x-3y\right)^{2}\left(18-32x+12x^{2}+48y-36xy+27y^{2}\right)\right]</math>
| <math>f(0, -1) = 3</math>
| <math>-2 \le x,y \le 2</math>
|-
| [[Booth function]]
| [[File:Booth contour.svg|200px|Booth's function]]
|<math>f(x,y) = \left( x + 2y -7\right)^{2} + \left(2x +y - 5\right)^{2}</math>
 
|<math>f(1,3) = 0</math>
|<math>-10 \le x,y \le 10</math>
 
|-
| Bukin function N.6
 
| [[File:Bukin 6 contour.svg|200px|Bukin function N.6]]
| <math>f(x,y) = 100\sqrt{\left|y - 0.01x^{2}\right|} + 0.01 \left|x+10 \right|.\quad</math>
 
| <math>f(-10,1) = 0</math>
| <math>-15\le x \le -5</math>, <math>-3\le y \le 3</math>
|-
| [[Matyas function]]
| [[File:Matyas contour.svg|200px|Matyas function]]
| <math>f(x,y) = 0.26 \left( x^{2} + y^{2}\right) - 0.48 xy</math>
| <math>f(0,0) = 0</math>
| <math>-10\le x,y \le 10</math>
|-
| Lévi function N.13
|[[File:Levi13 contour.svg|200px|Lévi function N.13]]
| <math>f(x,y) = \sin^{2} 3\pi x + \left(x-1\right)^{2}\left(1+\sin^{2} 3\pi y\right)</math>
<math>+\left(y-1\right)^{2}\left(1+\sin^{2} 2\pi y\right)</math>
| <math>f(1,1) = 0</math>
| <math>-10\le x,y \le 10</math>
|-
| [[Griewank function]]
| [[File:Griewank 2D Contour.svg|200px|Griewank's function]]
| <math>f(\boldsymbol{x})= 1+ \frac {1}{4000} \sum _{i=1}^n x_i^2 -\prod _{i=1}^n P_i(x_i)</math>, where <math>P_i(x_i)=\cos \left( \frac {x_i}{\sqrt {i}} \right)</math>
|<math>f(0, \dots, 0) = 0</math>
|<math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Himmelblau's function]]
|[[File:Himmelblau contour plot.svg|200px|Himmelblau's function]]
| <math>f(x, y) = (x^2+y-11)^2 + (x+y^2-7)^2.\quad</math>
| <math>\text{Min} =
\begin{cases}
f\left(3.0, 2.0\right) & = 0.0 \\
f\left(-2.805118, 3.131312\right) & = 0.0 \\
f\left(-3.779310, -3.283186\right) & = 0.0 \\
f\left(3.584428, -1.848126\right) & = 0.0 \\
\end{cases}
</math>
| <math>-5\le x,y \le 5</math>
 
|-
| Three-hump camel function
 
| [[File:Three-hump-camel contour.svg|200px|Three Hump Camel function]]
| <math>f(x,y) = 2x^{2} - 1.05x^{4} + \frac{x^{6}}{6} + xy + y^{2}</math>
 
| <math>f(0,0) = 0</math>
| <math>-5\le x,y \le 5</math>
 
|-
| [[Easom function]]
 
| [[File:Easom contour.svg|200px|Easom function]]
| <math>f(x,y) = -\cos \left(x\right)\cos \left(y\right) \exp\left(-\left(\left(x-\pi\right)^{2} + \left(y-\pi\right)^{2}\right)\right)</math>
 
| <math>f(\pi , \pi) = -1</math>
| <math>-100\le x,y \le 100</math>
 
|-
| Cross-in-tray function
 
| [[File:Cross-in-tray contour.svg|200px|Cross-in-tray function]]
| <math>f(x,y) = -0.0001 \left[ \left| \sin x \sin y \exp \left(\left|100 - \frac{\sqrt{x^{2} + y^{2}}}{\pi} \right|\right)\right| + 1 \right]^{0.1}</math>
 
| <math>\text{Min} =
\begin{cases}
f\left(1.34941, -1.34941\right) & = -2.06261 \\
f\left(1.34941, 1.34941\right) & = -2.06261 \\
f\left(-1.34941, 1.34941\right) & = -2.06261 \\
f\left(-1.34941, -1.34941\right) & = -2.06261
\end{cases}
</math>
|<math>-10\le x,y \le 10</math>
 
|-
| [[Eggholder function]]<ref name="Whitley Rana Dzubera Mathias 1996 pp. 245–276">{{cite journal | last1=Whitley | first1=Darrell | last2=Rana | first2=Soraya | last3=Dzubera | first3=John | last4=Mathias | first4=Keith E. | title=Evaluating evolutionary algorithms | journal=Artificial Intelligence | publisher=Elsevier BV | volume=85 | issue=1–2 | year=1996 | issn=0004-3702 | doi=10.1016/0004-3702(95)00124-7 | pages=264| doi-access=free }}</ref><ref name="vanaret2015hybridation">Vanaret C. (2015) [https://www.researchgate.net/publication/337947149_Hybridization_of_interval_methods_and_evolutionary_algorithms_for_solving_difficult_optimization_problems Hybridization of interval methods and evolutionary algorithms for solving difficult optimization problems.] PhD thesis. Ecole Nationale de l'Aviation Civile. Institut National Polytechnique de Toulouse, France.</ref>
 
| [[File:Eggholder contour.svg|200px|Eggholder function]]
| <math>f(x,y) = - \left(y+47\right) \sin \sqrt{\left|\frac{x}{2}+\left(y+47\right)\right|} - x \sin \sqrt{\left|x - \left(y + 47 \right)\right|}</math>
 
| <math>f(512, 404.2319) = -959.6407</math>
| <math>-512\le x,y \le 512</math>
 
|-
| [[Hölder table function]]
 
| [[File:Hoelder table contour.svg|200px|Holder table function]]
| <math>f(x,y) = - \left|\sin x \cos y \exp \left(\left|1 - \frac{\sqrt{x^{2} + y^{2}}}{\pi} \right|\right)\right|</math>
 
| <math>\text{Min} =
\begin{cases}
f\left(8.05502, 9.66459\right) & = -19.2085 \\
f\left(-8.05502, 9.66459\right) & = -19.2085 \\
f\left(-8.05502,-9.66459\right) & = -19.2085 \\
f\left(8.05502,-9.66459\right) & = -19.2085
\end{cases}
</math>
| <math>-10\le x,y \le 10</math>
|-
| [[McCormick function]]
| [[File:McCormick contour.svg|200px|McCormick function]]
| <math>f(x,y) = \sin \left(x+y\right) + \left(x-y\right)^{2} - 1.5x + 2.5y + 1</math>
| <math>f(-0.54719,-1.54719) = -1.9133</math>
| <math>-1.5\le x \le 4</math>, <math>-3\le y \le 4</math>
|-
| Schaffer function N. 2
| [[File:Schaffer2 contour.svg|200px|Schaffer function N.2]]
| <math>f(x,y) = 0.5 + \frac{\sin^{2}\left(x^{2} - y^{2}\right) - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right) \right]^{2}}</math>
| <math>f(0, 0) = 0</math>
| <math>-100\le x,y \le 100</math>
|-
| Schaffer function N. 4
| [[File:Schaffer4 contour.svg|200px|Schaffer function N.4]]
| <math>f(x,y) = 0.5 + \frac{\cos^{2}\left[\sin \left( \left|x^{2} - y^{2}\right|\right)\right] - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right) \right]^{2}}</math>
| <math>\text{Min} =
\begin{cases}
f\left(0,1.25313\right) & = 0.292579 \\
f\left(0,-1.25313\right) & = 0.292579 \\
f\left(1.25313,0\right) & = 0.292579 \\
f\left(-1.25313,0\right) & = 0.292579
\end{cases}
</math>
| <math>-100\le x,y \le 100</math>
|-
| [[Styblinski–Tang function]]
| [[File:Styblinski-Tang contour.svg|200px|Styblinski-Tang function]]
| <math>f(\boldsymbol{x}) = \frac{\sum_{i=1}^{n} x_{i}^{4} - 16x_{i}^{2} + 5x_{i}}{2}</math>
| <math>-39.16617n < f(\underbrace{-2.903534, \ldots, -2.903534}_{n \text{ times}} ) < -39.16616n</math>
| <math>-5\le x_{i} \le 5</math>, <math>1\le i \le n</math>
|-
| [[Shekel function]]
| [[Image:Shekel_2D.jpg|200px|A Shekel function in 2 dimensions and with 10 maxima]]
| <math>
f(\boldsymbol{x}) = \sum_{i = 1}^{m} \; \left( c_{i} + \sum\limits_{j = 1}^{n} (x_{j} - a_{ji})^2 \right)^{-1}
</math>
|
| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|}
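
The formulas in the table above translate almost directly into code. The following Python sketch is only an illustrative transcription of three of the rows (Rastrigin, Ackley and Rosenbrock); it is not taken from the cited references, and the function names are chosen here purely for readability.

<syntaxhighlight lang="python">
import math

def rastrigin(x, A=10.0):
    """Rastrigin function; global minimum f(0, ..., 0) = 0."""
    return A * len(x) + sum(xi ** 2 - A * math.cos(2 * math.pi * xi) for xi in x)

def ackley(x, y):
    """Ackley function; global minimum f(0, 0) = 0."""
    return (-20.0 * math.exp(-0.2 * math.sqrt(0.5 * (x ** 2 + y ** 2)))
            - math.exp(0.5 * (math.cos(2 * math.pi * x) + math.cos(2 * math.pi * y)))
            + math.e + 20.0)

def rosenbrock(x):
    """Rosenbrock function; global minimum f(1, ..., 1) = 0."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

print(rastrigin([0.0, 0.0]))        # 0.0
print(ackley(0.0, 0.0))             # approximately 0 (floating-point rounding)
print(rosenbrock([1.0, 1.0, 1.0]))  # 0.0
</syntaxhighlight>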
 
==Test functions for constrained optimization==
 
{| class="wikitable" style="text-align:center"
|-
! Name !! Plot !! Formula !! Global minimum !! Search ___domain
|-
| Rosenbrock function constrained to a disk<ref>{{Cite web|url=https://www.mathworks.com/help/optim/ug/example-nonlinear-constrained-minimization.html?requestedDomain=www.mathworks.com|title=Solve a Constrained Nonlinear Problem - MATLAB & Simulink|website=www.mathworks.com|access-date=2017-08-29}}</ref>
|| [[File:Rosenbrock circle constraint.svg|200px|Rosenbrock function constrained to a disk]]
|| <math>f(x,y) = (1-x)^2 + 100(y-x^2)^2</math>,
 
subjected to: <math>x^{2} + y^{2} \le 2</math>
|| <math>f(1.0,1.0) = 0</math>
|| <math>-1.5\le x \le 1.5</math>, <math>-1.5\le y \le 1.5</math>
|-
| Mishra's Bird function - constrained<ref>{{Cite web|url=http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|title=Bird Problem (Constrained) {{!}} Phoenix Integration|access-date=2017-08-29|url-status=bot: unknown|archive-url=https://web.archive.org/web/20161229032528/http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|archive-date=2016-12-29}}</ref><ref>{{Cite journal|last=Mishra|first=Sudhanshu|date=2006|title=Some new test functions for global optimization and performance of repulsive particle swarm method|url=https://mpra.ub.uni-muenchen.de/2718/|journal=MPRA Paper}}</ref>
|| [[File:Mishra bird contour.svg|200px|Bird function (constrained)]]
|| <math>f(x,y) = \sin(y) e^{\left [(1-\cos x)^2\right]} + \cos(x) e^{\left [(1-\sin y)^2 \right]} + (x-y)^2</math>,
subjected to: <math> (x+5)^2 + (y+5)^2 < 25 </math>
|| <math>f(-3.1302468,-1.5821422) = -106.7645367</math>
|| <math>-10\le x \le 0</math>, <math>-6.5\le y \le 0</math>
|-
| Townsend function (modified)<ref>{{Cite web|url=http://www.chebfun.org/examples/opt/ConstrainedOptimization.html|title=Constrained optimization in Chebfun|last=Townsend|first=Alex|date=January 2014|website=chebfun.org|access-date=2017-08-29}}</ref>
|| [[File:Townsend contour.svg|200px|Heart constrained multimodal function]]
|| <math>f(x,y) = -[\cos((x-0.1)y)]^2 - x \sin(3x+y)</math>,
subjected to:<math>x^2+y^2 < \left[2\cos t - \frac 1 2 \cos 2t - \frac 1 4 \cos 3t - \frac 1 8 \cos 4t\right]^2 + [2\sin t]^2 </math>
where: {{Math|1=''t'' = Atan2(x,y)}}
|| <math>f(2.0052938,1.1944509) = -2.0239884</math>
|| <math>-2.25\le x \le 2.25</math>, <math>-2.5\le y \le 1.75</math>
 
|-
| '''Keane's bump function'''{{anchor|Keane's bump function}}<ref>{{cite journal |last1=Mishra |first1=Sudhanshu |title=Minimization of Keane’s Bump Function by the Repulsive Particle Swarm and the Differential Evolution Methods |date=5 May 2007 |url=https://econpapers.repec.org/paper/pramprapa/3098.htm |journal=MPRA Paper|publisher=University Library of Munich, Germany}}</ref>
|| [[File:Estimation of Distribution Algorithm animation.gif|200px|Keane's bump function]]
|| <math>f(\boldsymbol{x}) = -\left| \frac{\left[ \sum_{i=1}^m \cos^4 (x_i) - 2 \prod_{i=1}^m \cos^2 (x_i) \right]}{{\left( \sum_{i=1}^m ix^2_i \right)}^{0.5}} \right| </math>,
subjected to: <math> 0.75 - \prod_{i=1}^m x_i < 0 </math>, and
<math> \sum_{i=1}^m x_i - 7.5m < 0 </math>
|| <math>f(1.60025376, 0.468675907) = -0.364979746</math>
|| <math>0 < x_i < 10</math>
|}
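
A constrained test problem pairs the objective with a feasibility test. The sketch below is a plain Python transcription of the "Rosenbrock function constrained to a disk" row above, not the code from the cited MATLAB example; assigning infinity to infeasible points is just one of several common ways of handling the constraint.

<syntaxhighlight lang="python">
import math

def rosenbrock_disk(x, y):
    """Rosenbrock function restricted to the disk x^2 + y^2 <= 2.

    Points outside the disk are given the value +inf, so any method that
    only compares objective values discards them automatically (a simple
    "death penalty"; penalty terms or repair operators are alternatives).
    """
    if x ** 2 + y ** 2 > 2.0:
        return math.inf
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

print(rosenbrock_disk(1.0, 1.0))  # 0.0, the constrained global minimum
print(rosenbrock_disk(1.4, 1.4))  # inf, outside the feasible disk
</syntaxhighlight>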
 
==Test functions for multi-objective optimization==
 
{{explain|reason=What does it mean to minimize two objective functions?|date=September 2016}}
 
{| class="wikitable" style="text-align:center"
|-
! Name !! Plot !! Functions !! Constraints !! Search ___domain
|-
 
| [[Binh and Korn function]]:<ref name="Binh97"/>
|| [[File:Binh and Korn function.pdf|200px|Binh and Korn function]]
 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) & = 4x^{2} + 4y^{2} \\
f_{2}\left(x,y\right) & = \left(x - 5\right)^{2} + \left(y - 5\right)^{2} \\
\end{cases}
</math>
||<math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) & = \left(x - 5\right)^{2} + y^{2} \leq 25 \\
g_{2}\left(x,y\right) & = \left(x - 8\right)^{2} + \left(y + 3\right)^{2} \geq 7.7 \\
\end{cases}
</math>
|| <math>0\le x \le 5</math>, <math>0\le y \le 3</math>
 
|-
| [[Chankong and Haimes function]]:<ref>{{cite book |last1=Chankong |first1=Vira |last2=Haimes |first2=Yacov Y. |title=Multiobjective decision making. Theory and methodology. |isbn=0-444-00710-5|year=1983 |publisher=North Holland }}</ref>
 
|| [[File:Chakong and Haimes function.pdf|200px|Chakong and Haimes function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) & = 2 + \left(x-2\right)^{2} + \left(y-1\right)^{2} \\
f_{2}\left(x,y\right) & = 9x - \left(y - 1\right)^{2} \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) & = x^{2} + y^{2} \leq 225 \\
g_{2}\left(x,y\right) & = x - 3y + 10 \leq 0 \\
\end{cases}
</math>
|| <math>-20\le x,y \le 20</math>
 
|-
| [[Fonseca–Fleming function]]:<ref name="FonzecaFleming:1995">{{cite journal |first1=C. M. |last1=Fonseca |first2=P. J. |last2=Fleming |title=An Overview of Evolutionary Algorithms in Multiobjective Optimization |journal=[[Evolutionary Computation (journal)|Evol Comput]] |volume=3 |issue=1 |pages=1–16 |year=1995 |doi=10.1162/evco.1995.3.1.1 |citeseerx=10.1.1.50.7779 |s2cid=8530790 }}</ref>
 
|| [[File:Fonseca and Fleming function.pdf|200px|Fonseca and Fleming function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) & = 1 - \exp \left[-\sum_{i=1}^{n} \left(x_{i} - \frac{1}{\sqrt{n}} \right)^{2} \right] \\
f_{2}\left(\boldsymbol{x}\right) & = 1 - \exp \left[-\sum_{i=1}^{n} \left(x_{i} + \frac{1}{\sqrt{n}} \right)^{2} \right] \\
\end{cases}
</math>
||
 
|| <math>-4\le x_{i} \le 4</math>, <math>1\le i \le n</math>
|-
 
| Test function 4:<ref name="Binh99"/>
|| [[File:Test function 4 - Binh.pdf|200px|Test function 4.<ref name="Binh99" />]]
 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) & = x^{2} - y \\
f_{2}\left(x,y\right) & = -0.5x - y - 1 \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) & = 6.5 - \frac{x}{6} - y \geq 0 \\
g_{2}\left(x,y\right) & = 7.5 - 0.5x - y \geq 0 \\
g_{3}\left(x,y\right) & = 30 - 5x - y \geq 0 \\
\end{cases}
</math>
|| <math>-7\le x,y \le 4</math>
 
|-
| [[Kursawe function]]:<ref name="Kursawe:1991">F. Kursawe, “[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.47.8050&rep=rep1&type=pdf A variant of evolution strategies for vector optimization],” in [[Parallel Problem Solving from Nature|PPSN]] I, Vol 496 Lect Notes in Comput Sc. Springer-Verlag, 1991, pp.&nbsp;193–197.</ref>
 
|| [[File:Kursawe function.pdf|200px|Kursawe function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) & = \sum_{i=1}^{2} \left[-10 \exp \left(-0.2 \sqrt{x_{i}^{2} + x_{i+1}^{2}} \right) \right] \\
& \\
f_{2}\left(\boldsymbol{x}\right) & = \sum_{i=1}^{3} \left[\left|x_{i}\right|^{0.8} + 5 \sin \left(x_{i}^{3} \right) \right] \\
\end{cases}
</math>
||
 
||<math>-5\le x_{i} \le 5</math>, <math>1\le i \le 3</math>
|-
 
| Schaffer function N. 1:<ref name="Schaffer:1984">{{cite book |last=Schaffer |first=J. David |date=1984 |chapter=Multiple Objective Optimization with Vector Evaluated Genetic Algorithms |title=Proceedings of the First International Conference on Genetic Algorithms |editor1=G.J.E Grefensette |editor2=J.J. Lawrence Erlbraum |oclc=20004572 }}</ref>
|| [[File:Schaffer function 1.pdf|200px|Schaffer function N.1]]
 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x\right) & = x^{2} \\
f_{2}\left(x\right) & = \left(x-2\right)^{2} \\
\end{cases}
</math>
||
 
|| <math>-A\le x \le A</math>. Values of <math>A</math> from <math>10</math> to <math>10^{5}</math> have been used successfully. Higher values of <math>A</math> increase the difficulty of the problem.
|-
 
| Schaffer function N. 2:
|| [[File:Schaffer function 2 - multi-objective.pdf|200px|Schaffer function N.2]]
 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x\right) & = \begin{cases}
-x, & \text{if } x \le 1 \\
x-2, & \text{if } 1 < x \le 3 \\
4-x, & \text{if } 3 < x \le 4 \\
x-4, & \text{if } x > 4 \\
\end{cases} \\
f_{2}\left(x\right) & = \left(x-5\right)^{2} \\
\end{cases}
</math>
||
 
||<math>-5\le x \le 10</math>
|-
 
| Poloni's two objective function:
|| [[File:Poloni's two objective function.pdf|200px|Poloni's two objective function]]
 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) & = \left[1 + \left(A_{1} - B_{1}\left(x,y\right) \right)^{2} + \left(A_{2} - B_{2}\left(x,y\right) \right)^{2} \right] \\
f_{2}\left(x,y\right) & = \left(x + 3\right)^{2} + \left(y + 1 \right)^{2} \\
\end{cases}
</math>
<math>\text{where} =
\begin{cases}
A_{1} & = 0.5 \sin \left(1\right) - 2 \cos \left(1\right) + \sin \left(2\right) - 1.5 \cos \left(2\right) \\
A_{2} & = 1.5 \sin \left(1\right) - \cos \left(1\right) + 2 \sin \left(2\right) - 0.5 \cos \left(2\right) \\
B_{1}\left(x,y\right) & = 0.5 \sin \left(x\right) - 2 \cos \left(x\right) + \sin \left(y\right) - 1.5 \cos \left(y\right) \\
B_{2}\left(x,y\right) & = 1.5 \sin \left(x\right) - \cos \left(x\right) + 2 \sin \left(y\right) - 0.5 \cos \left(y\right)
\end{cases}
</math>
||
 
||<math>-\pi\le x,y \le \pi</math>
|-
 
| Zitzler–Deb–Thiele's function N. 1:<ref name="Debetal2002testpr">{{cite book |last1=Deb |first1=Kalyan |last2=Thiele |first2=L. |last3=Laumanns |first3=Marco |last4=Zitzler |first4=Eckart |title=Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |chapter=Scalable multi-objective optimization test problems |date=2002 |volume=1 |pages=825–830 |doi=10.1109/CEC.2002.1007032|isbn=0-7803-7282-4 |s2cid=61001583 }}</ref>
|| [[File:Zitzler-Deb-Thiele's function 1.pdf|200px|Zitzler-Deb-Thiele's function N.1]]
 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) & = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) & = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) & = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\
h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) & = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}} \\
\end{cases}
</math>
||
 
||<math>0\le x_{i} \le 1</math>, <math>1\le i \le 30</math>
|-
 
| Zitzler–Deb–Thiele's function N. 2:<ref name="Debetal2002testpr" />
|| [[File:Zitzler-Deb-Thiele's function 2.pdf|200px|Zitzler-Deb-Thiele's function N.2]]
 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) & = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) & = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) & = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\
h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) & = 1 - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}\right)^{2} \\
\end{cases}
</math>
||
 
|| <math>0\le x_{i} \le 1</math>, <math>1\le i \le 30</math>
|-
 
| Zitzler–Deb–Thiele's function N. 3:<ref name="Debetal2002testpr" />
|| [[File:Zitzler-Deb-Thiele's function 3.pdf|200px|Zitzler-Deb-Thiele's function N.3]]
 
||<math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) & = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) & = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) & = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\
h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) & = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x} \right)}} - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)} \right) \sin \left(10 \pi f_{1} \left(\boldsymbol{x} \right) \right)
\end{cases}
</math>
||
 
||<math>0\le x_{i} \le 1</math>, <math>1\le i \le 30</math>
|-
 
| Zitzler–Deb–Thiele's function N. 4:<ref name="Debetal2002testpr" />
|| [[File:Zitzler-Deb-Thiele's function 4.pdf|200px|Zitzler-Deb-Thiele's function N.4]]
 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) & = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) & = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) & = 91 + \sum_{i=2}^{10} \left(x_{i}^{2} - 10 \cos \left(4 \pi x_{i}\right) \right) \\
h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) & = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x} \right)}}
\end{cases}
</math>
||
 
||<math>0\le x_{1} \le 1</math>, <math>-5\le x_{i} \le 5</math>, <math>2\le i \le 10</math>
|-
 
| Zitzler–Deb–Thiele's function N. 6:<ref name="Debetal2002testpr" />
|| [[File:Zitzler-Deb-Thiele's function 6.pdf|200px|Zitzler-Deb-Thiele's function N.6]]
 
||<math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) & = 1 - \exp \left(-4x_{1}\right)\sin^{6}\left(6 \pi x_{1} \right) \\
f_{2}\left(\boldsymbol{x}\right) & = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) & = 1 + 9 \left[\frac{\sum_{i=2}^{10} x_{i}}{9}\right]^{0.25} \\
h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) & = 1 - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x} \right)}\right)^{2} \\
\end{cases}
</math>
||
 
||<math>0\le x_{i} \le 1</math>, <math>1\le i \le 10</math>
|-
 
| Osyczka and Kundu function:<ref name="OsyczkaKundu1995">{{cite journal |last1=Osyczka |first1=A. |last2=Kundu |first2=S. |title=A new method to solve generalized multicriteria optimization problems using the simple genetic algorithm |journal=Structural Optimization |date=1 October 1995 |volume=10 |issue=2 |pages=94–99 |doi=10.1007/BF01743536 |s2cid=123433499 |issn=1615-1488}}</ref>
|| [[File:Osyczka and Kundu function.pdf|200px|Osyczka and Kundu function]]
||<math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = -25 \left(x_{1}-2\right)^{2} - \left(x_{2}-2\right)^{2} - \left(x_{3}-1\right)^{2} - \left(x_{4}-4\right)^{2} - \left(x_{5}-1\right)^{2} \\
f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{6} x_{i}^{2} \\
\end{cases}
</math>
||<math>\text{s.t.} =
\begin{cases}
g_{1}\left(\boldsymbol{x}\right) = x_{1} + x_{2} - 2 \geq 0 \\
g_{2}\left(\boldsymbol{x}\right) = 6 - x_{1} - x_{2} \geq 0 \\
g_{3}\left(\boldsymbol{x}\right) = 2 - x_{2} + x_{1} \geq 0 \\
g_{4}\left(\boldsymbol{x}\right) = 2 - x_{1} + 3x_{2} \geq 0 \\
g_{5}\left(\boldsymbol{x}\right) = 4 - \left(x_{3}-3\right)^{2} - x_{4} \geq 0 \\
g_{6}\left(\boldsymbol{x}\right) = \left(x_{5} - 3\right)^{2} + x_{6} - 4 \geq 0
\end{cases}
</math>
|| <math>0\le x_{1},x_{2},x_{6} \le 10</math>, <math>1\le x_{3},x_{5} \le 5</math>, <math>0\le x_{4} \le 6</math>.
 
|-
| CTP1 function (2 variables):<ref name="Deb:2002"/><ref name="Jimenezetal2002">{{cite book |last1=Jimenez |first1=F. |last2=Gomez-Skarmeta |first2=A. F. |last3=Sanchez |first3=G. |last4=Deb |first4=K. |title=Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |chapter=An evolutionary algorithm for constrained multi-objective optimization |date=May 2002 |volume=2 |pages=1133–1138 |doi=10.1109/CEC.2002.1004402|isbn=0-7803-7282-4 |s2cid=56563996 }}</ref>
|| [[File:CTP1 function (2 variables).pdf|200px|CTP1 function (2 variables).<ref name="Deb:2002" />]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = x \\
f_{2}\left(x,y\right) = \left(1 + y\right) \exp \left(-\frac{x}{1+y} \right)
\end{cases}
</math>
||<math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.858 \exp \left(-0.541 f_{1}\left(x,y\right)\right)} \geq 1 \\
g_{2}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.728 \exp \left(-0.295 f_{1}\left(x,y\right)\right)} \geq 1
\end{cases}
</math>
|| <math>0\le x,y \le 1</math>.
 
|-
| Constr-Ex problem:<ref name="Deb:2002"/>
|| [[File:Constr-Ex problem.pdf|200px|Constr-Ex problem.<ref name="Deb:2002" />]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = x \\
f_{2}\left(x,y\right) = \frac{1 + y}{x}
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = y + 9x \geq 6 \\
g_{2}\left(x,y\right) = -y + 9x \geq 1
\end{cases}
</math>
|| <math>0.1\le x \le 1</math>, <math>0\le y \le 5</math>
 
|-
| Viennet function:
|| [[File:Viennet function.pdf|200px|Viennet function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 0.5\left(x^{2} + y^{2}\right) + \sin\left(x^{2} + y^{2} \right) \\
f_{2}\left(x,y\right) = \frac{\left(3x - 2y + 4\right)^{2}}{8} + \frac{\left(x - y + 1\right)^{2}}{27} + 15 \\
f_{3}\left(x,y\right) = \frac{1}{x^{2} + y^{2} + 1} - 1.1 \exp \left(- \left(x^{2} + y^{2} \right) \right) \\
\end{cases}
</math>
||
 
||<math>-3\le x,y \le 3</math>
|}
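
A multi-objective test problem returns a vector of objective values, and candidate solutions are compared by Pareto dominance rather than by a single number. The following Python sketch transcribes the Binh and Korn row above and adds a generic dominance check; it is an illustration only and is not taken from the software cited in this article.

<syntaxhighlight lang="python">
def binh_and_korn(x, y):
    """Binh and Korn objectives and constraints (0 <= x <= 5, 0 <= y <= 3)."""
    f1 = 4.0 * x ** 2 + 4.0 * y ** 2
    f2 = (x - 5.0) ** 2 + (y - 5.0) ** 2
    feasible = ((x - 5.0) ** 2 + y ** 2 <= 25.0) and ((x - 8.0) ** 2 + (y + 3.0) ** 2 >= 7.7)
    return (f1, f2), feasible

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

fa, feasible_a = binh_and_korn(0.0, 0.0)     # objectives (0.0, 50.0), feasible
fb, feasible_b = binh_and_korn(3.0, 2.0)     # objectives (52.0, 13.0), feasible
print(dominates(fa, fb), dominates(fb, fa))  # False False: neither point dominates the other
</syntaxhighlight>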
 
== See also ==
* [[Himmelblau's function]]
* [[Rosenbrock function]]
* [[Rastrigin function]]
* [[Shekel function]]
 
==References==
 
<references/>
 
== External links ==
* [https://github.com/nathanrooy/landscapes landscapes]
 
{{DEFAULTSORT:Test functions for optimization}}
[[Category:Mathematical optimization]]
[[Category:Constraint programming]]
[[Category:Convex optimization]]
[[Category:Test functions for optimization| ]]
[[Category:Test items]]