{{Short description|Functions used to evaluate optimization algorithms}}
In applied mathematics, '''test functions''', also known as '''artificial landscapes''', are useful to evaluate characteristics of optimization algorithms, such as [[Rate of convergence|convergence rate]], precision, robustness and general performance.
 
Here some test functions are presented with the aim of giving an idea about the different situations that optimization algorithms have to face when coping with these kinds of problems. In the first part, some objective functions for single-objective optimization cases are presented. In the second part, test functions with their respective [[Pareto front|Pareto fronts]] for [[multi-objective optimization]] problems (MOP) are given.
 
The artificial landscapes presented herein for single-objective optimization problems are taken from Bäck,<ref>{{cite book|last=Bäck|first=Thomas|title=Evolutionary algorithms in theory and practice : evolution strategies, evolutionary programming, genetic algorithms|year=1995|publisher=Oxford University Press|___location=Oxford|isbn=978-0-19-509971-3|page=328}}</ref> Haupt et al.<ref>{{cite book|last1=Haupt|first1=Randy L.|last2=Haupt|first2=Sue Ellen|title=Practical genetic algorithms with CD-Rom|year=2004|publisher=J. Wiley|___location=New York|isbn=978-0-471-45565-3|edition=2nd}}</ref> and from Rody Oldenhuis software.<ref>{{cite web|last=Oldenhuis|first=Rody|title=Many test functions for global optimizers|url=http://www.mathworks.com/matlabcentral/fileexchange/23147-many-testfunctions-for-global-optimizers|publisher=Mathworks|access-date=1 November 2012}}</ref> Given the number of problems (55 in total), just a few are presented here. The complete list of test functions is found on the Mathworks website.<ref>{{cite web|last=Ortiz|first=Gilberto A.|title=Evolution Strategies (ES)|url=http://www.mathworks.com/matlabcentral/fileexchange/35801-evolution-strategies-es|publisher=Mathworks|access-date=1 November 2012}}</ref>
 
The test functions used to evaluate the algorithms for MOP were taken from Deb,<ref name="Deb:2002">Deb, Kalyanmoy (2002) Multiobjective optimization using evolutionary algorithms (Repr. ed.). Chichester [u.a.]: Wiley. {{isbn|0-471-87339-X}}.</ref> Binh et al.<ref name="Binh97">Binh T. and Korn U. (1997) [https://web.archive.org/web/20190801183649/https://pdfs.semanticscholar.org/cf68/41a6848ca2023342519b0e0e536b88bdea1d.pdf MOBES: A Multiobjective Evolution Strategy for Constrained Optimization Problems]. In: Proceedings of the Third International Conference on Genetic Algorithms. Czech Republic. pp. 176–182</ref> and Binh.<ref name="Binh99">Binh T. (1999) [https://www.researchgate.net/profile/Thanh_Binh_To/publication/2446107_A_Multiobjective_Evolutionary_Algorithm_The_Study_Cases/links/53eb422f0cf28f342f45251d.pdf A multiobjective evolutionary algorithm. The study cases.] Technical report. Institute for Automation and Communication. Barleben, Germany</ref> The software developed by Deb can be downloaded,<ref name="Deb_nsga">Deb K. (2011) Software for multi-objective NSGA-II code in C. Available at URL: https://www.iitk.ac.in/kangal/codes.shtml</ref> which implements the NSGA-II procedure with GAs, or the program posted on the Internet,<ref>{{cite web|last=Ortiz|first=Gilberto A.|title=Multi-objective optimization using ES as Evolutionary Algorithm.|url=http://www.mathworks.com/matlabcentral/fileexchange/35824-multi-objective-optimization-using-evolution-strategies-es-as-evolutionary-algorithm-ea|publisher=Mathworks|access-date=1 November 2012}}</ref> which implements the NSGA-II procedure with ES.
 
Only the general form of the equation, a plot of the objective function, the boundaries of the object variables and the coordinates of the global minima are given herein.
==Test functions for single-objective optimization==
 
{| class="sortable wikitable" style="text-align:center"
! Name !! Plot !! Formula !! Global minimum !! Search ___domain
|-
| [[Rastrigin function]]
|| [[File:Rastrigin contour plot.svg|200px|Rastrigin function for n=2]]
||<math>f(\mathbf{x}) = A n + \sum_{i=1}^n \left[x_i^2 - A\cos(2 \pi x_i)\right]</math>
<math>\text{where: } A=10</math>
||<math>f(0, \dots, 0) = 0</math>
||<math>-5.12\le x_{i} \le 5.12 </math>
|-
| [[Ackley function]]
|| [[File:Ackley's contour function.svg|200px|Ackley's function for n=2]]
||<math>f(x,y) = -20\exp\left[-0.2\sqrt{0.5\left(x^{2}+y^{2}\right)}\right]</math>
<math>-\exp\left[0.5\left(\cos 2\pi x + \cos 2\pi y \right)\right] + e + 20</math>
||<math>f(0,0) = 0</math>
||<math>-5\le x,y \le 5</math>
|-
| Sphere function
|| [[File:Sphere contour.svg|200px|Sphere function for n=2]]
|| <math>f(\boldsymbol{x}) = \sum_{i=1}^{n} x_{i}^{2}</math>
|| <math>f(x_{1}, \dots, x_{n}) = f(0, \dots, 0) = 0</math>
|| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Rosenbrock function]]
|| [[File:Rosenbrock's contour.svg|200px|Rosenbrock's function for n=2]]
|| <math>f(\boldsymbol{x}) = \sum_{i=1}^{n-1} \left[ 100 \left(x_{i+1} - x_{i}^{2}\right)^{2} + \left(1 - x_{i}\right)^{2}\right]</math>
|| <math>\text{Min} =
\begin{cases}
n=2 & \rightarrow \quad f(1,1) = 0, \\
n=3 & \rightarrow \quad f(1,1,1) = 0, \\
n>3 & \rightarrow \quad f(\underbrace{1, \dots, 1}_{n \text{ times}}) = 0
\end{cases}
</math>
|| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Beale function]]
|| [[File:Beale's contour.svg|200px|Beale's function]]
|| <math>f(x,y) = \left( 1.5 - x + xy \right)^{2} + \left( 2.25 - x + xy^{2}\right)^{2}</math>
<math>+ \left(2.625 - x+ xy^{3}\right)^{2}</math>
|| <math>f(3, 0.5) = 0</math>
|| <math>-4.5 \le x,y \le 4.5</math>
|-
| [[Goldstein–Price function]]
|| [[File:Goldstein-Price contour.svg|200px|Goldstein–Price function]]
|| <math>f(x,y) = \left[1+\left(x+y+1\right)^{2}\left(19-14x+3x^{2}-14y+6xy+3y^{2}\right)\right]</math>
<math>\left[30+\left(2x-3y\right)^{2}\left(18-32x+12x^{2}+48y-36xy+27y^{2}\right)\right]</math>
|| <math>f(0, -1) = 3</math>
|| <math>-2 \le x,y \le 2</math>
|-
| [[Booth function]]
|| [[File:Booth's contour.svg|200px|Booth's function]]
||<math>f(x,y) = \left( x + 2y -7\right)^{2} + \left(2x +y - 5\right)^{2}</math>
||<math>f(1,3) = 0</math>
||<math>-10 \le x,y \le 10</math>
|-
| Bukin function N.6
|| [[File:Bukin function 6 contour.svg|200px|Bukin function N.6]]
|| <math>f(x,y) = 100\sqrt{\left|y - 0.01x^{2}\right|} + 0.01 \left|x+10 \right|.\quad</math>
|| <math>f(-10,1) = 0</math>
|| <math>-15\le x \le -5</math>, <math>-3\le y \le 3</math>
|-
| [[Matyas function]]
|| [[File:Matyas contour.svg|200px|Matyas function]]
|| <math>f(x,y) = 0.26 \left( x^{2} + y^{2}\right) - 0.48 xy</math>
|| <math>f(0,0) = 0</math>
|| <math>-10\le x,y \le 10</math>
|-
| Lévi function N.13
||[[File:Levi13 function contour.svg|200px|Lévi function N.13]]
|| <math>f(x,y) = \sin^{2} 3\pi x + \left(x-1\right)^{2}\left(1+\sin^{2} 3\pi y\right)</math>
<math>+\left(y-1\right)^{2}\left(1+\sin^{2} 2\pi y\right)</math>
|| <math>f(1,1) = 0</math>
|| <math>-10\le x,y \le 10</math>
|-
| [[Griewank function]]
| [[File:Griewank 2D Contour.svg|200px|Griewank's function]]
| <math>f(\boldsymbol{x})= 1+ \frac {1}{4000} \sum _{i=1}^n x_i^2 -\prod _{i=1}^n P_i(x_i)</math>, where <math>P_i(x_i)=\cos \left( \frac {x_i}{\sqrt {i}} \right)</math>
|| <math>f(0, \dots, 0) = 0</math>
|| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Himmelblau's function]]
||[[File:Himmelblau contour plot.svg|200px|Himmelblau's function]]
|| <math>f(x, y) = (x^2+y-11)^2 + (x+y^2-7)^2.\quad</math>
|| <math>\text{Min} =
\begin{cases}
f\left(3.0, 2.0\right) & = 0.0 \\
f\left(-2.805118, 3.131312\right) & = 0.0 \\
f\left(-3.779310, -3.283186\right) & = 0.0 \\
f\left(3.584428, -1.848126\right) & = 0.0
\end{cases}
</math>
|| <math>-5\le x,y \le 5</math>
|-
| Three-hump camel function
|| [[File:Three-hump-camel contour.svg|200px|Three Hump Camel function]]
|| <math>f(x,y) = 2x^{2} - 1.05x^{4} + \frac{x^{6}}{6} + xy + y^{2}</math>
|| <math>f(0,0) = 0</math>
|| <math>-5\le x,y \le 5</math>
|-
| [[Easom function]]
|| [[File:Easom contour.svg|200px|Easom function]]
|| <math>f(x,y) = -\cos \left(x\right)\cos \left(y\right) \exp\left(-\left(\left(x-\pi\right)^{2} + \left(y-\pi\right)^{2}\right)\right)</math>
|| <math>f(\pi , \pi) = -1</math>
|| <math>-100\le x,y \le 100</math>
|-
| Cross-in-tray function
|| [[File:Cross-in-tray contour.svg|200px|Cross-in-tray function]]
|| <math>f(x,y) = -0.0001 \left[ \left| \sin x \sin y \exp \left(\left|100 - \frac{\sqrt{x^{2} + y^{2}}}{\pi} \right|\right)\right| + 1 \right]^{0.1}</math>
|| <math>\text{Min} =
\begin{cases}
f\left(1.34941, -1.34941\right) & = -2.06261 \\
f\left(1.34941, 1.34941\right) & = -2.06261 \\
f\left(-1.34941, 1.34941\right) & = -2.06261 \\
f\left(-1.34941, -1.34941\right) & = -2.06261
\end{cases}
</math>
|| <math>-10\le x,y \le 10</math>
|-
| [[Eggholder function]]<ref name="Whitley Rana Dzubera Mathias 1996 pp. 245–276">{{cite journal | last1=Whitley | first1=Darrell | last2=Rana | first2=Soraya | last3=Dzubera | first3=John | last4=Mathias | first4=Keith E. | title=Evaluating evolutionary algorithms | journal=Artificial Intelligence | publisher=Elsevier BV | volume=85 | issue=1–2 | year=1996 | issn=0004-3702 | doi=10.1016/0004-3702(95)00124-7 | pages=264| doi-access=free }}</ref><ref name="vanaret2015hybridation">Vanaret C. (2015) [https://www.researchgate.net/publication/337947149_Hybridization_of_interval_methods_and_evolutionary_algorithms_for_solving_difficult_optimization_problems Hybridization of interval methods and evolutionary algorithms for solving difficult optimization problems.] PhD thesis. Ecole Nationale de l'Aviation Civile. Institut National Polytechnique de Toulouse, France.</ref>
|| [[File:Eggholder contour.svg|200px|Eggholder function]]
|| <math>f(x,y) = - \left(y+47\right) \sin \sqrt{\left|\frac{x}{2}+\left(y+47\right)\right|} - x \sin \sqrt{\left|x - \left(y + 47 \right)\right|}</math>
|| <math>f(512, 404.2319) = -959.6407</math>
|| <math>-512\le x,y \le 512</math>
|-
| [[Hölder table function]]
|| [[File:Hoelder table contour.svg|200px|Holder table function]]
|| <math>f(x,y) = - \left|\sin x \cos y \exp \left(\left|1 - \frac{\sqrt{x^{2} + y^{2}}}{\pi} \right|\right)\right|</math>
|| <math>\text{Min} =
\begin{cases}
f\left(8.05502, 9.66459\right) & = -19.2085 \\
f\left(-8.05502, 9.66459\right) & = -19.2085 \\
f\left(8.05502, -9.66459\right) & = -19.2085 \\
f\left(-8.05502, -9.66459\right) & = -19.2085
\end{cases}
</math>
|| <math>-10\le x,y \le 10</math>
|-
| [[McCormick function]]
|| [[File:McCormick contour.svg|200px|McCormick function]]
|| <math>f(x,y) = \sin \left(x+y\right) + \left(x-y\right)^{2} - 1.5x + 2.5y + 1</math>
|| <math>f(-0.54719,-1.54719) = -1.9133</math>
|| <math>-1.5\le x \le 4</math>, <math>-3\le y \le 4</math>
|-
| Schaffer function N. 2
|| [[File:Schaffer2 function contour.svg|200px|Schaffer function N.2]]
|| <math>f(x,y) = 0.5 + \frac{\sin^{2}\left(x^{2} - y^{2}\right) - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right) \right]^{2}}</math>
|| <math>f(0, 0) = 0</math>
|| <math>-100\le x,y \le 100</math>
|-
| Schaffer function N. 4
|| [[File:Schaffer4 function contour.svg|200px|Schaffer function N.4]]
|| <math>f(x,y) = 0.5 + \frac{\cos^{2}\left[\sin \left( \left|x^{2} - y^{2}\right|\right)\right] - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right) \right]^{2}}</math>
|| <math>\text{Min} =
\begin{cases}
f\left(0,1.25313\right) & = 0.292579 \\
f\left(0,-1.25313\right) & = 0.292579 \\
f\left(1.25313,0\right) & = 0.292579 \\
f\left(-1.25313,0\right) & = 0.292579
\end{cases}
</math>
|| <math>-100\le x,y \le 100</math>
|-
| [[Styblinski–Tang function]]
|| [[File:Styblinski-Tang contour.svg|200px|Styblinski-Tang function]]
|| <math>f(\boldsymbol{x}) = \frac{\sum_{i=1}^{n} x_{i}^{4} - 16x_{i}^{2} + 5x_{i}}{2}</math>
|| <math>-39.16617n < f(\underbrace{-2.903534, \ldots, -2.903534}_{n \text{ times}} ) < -39.16616n</math>
|| <math>-5\le x_{i} \le 5</math>, <math>1\le i \le n</math>.
|-
| [[Shekel function]]
| [[Image:Shekel_2D.jpg|200px|A Shekel function in 2 dimensions and with 10 maxima]]
| <math>
f(\boldsymbol{x}) = \sum_{i = 1}^{m} \; \left( c_{i} + \sum\limits_{j = 1}^{n} (x_{j} - a_{ji})^2 \right)^{-1}
</math>
|
| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|}
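The tabulated formulas translate directly into code. As a brief illustrative sketch (Python; an addition for clarity, not part of the original table), the Rastrigin and Rosenbrock entries can be implemented and checked against the global minima listed above:

```python
import math

def rastrigin(x, A=10.0):
    """Rastrigin function: highly multimodal; global minimum f(0,...,0) = 0."""
    n = len(x)
    return A * n + sum(xi**2 - A * math.cos(2 * math.pi * xi) for xi in x)

def rosenbrock(x):
    """Rosenbrock function: narrow curved valley; global minimum f(1,...,1) = 0."""
    return sum(100 * (x[i + 1] - x[i]**2)**2 + (1 - x[i])**2
               for i in range(len(x) - 1))

# The global minima quoted in the table:
print(rastrigin([0.0, 0.0]))        # 0.0
print(rosenbrock([1.0, 1.0, 1.0]))  # 0.0
```

Any other benchmark in the table can be coded the same way, which is why these landscapes are convenient for comparing optimizers.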
 
==Test functions for constrained optimization==
{| class="sortable wikitable" style="text-align:center"
|-
! Name !! Plot !! Formula !! Global minimum !! Search ___domain
|-
| Rosenbrock function constrained with a cubic and a line<ref>{{cite conference |author1=Simionescu, P.A. |author2=Beale, D. |title=New Concepts in Graphic Visualization of Objective Functions |conference=ASME 2002 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference |___location=Montreal, Canada |date=September 29 – October 2, 2002|pages=891–897 |url=http://faculty.tamucc.edu/psimionescu/PDFs/DETC02-DAC-34129.pdf |accessdate=7 January 2017 }}</ref>
|| [[File:ConstrTestFunc04.png|200px|Rosenbrock function constrained with a cubic and a line]]
|| <math>f(x,y) = (1-x)^2 + 100(y-x^2)^2</math>,
 
subjected to: <math> (x-1)^3 - y + 1 \le 0 \text{ and } x + y - 2 \le 0 </math>
|| <math>f(1.0,1.0) = 0</math>
|| <math>-1.5\le x \le 1.5</math>, <math>-0.5\le y \le 2.5</math>
|-
| Rosenbrock function constrained to a disk<ref>{{Cite web|url=https://www.mathworks.com/help/optim/ug/example-nonlinear-constrained-minimization.html?requestedDomain=www.mathworks.com|title=Solve a Constrained Nonlinear Problem - MATLAB & Simulink|website=www.mathworks.com|access-date=2017-08-29}}</ref>
|| [[File:Rosenbrock circle constraint.svg|200px|Rosenbrock function constrained to a disk]]
|| <math>f(x,y) = (1-x)^2 + 100(y-x^2)^2</math>,
 
subjected to: <math>x^{2} + y^{2} \le 2</math>
|| <math>f(1.0,1.0) = 0</math>
|| <math>-1.5\le x \le 1.5</math>, <math>-1.5\le y \le 1.5</math>
|-
| Mishra's Bird function - constrained<ref>{{Cite web|url=http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|title=Bird Problem (Constrained) {{!}} Phoenix Integration|access-date=2017-08-29|url-status=bot: unknown|archive-url=https://web.archive.org/web/20161229032528/http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|archive-date=2016-12-29}}</ref><ref>{{Cite journal|last=Mishra|first=Sudhanshu|date=2006|title=Some new test functions for global optimization and performance of repulsive particle swarm method|url=https://mpra.ub.uni-muenchen.de/2718/|journal=MPRA Paper}}</ref>
|| [[File:Mishra bird contour.svg|200px|Bird function (constrained)]]
|| <math>f(x,y) = \sin(y) e^{\left [(1-\cos x)^2\right]} + \cos(x) e^{\left [(1-\sin y)^2 \right]} + (x-y)^2</math>,
subjected to: <math> (x+5)^2 + (y+5)^2 < 25 </math>
|| <math>f(-3.1302468, -1.5821422) = -106.7645367</math>
|| <math>-10\le x \le 0</math>, <math>-6.5\le y \le 0</math>
|-
| Townsend function (modified)<ref>{{Cite web|url=http://www.chebfun.org/examples/opt/ConstrainedOptimization.html|title=Constrained optimization in Chebfun|last=Townsend|first=Alex|date=January 2014|website=chebfun.org|access-date=2017-08-29}}</ref>
|| [[File:Townsend contour.svg|200px|Heart constrained multimodal function]]
|| <math>f(x,y) = -[\cos((x-0.1)y)]^2 - x \sin(3x+y)</math>,
 
subjected to:<math>x^2+y^2 < \left[2\cos t - \frac 1 2 \cos 2t - \frac 1 4 \cos 3t - \frac 1 8 \cos 4t\right]^2 + [2\sin t]^2 </math>
where: {{Math|1=''t'' = Atan2(x,y)}}
|| <math>f(2.0052938,1.1944509) = -2.0239884</math>
|| <math>-2.25\le x \le 2.25</math>, <math>-2.5\le y \le 1.75</math>
 
|-
| '''Keane's bump function'''{{anchor|Keane's bump function}}<ref>{{cite journal |last1=Mishra |first1=Sudhanshu |title=Minimization of Keane’s Bump Function by the Repulsive Particle Swarm and the Differential Evolution Methods |date=5 May 2007 |url=https://econpapers.repec.org/paper/pramprapa/3098.htm |journal=MPRA Paper|publisher=University Library of Munich, Germany}}</ref>
|| [[File:Estimation of Distribution Algorithm animation.gif|200px|Keane's bump function]]
|| <math>f(\boldsymbol{x}) = -\left| \frac{\left[ \sum_{i=1}^m \cos^4 (x_i) - 2 \prod_{i=1}^m \cos^2 (x_i) \right]}{{\left( \sum_{i=1}^m ix^2_i \right)}^{0.5}} \right| </math>,
subjected to: <math>0.75 - \prod_{i=1}^m x_i < 0</math>, and
<math>\sum_{i=1}^m x_i - 7.5m < 0</math>
|| <math>f(1.60025376, 0.468675907) = -0.364979746</math>
|| <math>0 < x_{i} < 10</math>
|}
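The constrained entries pair an objective with feasibility conditions. A minimal Python sketch (illustrative only; the table does not prescribe any particular constraint-handling method, and the "death penalty" used here is just one common choice) evaluates the Rosenbrock-on-a-disk entry:

```python
def rosenbrock2d(x, y):
    # Objective from the table: f(x,y) = (1-x)^2 + 100(y-x^2)^2
    return (1 - x)**2 + 100 * (y - x**2)**2

def feasible_disk(x, y):
    # Constraint from the table: x^2 + y^2 <= 2
    return x**2 + y**2 <= 2

def penalized(x, y, penalty=1e6):
    # Death-penalty handling: infeasible points get a large constant value,
    # so any feasible point is preferred by a minimizer.
    return rosenbrock2d(x, y) if feasible_disk(x, y) else penalty

print(penalized(1.0, 1.0))  # 0.0, the constrained global minimum
print(penalized(1.5, 1.5))  # 1000000.0 (infeasible: 1.5^2 + 1.5^2 = 4.5 > 2)
```

The same wrapper pattern applies to the other constrained rows by swapping in their objective and feasibility test.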
 
==Test functions for multi-objective optimization==
{| class="sortable wikitable" style="text-align:center"
! Name !! Plot !! Functions !! Constraints !! Search ___domain
|-
| [[Binh and Korn function]]:<ref name="Binh97"/>
|| [[File:Binh and Korn function.pdf|200px|Binh and Korn function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 4x^{2} + 4y^{2} \\
f_{2}\left(x,y\right) = \left(x-5\right)^{2} + \left(y-5\right)^{2} \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = \left(x-5\right)^{2} + y^{2} \leq 25 \\
g_{2}\left(x,y\right) = \left(x-8\right)^{2} + \left(y+3\right)^{2} \geq 7.7 \\
\end{cases}
</math>
|| <math>0\le x \le 5</math>, <math>0\le y \le 3</math>
|-
| [[Chankong and Haimes function]]:<ref>{{cite book |last1=Chankong |first1=Vira |last2=Haimes |first2=Yacov Y. |title=Multiobjective decision making. Theory and methodology. |isbn=0-444-00710-5|year=1983 |publisher=North Holland }}</ref>
|| [[File:Chakong and Haimes function.pdf|200px|Chakong and Haimes function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 2 + \left(x-2\right)^{2} + \left(y-1\right)^{2} \\
f_{2}\left(x,y\right) = 9x - \left(y-1\right)^{2} \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = x^{2} + y^{2} \leq 225 \\
g_{2}\left(x,y\right) = x - 3y + 10 \leq 0 \\
\end{cases}
</math>
|| <math>-20\le x,y \le 20</math>
|-
| [[Fonseca–Fleming function]]:<ref name="FonzecaFleming:1995">{{cite journal |first1=C. M. |last1=Fonseca |first2=P. J. |last2=Fleming |title=An Overview of Evolutionary Algorithms in Multiobjective Optimization |journal=[[Evolutionary Computation (journal)|Evol Comput]] |volume=3 |issue=1 |pages=1–16 |year=1995 |doi=10.1162/evco.1995.3.1.1 |citeseerx=10.1.1.50.7779 |s2cid=8530790 }}</ref>
|| [[File:Fonseca and Fleming function.pdf|200px|Fonseca and Fleming function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = 1 - \exp \left[ -\sum_{i=1}^{n} \left(x_{i} - \frac{1}{\sqrt{n}} \right)^{2} \right] \\
f_{2}\left(\boldsymbol{x}\right) = 1 - \exp \left[ -\sum_{i=1}^{n} \left(x_{i} + \frac{1}{\sqrt{n}} \right)^{2} \right] \\
\end{cases}
</math>
|| 
|| <math>-4\le x_{i} \le 4</math>, <math>1\le i \le n</math>.
|-
| [[Kursawe function]]:<ref name="Kursawe:1991">F. Kursawe, “[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.47.8050&rep=rep1&type=pdf A variant of evolution strategies for vector optimization],” in [[Parallel Problem Solving from Nature|PPSN]] I, Vol 496 Lect Notes in Comput Sc. Springer-Verlag, 1991, pp.&nbsp;193–197.</ref>
|| [[File:Kursawe function.pdf|200px|Kursawe function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = \sum_{i=1}^{2} \left[ -10 \exp \left( -0.2 \sqrt{x_{i}^{2} + x_{i+1}^{2}} \right) \right] \\
f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{3} \left[ \left|x_{i}\right|^{0.8} + 5 \sin \left(x_{i}^{3}\right) \right] \\
\end{cases}
</math>
|| 
||<math>-5\le x_{i} \le 5</math>, <math>1\le i \le 3</math>.
|-
| Schaffer function N. 1:<ref name="Schaffer:1984">{{cite book |last=Schaffer |first=J. David |date=1984 |chapter=Multiple Objective Optimization with Vector Evaluated Genetic Algorithms |title=Proceedings of the First International Conference on Genetic Algorithms |editor=G.J.E. Grefenstette |publisher=Lawrence Erlbaum |oclc=20004572 }}</ref>
|| [[File:Schaffer function 1.pdf|200px|Schaffer function N.1]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x\right) = x^{2} \\
f_{2}\left(x\right) = \left(x-2\right)^{2} \\
\end{cases}
</math>
|| 
||<math>-A\le x \le A</math>, where values of <math>A</math> from 10 to <math>10^{5}</math> have been used successfully.
|-
| Zitzler–Deb–Thiele's function N. 1:<ref name="Debetal2002testpr">{{cite book |last1=Deb |first1=Kalyan |last2=Thiele |first2=L. |last3=Laumanns |first3=Marco |last4=Zitzler |first4=Eckart |chapter=Scalable multi-objective optimization test problems |title=Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |date=2002 |volume=1 |pages=825–830 |doi=10.1109/CEC.2002.1007032 |isbn=0-7803-7282-4 |s2cid=61001583 }}</ref>
|| [[File:Zitzler-Deb-Thiele's function 1.pdf|200px|Zitzler-Deb-Thiele's function N.1]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h\left(f_{1}\left(\boldsymbol{x}\right), g\left(\boldsymbol{x}\right)\right) \\
g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\
h\left(f_{1}\left(\boldsymbol{x}\right), g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}} \\
\end{cases}
</math>
|| 
||<math>0\le x_{i} \le 1</math>, <math>1\le i \le 30</math>.
|-
| Osyczka and Kundu function:<ref name="OsyczkaKundu1995">{{cite journal |last1=Osyczka |first1=A. |last2=Kundu |first2=S. |title=A new method to solve generalized multicriteria optimization problems using the simple genetic algorithm |journal=Structural Optimization |date=1 October 1995 |volume=10 |issue=2 |pages=94–99 |doi=10.1007/BF01743536 |s2cid=123433499 |issn=1615-1488}}</ref>
|| [[File:Osyczka and Kundu function.pdf|200px|Osyczka and Kundu function]]
||<math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = -25\left(x_{1}-2\right)^{2} - \left(x_{2}-2\right)^{2} - \left(x_{3}-1\right)^{2} - \left(x_{4}-4\right)^{2} - \left(x_{5}-1\right)^{2} \\
f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{6} x_{i}^{2} \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(\boldsymbol{x}\right) = x_{1} + x_{2} - 2 \geq 0 \\
g_{2}\left(\boldsymbol{x}\right) = 6 - x_{1} - x_{2} \geq 0 \\
g_{3}\left(\boldsymbol{x}\right) = 2 - x_{2} + x_{1} \geq 0 \\
g_{4}\left(\boldsymbol{x}\right) = 2 - x_{1} + 3x_{2} \geq 0 \\
g_{5}\left(\boldsymbol{x}\right) = 4 - \left(x_{3}-3\right)^{2} - x_{4} \geq 0 \\
g_{6}\left(\boldsymbol{x}\right) = \left(x_{5}-3\right)^{2} + x_{6} - 4 \geq 0 \\
\end{cases}
</math>
|| <math>0\le x_{1},x_{2},x_{6} \le 10</math>, <math>1\le x_{3},x_{5} \le 5</math>, <math>0\le x_{4} \le 6</math>.
|-
| CTP1 function (2 variables):<ref name="Deb:2002"/><ref name="Jimenezetal2002">{{cite book |last1=Jimenez |first1=F. |last2=Gomez-Skarmeta |first2=A. F. |last3=Sanchez |first3=G. |last4=Deb |first4=K. |title=Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |chapter=An evolutionary algorithm for constrained multi-objective optimization |date=May 2002 |volume=2 |pages=1133–1138 |doi=10.1109/CEC.2002.1004402|isbn=0-7803-7282-4 |s2cid=56563996 }}</ref>
|| [[File:CTP1 function (2 variables).pdf|200px|CTP1 function (2 variables).<ref name="Deb:2002" />]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = x \\
f_{2}\left(x,y\right) = \left(1+y\right) \exp \left( -\frac{x}{1+y} \right) \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.858 \exp \left(-0.541 f_{1}\left(x,y\right)\right)} \geq 1 \\
g_{2}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.728 \exp \left(-0.295 f_{1}\left(x,y\right)\right)} \geq 1 \\
\end{cases}
</math>
||<math>0\le x,y \le 1</math>.
|-
| Constr-Ex problem:<ref name="Deb:2002"/>
|| 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = x \\
f_{2}\left(x,y\right) = \frac{1+y}{x} \\
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = y + 9x \geq 6 \\
g_{2}\left(x,y\right) = -y + 9x \geq 1 \\
\end{cases}
</math>
||<math>0.1\le x \le 1</math>, <math>0\le y \le 5</math>
|-
| Viennet function
|| 
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 0.5\left(x^{2}+y^{2}\right) + \sin \left(x^{2}+y^{2}\right) \\
f_{2}\left(x,y\right) = \frac{\left(3x-2y+4\right)^{2}}{8} + \frac{\left(x-y+1\right)^{2}}{27} + 15 \\
f_{3}\left(x,y\right) = \frac{1}{x^{2}+y^{2}+1} - 1.1 \exp \left(-\left(x^{2}+y^{2}\right)\right) \\
\end{cases}
</math>
|| 
||<math>-3\le x,y \le 3</math>.
|}
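For multi-objective problems, solution quality is judged by Pareto dominance rather than by a single minimum. A small Python sketch (illustrative, using the Binh and Korn objectives f<sub>1</sub> = 4x² + 4y² and f<sub>2</sub> = (x−5)² + (y−5)², constraints omitted for brevity) shows how dominance between candidate points is tested:

```python
def binh_korn(x, y):
    # Objectives (f1, f2) of the Binh and Korn problem.
    f1 = 4 * x**2 + 4 * y**2
    f2 = (x - 5)**2 + (y - 5)**2
    return (f1, f2)

def dominates(a, b):
    # a Pareto-dominates b if a is no worse in every objective
    # and strictly better in at least one (minimization).
    return all(ai <= bi for ai, bi in zip(a, b)) and \
           any(ai < bi for ai, bi in zip(a, b))

p = binh_korn(0.0, 0.0)  # (0.0, 50.0): best f1, poor f2
q = binh_korn(1.0, 1.0)  # (8.0, 32.0): a trade-off point
print(dominates(p, q))   # False: neither point dominates the other
print(dominates(q, p))   # False
```

Points that no other feasible point dominates form the Pareto front, which is what the algorithms evaluated on these problems try to approximate.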
 
== See also ==
{{commons category|Test functions (mathematical optimization)}}
* [[Ackley function]]
* [[Himmelblau's function]]
* [[Rastrigin function]]
* [[Rosenbrock function]]
* [[Shekel function]]
 
==References==
 
<references/>
 
== External links ==
* [https://github.com/nathanrooy/landscapes landscapes]
*[https://www.sfu.ca/~ssurjano/index.html Virtual Library of Simulation Experiments: Test Functions and Datasets]
*[http://benchmarkfcns.xyz/fcns Benchmarkfcns] - Categorized collection of optimization benchmark functions and source code
*[http://infinity77.net/global_optimization/test_functions.html Test Functions Index] - with an estimate of "hardness" of the problem
*[https://www.cs.unm.edu/~neal.holts/dga/benchmarkFunction/index.html Benchmark functions] - Categorized list
*[http://www-optima.amp.i.kyoto-u.ac.jp/member/student/hedar/Hedar_files/TestGO.htm Global Optimization Test Problems] - Constrained and unconstrained
*[https://deap.readthedocs.io/en/master/api/benchmarks.html DEAP Benchmarks]
 
{{DEFAULTSORT:Test functions for optimization}}
[[Category:Constraint programming]]
[[Category:Convex optimization]]
[[Category:Test functions for optimization| ]]
[[Category:Test items]]