Test functions for optimization

{{Short description|Functions used to evaluate optimization algorithms}}
In applied mathematics, '''test functions''', also known as '''artificial landscapes''', are useful to evaluate characteristics of optimization algorithms, such as [[Rate of convergence|convergence rate]], precision, robustness and general performance.
 
Here, some test functions are presented with the aim of giving an idea of the different situations that optimization algorithms have to face when coping with these kinds of problems. In the first part, some objective functions for single-objective optimization cases are presented. In the second part, test functions with their respective [[Pareto front|Pareto fronts]] for [[multi-objective optimization]] problems (MOP) are given.
 
The artificial landscapes presented herein for single-objective optimization problems are taken from Bäck,<ref>{{cite book|last=Bäck|first=Thomas|title=Evolutionary algorithms in theory and practice : evolution strategies, evolutionary programming, genetic algorithms|year=1995|publisher=Oxford University Press|___location=Oxford|isbn=978-0-19-509971-3|page=328}}</ref> Haupt et al.<ref>{{cite book|last=Haupt|first=Randy L. Haupt, Sue Ellen|title=Practical genetic algorithms with CD-Rom|year=2004|publisher=J. Wiley|___location=New York|isbn=978-0-471-45565-3|edition=2nd}}</ref> and from Rody Oldenhuis software.<ref>{{cite web|last=Oldenhuis|first=Rody|title=Many test functions for global optimizers|url=http://www.mathworks.com/matlabcentral/fileexchange/23147-many-testfunctions-for-global-optimizers|publisher=Mathworks|access-date=1 November 2012}}</ref> Given the number of problems (55 in total), just a few are presented here.
 
The test functions used to evaluate the algorithms for MOP were taken from Deb,<ref name="Deb:2002">Deb, Kalyanmoy (2002) Multiobjective optimization using evolutionary algorithms (Repr. ed.). Chichester [u.a.]: Wiley. {{isbn|0-471-87339-X}}.</ref> Binh et al.<ref name="Binh97">Binh T. and Korn U. (1997) [https://web.archive.org/web/20190801183649/https://pdfs.semanticscholar.org/cf68/41a6848ca2023342519b0e0e536b88bdea1d.pdf MOBES: A Multiobjective Evolution Strategy for Constrained Optimization Problems]. In: Proceedings of the Third International Conference on Genetic Algorithms. Czech Republic. pp. 176–182</ref> and Binh.<ref name="Binh99">Binh T. (1999) [https://www.researchgate.net/profile/Thanh_Binh_To/publication/2446107_A_Multiobjective_Evolutionary_Algorithm_The_Study_Cases/links/53eb422f0cf28f342f45251d.pdf A multiobjective evolutionary algorithm. The study cases.] Technical report. Institute for Automation and Communication. Barleben, Germany</ref> The software developed by Deb can be downloaded,<ref name="Deb_nsga">Deb K. (2011) Software for multi-objective NSGA-II code in C. Available at URL: https://www.iitk.ac.in/kangal/codes.shtml</ref> which implements the NSGA-II procedure with GAs, or the program posted on the Internet,<ref>{{cite web|last=Ortiz|first=Gilberto A.|title=Multi-objective optimization using ES as Evolutionary Algorithm.|url=http://www.mathworks.com/matlabcentral/fileexchange/35824-multi-objective-optimization-using-evolution-strategies-es-as-evolutionary-algorithm-ea|publisher=Mathworks|access-date=1 November 2012}}</ref> which implements the NSGA-II procedure with ES.
 
Only the general form of the equation, a plot of the objective function, the boundaries of the object variables and the coordinates of the global minima are given herein.
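Most of these landscapes take only a few lines of code to implement. As an illustration, the following minimal Python sketch (assuming [[NumPy]] is available; the helper name <code>rastrigin</code> is arbitrary rather than part of any standard library) implements the Rastrigin function from the table below and checks the stated global minimum:

<syntaxhighlight lang="python">
import numpy as np

def rastrigin(x, A=10.0):
    """Rastrigin function for a point x in R^n, with A = 10 as in the table below."""
    x = np.asarray(x, dtype=float)
    return A * x.size + np.sum(x**2 - A * np.cos(2 * np.pi * x))

print(rastrigin(np.zeros(5)))    # global minimum f(0, ..., 0) = 0.0
print(rastrigin([2.0, -1.5]))    # any other point gives a larger value
</syntaxhighlight>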
==Test functions for single-objective optimization==
 
{| class="sortable wikitable" style="text-align:center"
! Name !! Plot !! Formula !! Global minimum !! Search ___domain
|-
| [[Rastrigin function]]
|| [[File:Rastrigin contour plot.svg|200px|Rastrigin function for n=2]]
||<math>f(\mathbf{x}) = A n + \sum_{i=1}^n \left[x_i^2 - A\cos(2 \pi x_i)\right]</math>
<math>\text{where: } A=10</math>
||<math>f(0, \dots, 0) = 0</math>
||<math>-5.12\le x_{i} \le 5.12 </math>
|-
| [[Ackley function]]
|| [[File:Ackley's contour function.svg|200px|Ackley's function for n=2]]
||<math>f(x,y) = -20\exp\left[-0.2\sqrt{0.5\left(x^{2}+y^{2}\right)}\right]</math>
<math>-\exp\left[0.5\left(\cos 2\pi x + \cos 2\pi y \right)\right] + e + 20</math>
||<math>f(0,0) = 0</math>
||<math>-5\le x,y \le 5</math>
|-
| Sphere function
|| [[File:Sphere contour.svg|200px|Sphere function for n=2]]
|| <math>f(\boldsymbol{x}) = \sum_{i=1}^{n} x_{i}^{2}</math>
|| <math>f(x_{1}, \dots, x_{n}) = f(0, \dots, 0) = 0</math>
|| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Rosenbrock function]]
|| [[File:Rosenbrock's contour.svg|200px|Rosenbrock's function for n=2]]
|| <math>f(\boldsymbol{x}) = \sum_{i=1}^{n-1} \left[ 100 \left(x_{i+1} - x_{i}^{2}\right)^{2} + \left(1 - x_{i}\right)^{2}\right]</math>
|| <math>\text{Min} =
\begin{cases}
n=2 & \rightarrow \quad f(1,1) = 0, \\
n>2 & \rightarrow \quad f(\underbrace{1, \dots, 1}_{n \text{ times}}) = 0
\end{cases}
</math>
|| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Beale function]]
|| [[File:Beale's contour.svg|200px|Beale's function]]
|| <math>f(x,y) = \left( 1.5 - x + xy \right)^{2} + \left( 2.25 - x + xy^{2}\right)^{2}</math>
<math>+ \left(2.625 - x+ xy^{3}\right)^{2}</math>
|| <math>f(3, 0.5) = 0</math>
|| <math>-4.5 \le x,y \le 4.5</math>
|-
| [[Goldstein–Price function]]
|| [[File:Goldstein-Price contour.svg|200px|Goldstein–Price function]]
|| <math>f(x,y) = \left[1+\left(x+y+1\right)^{2}\left(19-14x+3x^{2}-14y+6xy+3y^{2}\right)\right]</math>
<math>\left[30+\left(2x-3y\right)^{2}\left(18-32x+12x^{2}+48y-36xy+27y^{2}\right)\right]</math>
|| <math>f(0, -1) = 3</math>
|| <math>-2 \le x,y \le 2</math>
|-
| [[Booth function]]
|| [[File:Booth's contour.svg|200px|Booth's function]]
||<math>f(x,y) = \left( x + 2y -7\right)^{2} + \left(2x +y - 5\right)^{2}</math>
||<math>f(1,3) = 0</math>
||<math>-10 \le x,y \le 10</math>
|-
| Bukin function N.6
|| [[File:Bukin function 6 contour.svg|200px|Bukin function N.6]]
|| <math>f(x,y) = 100\sqrt{\left|y - 0.01x^{2}\right|} + 0.01 \left|x+10 \right|.\quad</math>
|| <math>f(-10,1) = 0</math>
|| <math>-15\le x \le -5</math>, <math>-3\le y \le 3</math>
|-
| [[Matyas function]]
|| [[File:Matyas contour.svg|200px|Matyas function]]
|| <math>f(x,y) = 0.26 \left( x^{2} + y^{2}\right) - 0.48 xy</math>
|| <math>f(0,0) = 0</math>
|| <math>-10\le x,y \le 10</math>
|-
| Lévi function N.13
||[[File:Levi13 contour.svg|200px|Lévi function N.13]]
|| <math>f(x,y) = \sin^{2} 3\pi x + \left(x-1\right)^{2}\left(1+\sin^{2} 3\pi y\right)</math>
<math>+\left(y-1\right)^{2}\left(1+\sin^{2} 2\pi y\right)</math>
|| <math>f(1,1) = 0</math>
|| <math>-10\le x,y \le 10</math>
|-
| [[Griewank function]]
|| [[File:Griewank 2D Contour.svg|200px|Griewank's function]]
|| <math>f(\boldsymbol{x})= 1+ \frac {1}{4000} \sum _{i=1}^n x_i^2 -\prod _{i=1}^n P_i(x_i)</math>, where <math>P_i(x_i)=\cos \left( \frac {x_i}{\sqrt {i}} \right)</math>
|| <math>f(0, \dots, 0) = 0</math>
|| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|-
| [[Himmelblau's function]]
||[[File:Himmelblau contour plot.svg|200px|Himmelblau's function]]
|| <math>f(x, y) = (x^2+y-11)^2 + (x+y^2-7)^2.\quad</math>
|| <math>\text{Min} =
\begin{cases}
f\left(3.0, 2.0\right) & = 0.0 \\
f\left(-2.805118, 3.131312\right) & = 0.0 \\
f\left(-3.779310, -3.283186\right) & = 0.0 \\
f\left(3.584428, -1.848126\right) & = 0.0
\end{cases}
</math>
|| <math>-5\le x,y \le 5</math>
|-
| Three-hump camel function
|| [[File:Three-hump-camel contour.svg|200px|Three Hump Camel function]]
|| <math>f(x,y) = 2x^{2} - 1.05x^{4} + \frac{x^{6}}{6} + xy + y^{2}</math>
|| <math>f(0,0) = 0</math>
|| <math>-5\le x,y \le 5</math>
|-
| [[Easom function]]
|| [[File:Easom contour.svg|200px|Easom function]]
|| <math>f(x,y) = -\cos \left(x\right)\cos \left(y\right) \exp\left(-\left(\left(x-\pi\right)^{2} + \left(y-\pi\right)^{2}\right)\right)</math>
|| <math>f(\pi , \pi) = -1</math>
|| <math>-100\le x,y \le 100</math>
|-
| Cross-in-tray function
|| [[File:Cross-in-tray contour.svg|200px|Cross-in-tray function]]
|| <math>f(x,y) = -0.0001 \left[ \left| \sin x \sin y \exp \left(\left|100 - \frac{\sqrt{x^{2} + y^{2}}}{\pi} \right|\right)\right| + 1 \right]^{0.1}</math>
|| <math>\text{Min} =
\begin{cases}
f\left(1.34941, -1.34941\right) & = -2.06261 \\
f\left(1.34941, 1.34941\right) & = -2.06261 \\
f\left(-1.34941, 1.34941\right) & = -2.06261 \\
f\left(-1.34941, -1.34941\right) & = -2.06261
\end{cases}
</math>
|| <math>-10\le x,y \le 10</math>
|-
| [[Eggholder function]]<ref name="Whitley Rana Dzubera Mathias 1996 pp. 245–276">{{cite journal | last1=Whitley | first1=Darrell | last2=Rana | first2=Soraya | last3=Dzubera | first3=John | last4=Mathias | first4=Keith E. | title=Evaluating evolutionary algorithms | journal=Artificial Intelligence | publisher=Elsevier BV | volume=85 | issue=1–2 | year=1996 | issn=0004-3702 | doi=10.1016/0004-3702(95)00124-7 | pages=264| doi-access=free }}</ref><ref name="vanaret2015hybridation">Vanaret C. (2015) [https://www.researchgate.net/publication/337947149_Hybridization_of_interval_methods_and_evolutionary_algorithms_for_solving_difficult_optimization_problems Hybridization of interval methods and evolutionary algorithms for solving difficult optimization problems.] PhD thesis. Ecole Nationale de l'Aviation Civile. Institut National Polytechnique de Toulouse, France.</ref>
|| [[File:Eggholder contour.svg|200px|Eggholder function]]
|| <math>f(x,y) = - \left(y+47\right) \sin \sqrt{\left|\frac{x}{2}+\left(y+47\right)\right|} - x \sin \sqrt{\left|x - \left(y + 47 \right)\right|}</math>
|| <math>f(512, 404.2319) = -959.6407</math>
|| <math>-512\le x,y \le 512</math>
|-
| [[Hölder table function]]
|| [[File:Hoelder table contour.svg|200px|Holder table function]]
|| <math>f(x,y) = - \left|\sin x \cos y \exp \left(\left|1 - \frac{\sqrt{x^{2} + y^{2}}}{\pi} \right|\right)\right|</math>
|| <math>\text{Min} =
\begin{cases}
f\left(8.05502, 9.66459\right) & = -19.2085 \\
f\left(-8.05502, 9.66459\right) & = -19.2085 \\
f\left(8.05502, -9.66459\right) & = -19.2085 \\
f\left(-8.05502, -9.66459\right) & = -19.2085
\end{cases}
</math>
|| <math>-10\le x,y \le 10</math>
|-
| [[McCormick function]]
|| [[File:McCormick contour.svg|200px|McCormick function]]
|| <math>f(x,y) = \sin \left(x+y\right) + \left(x-y\right)^{2} - 1.5x + 2.5y + 1</math>
|| <math>f(-0.54719,-1.54719) = -1.9133</math>
|| <math>-1.5\le x \le 4</math>, <math>-3\le y \le 4</math>
|-
| Schaffer function N. 2
|| [[File:Schaffer2 contour.svg|200px|Schaffer function N.2]]
|| <math>f(x,y) = 0.5 + \frac{\sin^{2}\left(x^{2} - y^{2}\right) - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right) \right]^{2}}</math>
|| <math>f(0, 0) = 0</math>
|| <math>-100\le x,y \le 100</math>
|-
| Schaffer function N. 4
|| [[File:Schaffer4 contour.svg|200px|Schaffer function N.4]]
|| <math>f(x,y) = 0.5 + \frac{\cos^{2}\left[\sin \left( \left|x^{2} - y^{2}\right|\right)\right] - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right) \right]^{2}}</math>
|| <math>\text{Min} =
\begin{cases}
f\left(0,1.25313\right) & = 0.292579 \\
f\left(0,-1.25313\right) & = 0.292579 \\
f\left(1.25313,0\right) & = 0.292579 \\
f\left(-1.25313,0\right) & = 0.292579
\end{cases}
</math>
|| <math>-100\le x,y \le 100</math>
|-
| [[Styblinski–Tang function]]
|| [[File:Styblinski-Tang contour.svg|200px|Styblinski-Tang function]]
|| <math>f(\boldsymbol{x}) = \frac{1}{2}\sum_{i=1}^{n} \left( x_{i}^{4} - 16x_{i}^{2} + 5x_{i} \right)</math>
|| <math>-39.16617n < f(\underbrace{-2.903534, \ldots, -2.903534}_{n \text{ times}} ) < -39.16616n</math>
|| <math>-5\le x_{i} \le 5</math>, <math>1\le i \le n</math>
|-
| [[Shekel function]]
|| [[Image:Shekel_2D.jpg|200px|A Shekel function in 2 dimensions and with 10 maxima]]
|| <math>f(\boldsymbol{x}) = \sum_{i = 1}^{m} \; \left( c_{i} + \sum\limits_{j = 1}^{n} (x_{j} - a_{ji})^2 \right)^{-1}</math>
||
|| <math>-\infty \le x_{i} \le \infty</math>, <math>1 \le i \le n</math>
|}
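Because the global minima above are known in closed form, the table doubles as a quick sanity check for an optimizer. The following minimal sketch (assuming NumPy and [[SciPy]] are available; the function names are arbitrary) runs a gradient-based local search on two of the tabulated functions so that the results can be compared with the listed minima:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    """n-dimensional Rosenbrock function; global minimum f(1, ..., 1) = 0."""
    x = np.asarray(x, dtype=float)
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

def himmelblau(p):
    """Himmelblau's function; four global minima, all with value 0."""
    x, y = p
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# Local search from arbitrary starting points.
print(minimize(rosenbrock, x0=np.full(5, -1.2)).x)   # expected near (1, 1, 1, 1, 1)
print(minimize(himmelblau, x0=[0.0, 0.0]).x)         # converges to one of the four minima
</syntaxhighlight>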
 
==Test functions for constrained optimization==

{| class="wikitable" style="text-align:center"
|-
! Name !! Plot !! Formula !! Global minimum !! Search ___domain
|-
| Rosenbrock function constrained with a cubic and a line<ref>{{cite conference |author1=Simionescu, P.A. |author2=Beale, D. |title=New Concepts in Graphic Visualization of Objective Functions |conference=ASME 2002 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference |___location=Montreal, Canada |date=September 29 – October 2, 2002|pages=891–897 |url=http://faculty.tamucc.edu/psimionescu/PDFs/DETC02-DAC-34129.pdf |access-date=7 January 2017 }}</ref>
|| [[File:ConstrTestFunc04.png|200px|Rosenbrock function constrained with a cubic and a line]]
|| <math>f(x,y) = (1-x)^2 + 100(y-x^2)^2</math>,
 
subjected to: <math> (x-1)^3 - y + 1 \le 0 \text{ and } x + y - 2 \le 0 </math>
|| <math>f(1.0,1.0) = 0</math>
|| <math>-1.5\le x \le 1.5</math>, <math>-0.5\le y \le 2.5</math>
|-
| Rosenbrock function constrained to a disk<ref>{{Cite web|url=https://www.mathworks.com/help/optim/ug/example-nonlinear-constrained-minimization.html?requestedDomain=www.mathworks.com|title=Solve a Constrained Nonlinear Problem - MATLAB & Simulink|website=www.mathworks.com|access-date=2017-08-29}}</ref>
|| [[File:Rosenbrock circle constraint.svg|200px|Rosenbrock function constrained to a disk]]
|| <math>f(x,y) = (1-x)^2 + 100(y-x^2)^2</math>,
 
subjected to: <math> x^{2} + y^{2} \le 2 </math>
|| <math>f(1.0,1.0) = 0</math>
|| <math>-1.5\le x,y \le 1.5</math>
|-
| Mishra's Bird function - constrained<ref>{{Cite web|url=http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|title=Bird Problem (Constrained) {{!}} Phoenix Integration|access-date=2017-08-29|url-status=bot: unknown|archive-url=https://web.archive.org/web/20161229032528/http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|archive-date=2016-12-29}}</ref><ref>{{Cite journal|last=Mishra|first=Sudhanshu|date=2006|title=Some new test functions for global optimization and performance of repulsive particle swarm method|url=https://mpra.ub.uni-muenchen.de/2718/|journal=MPRA Paper}}</ref>
|| [[File:Mishra bird contour.svg|200px|Bird function (constrained)]]
|| <math>f(x,y) = \sin(y) e^{\left [(1-\cos x)^2\right]} + \cos(x) e^{\left [(1-\sin y)^2 \right]} + (x-y)^2</math>,
subjected to: <math> (x+5)^2 + (y+5)^2 < 25 </math>
|| <math>f(-3.1302468, -1.5821422) = -106.7645367</math>
|| <math>-10\le x \le 0</math>, <math>-6.5\le y \le 0</math>
|-
| Townsend function (modified)<ref>{{Cite web|url=http://www.chebfun.org/examples/opt/ConstrainedOptimization.html|title=Constrained optimization in Chebfun|last=Townsend|first=Alex|date=January 2014|website=chebfun.org|access-date=2017-08-29}}</ref>
|| [[File:Townsend contour.svg|200px|Heart constrained multimodal function]]
|| <math>f(x,y) = -[\cos((x-0.1)y)]^2 - x \sin(3x+y)</math>,
subjected to:<math>x^2+y^2 < \left[2\cos t - \frac 1 2 \cos 2t - \frac 1 4 \cos 3t - \frac 1 8 \cos 4t\right]^2 + [2\sin t]^2 </math>
<math>\text{where: } t = \text{Atan2}(x,y)</math>
|| <math>f(2.0052938, 1.1944509) = -2.0239884</math>
|| <math>-2.25\le x \le 2.25</math>, <math>-2.5\le y \le 1.75</math>
 
|-
| '''Keane's bump function'''{{anchor|Keane's bump function}}<ref>{{cite journal |last1=Mishra |first1=Sudhanshu |title=Minimization of Keane’s Bump Function by the Repulsive Particle Swarm and the Differential Evolution Methods |date=5 May 2007 |url=https://econpapers.repec.org/paper/pramprapa/3098.htm |journal=MPRA Paper|publisher=University Library of Munich, Germany}}</ref>
|| [[File:Estimation of Distribution Algorithm animation.gif|200px|Keane's bump function]]
|| <math>f(\boldsymbol{x}) = -\left| \frac{\left[ \sum_{i=1}^m \cos^4(x_i) - 2 \prod_{i=1}^m \cos^2(x_i)\right]}{{\left( \sum_{i=1}^m ix^2_i\right)}^{0.5}} \right| </math>,
subjected to: <math> 0.75 - \prod_{i=1}^m x_i < 0 </math>, and <math> \sum_{i=1}^m x_i - 7.5m < 0 </math>
|| <math>f(1.60025376, 0.468675907) = -0.364979746</math>
|| <math>0 < x_i < 10</math>
|-
| Gomez and Levy function (modified)<ref>{{cite journal |last1=Simionescu |first1=P.A. |date=2020 |title=A collection of bivariate nonlinear optimisation test problems with graphical representations |journal=International Journal of Mathematical Modelling and Numerical Optimisation |volume=10 |issue=4 |pages=365–398 |doi=10.1504/IJMMNO.2020.110704}}</ref>
|| [[File:Gomez and Levy Function 1982.png|200px|Gomez and Levy function]]
|| <math>f(x,y) = 4x^{2} - 2.1x^{4} + \frac{1}{3}x^{6} + xy - 4y^{2} + 4y^{4}</math>,
subjected to: <math> -\sin(4 \pi x) + 2\sin^{2}(2 \pi y) \le 1.5 </math>
|| <math>f(0.08984201, -0.7126564) = -1.0316284535</math>
|| <math>-1\le x \le 0.75</math>, <math>-1\le y \le 1</math>
 
|-
| [[Simionescu function]]<ref>{{cite book|last=Simionescu|first=P.A.|title=Computer Aided Graphing and Simulation Tools for AutoCAD Users|year=2014|publisher=CRC Press|___location=Boca Raton, FL|isbn=978-1-4822-5290-3|edition=1st}}</ref>
|| [[File:Simionescu's function.svg|200px|Simionescu function]]
|| <math>f(x,y) = 0.1xy</math>,
subjected to: <math> x^2+y^2\le\left[r_{T}+r_{S}\cos\left(n \arctan \frac{x}{y} \right)\right]^2</math>
<math>\text{where: } r_{T}=1, r_{S}=0.2 \text{ and } n = 8</math>
|| <math>f(\pm 0.84852813,\mp 0.84852813) = -0.072</math>
|| <math>-1.25\le x,y \le 1.25</math>
|}
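The constraints in the table above can be passed directly to a constrained solver. The following minimal sketch (assuming SciPy's SLSQP method is available; the variable names are arbitrary) handles the Rosenbrock function constrained to a disk; SLSQP expects inequality constraints rewritten in the form <math>g(x,y) \ge 0</math>:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

def rosenbrock2d(p):
    x, y = p
    return (1 - x)**2 + 100 * (y - x**2)**2

# "x^2 + y^2 <= 2" rewritten as "2 - x^2 - y^2 >= 0" for SLSQP.
disk = {"type": "ineq", "fun": lambda p: 2.0 - p[0]**2 - p[1]**2}

result = minimize(rosenbrock2d, x0=[0.0, 0.0], method="SLSQP",
                  bounds=[(-1.5, 1.5), (-1.5, 1.5)], constraints=[disk])
print(result.x, result.fun)   # expected near (1, 1) with value 0
</syntaxhighlight>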
 
==Test functions for multi-objective optimization==

{| class="wikitable" style="text-align:center"
! Name !! Plot !! Functions !! Constraints !! Search ___domain
|-
| Binh and Korn function:<ref name="Binh97"/>
||
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 4x^{2} + 4y^{2} \\
f_{2}\left(x,y\right) = \left(x-5\right)^{2} + \left(y-5\right)^{2}
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = \left(x-5\right)^{2} + y^{2} \leq 25 \\
g_{2}\left(x,y\right) = \left(x-8\right)^{2} + \left(y+3\right)^{2} \geq 7.7
\end{cases}
</math>
|| <math>0\le x \le 5</math>, <math>0\le y \le 3</math>
|-
| [[Chankong and Haimes function]]:<ref>{{cite book |last1=Chankong |first1=Vira |last2=Haimes |first2=Yacov Y. |title=Multiobjective decision making. Theory and methodology. |isbn=0-444-00710-5|year=1983 |publisher=North Holland }}</ref>
|| [[File:Chakong and Haimes function.pdf|200px|Chakong and Haimes function]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 2 + \left(x-2\right)^{2} + \left(y-1\right)^{2} \\
f_{2}\left(x,y\right) = 9x - \left(y-1\right)^{2}
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = x^{2} + y^{2} \leq 225 \\
g_{2}\left(x,y\right) = x - 3y + 10 \leq 0
\end{cases}
</math>
|| <math>-20\le x,y \le 20</math>
|-
| Poloni's two objective function:
||
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = \left[1 + \left(A_{1} - B_{1}\left(x,y\right)\right)^{2} + \left(A_{2} - B_{2}\left(x,y\right)\right)^{2}\right] \\
f_{2}\left(x,y\right) = \left(x+3\right)^{2} + \left(y+1\right)^{2}
\end{cases}
</math>
<math>\text{where} =
\begin{cases}
A_{1} = 0.5\sin 1 - 2\cos 1 + \sin 2 - 1.5\cos 2 \\
A_{2} = 1.5\sin 1 - \cos 1 + 2\sin 2 - 0.5\cos 2 \\
B_{1}\left(x,y\right) = 0.5\sin x - 2\cos x + \sin y - 1.5\cos y \\
B_{2}\left(x,y\right) = 1.5\sin x - \cos x + 2\sin y - 0.5\cos y
\end{cases}
</math>
||
||<math>-\pi\le x,y \le \pi</math>
|-
| Zitzler–Deb–Thiele's function N. 1:<ref name="Debetal2002testpr">{{cite book |last1=Deb |first1=Kalyan |last2=Thiele |first2=L. |last3=Laumanns |first3=Marco |last4=Zitzler |first4=Eckart |chapter=Scalable multi-objective optimization test problems |title=Proceedings of the 2002 IEEE Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |date=2002 |volume=1 |pages=825–830 |doi=10.1109/CEC.2002.1007032 |isbn=0-7803-7282-4 |s2cid=61001583 }}</ref>
|| [[File:Zitzler-Deb-Thiele's function 1.pdf|200px|Zitzler-Deb-Thiele's function N.1]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = x_{1} \\
f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h\left(f_{1}\left(\boldsymbol{x}\right), g\left(\boldsymbol{x}\right)\right)
\end{cases}
</math>
<math>\text{where} =
\begin{cases}
g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\
h\left(f_{1}\left(\boldsymbol{x}\right), g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{f_{1}\left(\boldsymbol{x}\right)/g\left(\boldsymbol{x}\right)}
\end{cases}
</math>
||
|| <math>0\le x_{i} \le 1</math>, <math>1\le i \le 30</math>
|-
| Osyczka and Kundu function:
||
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(\boldsymbol{x}\right) = -25\left(x_{1}-2\right)^{2} - \left(x_{2}-2\right)^{2} - \left(x_{3}-1\right)^{2} - \left(x_{4}-4\right)^{2} - \left(x_{5}-1\right)^{2} \\
f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{6} x_{i}^{2}
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(\boldsymbol{x}\right) = x_{1} + x_{2} - 2 \geq 0 \\
g_{2}\left(\boldsymbol{x}\right) = 6 - x_{1} - x_{2} \geq 0 \\
g_{3}\left(\boldsymbol{x}\right) = 2 - x_{2} + x_{1} \geq 0 \\
g_{4}\left(\boldsymbol{x}\right) = 2 - x_{1} + 3x_{2} \geq 0 \\
g_{5}\left(\boldsymbol{x}\right) = 4 - \left(x_{3}-3\right)^{2} - x_{4} \geq 0 \\
g_{6}\left(\boldsymbol{x}\right) = \left(x_{5}-3\right)^{2} + x_{6} - 4 \geq 0
\end{cases}
</math>
|| <math>0\le x_{1},x_{2},x_{6} \le 10</math>, <math>1\le x_{3},x_{5} \le 5</math>, <math>0\le x_{4} \le 6</math>.
|-
| CTP1 function (2 variables):<ref name="Deb:2002"/><ref name="Jimenezetal2002">{{cite book |last1=Jimenez |first1=F. |last2=Gomez-Skarmeta |first2=A. F. |last3=Sanchez |first3=G. |last4=Deb |first4=K. |chapter=An evolutionary algorithm for constrained multi-objective optimization |title=Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |date=May 2002 |volume=2 |pages=1133–1138 |doi=10.1109/CEC.2002.1004402 |isbn=0-7803-7282-4 |s2cid=56563996 }}</ref>
|| [[File:CTP1 function (2 variables).pdf|200px|CTP1 function (2 variables).<ref name="Deb:2002" />]]
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = x \\
f_{2}\left(x,y\right) = \left(1+y\right) \exp\left(\frac{-x}{1+y}\right)
\end{cases}
</math>
|| <math>\text{s.t.} =
\begin{cases}
g_{1}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.858 \exp\left(-0.541 f_{1}\left(x,y\right)\right)} \geq 1 \\
g_{2}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.728 \exp\left(-0.295 f_{1}\left(x,y\right)\right)} \geq 1
\end{cases}
</math>
|| <math>0\le x,y \le 1</math>
|-
| Viennet function
||
|| <math>\text{Minimize} =
\begin{cases}
f_{1}\left(x,y\right) = 0.5\left(x^{2} + y^{2}\right) + \sin\left(x^{2} + y^{2}\right) \\
f_{2}\left(x,y\right) = \frac{\left(3x - 2y + 4\right)^{2}}{8} + \frac{\left(x - y + 1\right)^{2}}{27} + 15 \\
f_{3}\left(x,y\right) = \frac{1}{x^{2} + y^{2} + 1} - 1.1 \exp\left(-\left(x^{2} + y^{2}\right)\right)
\end{cases}
</math>
||
||<math>-3\le x,y \le 3</math>.
|}
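For multi-objective problems there is no single optimal point, so implementations are usually compared against an approximation of the [[Pareto front]]. The following minimal Python sketch (assuming NumPy; the helper names are arbitrary) approximates the Pareto front of the Binh and Korn problem by sampling the feasible region and keeping only the non-dominated objective vectors:

<syntaxhighlight lang="python">
import numpy as np

def binh_korn(x, y):
    """Objectives of the Binh and Korn problem."""
    return 4 * x**2 + 4 * y**2, (x - 5)**2 + (y - 5)**2

def feasible(x, y):
    return (x - 5)**2 + y**2 <= 25 and (x - 8)**2 + (y + 3)**2 >= 7.7

# Crude Pareto-front approximation: sample the search ___domain, discard
# infeasible points, then keep only the non-dominated objective vectors.
rng = np.random.default_rng(0)
samples = zip(rng.uniform(0, 5, 5000), rng.uniform(0, 3, 5000))
objs = np.array([binh_korn(x, y) for x, y in samples if feasible(x, y)])
front = np.array([p for p in objs
                  if not np.any(np.all(objs <= p, axis=1) & np.any(objs < p, axis=1))])
print(len(front), "non-dominated points approximating the Pareto front")
</syntaxhighlight>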
 
== See also ==
{{commons category|Test functions (mathematical optimization)}}
* [[Ackley function]]
* [[Himmelblau's function]]
* [[Rastrigin function]]
* [[Rosenbrock function]]
* [[Shekel function]]
 
==References==
<references/>
 
== External links ==
* [https://github.com/nathanrooy/landscapes landscapes]
 
{{DEFAULTSORT:Test functions for optimization}}
[[Category:Constraint programming]]
[[Category:Convex optimization]]
[[Category:Test functions for optimization| ]]
[[Category:Test items]]