Evolutionary multimodal optimization

{{Evolutionary algorithms}}
In [[applied mathematics]], '''multimodal optimization''' deals with [[Mathematical optimization|optimization]] tasks that involve finding all or most of the multiple (at least locally optimal) solutions of a problem, as opposed to a single best solution. Evolutionary multimodal optimization is a branch of [[evolutionary computation]], which is closely related to [[machine learning]]. Wong provides a short survey,<ref>Wong, K. C. (2015), [https://arxiv.org/abs/1508.00457 Evolutionary Multimodal Optimization: A Short Survey] arXiv preprint arXiv:1508.00457</ref> while the chapter by Shir<ref>Shir, O.M. (2012), [https://link.springer.com/book/10.1007/978-3-540-92910-9 Niching in Evolutionary Algorithms] {{Webarchive|url=https://web.archive.org/web/20160304110426/http://link.springer.com/book/10.1007/978-3-540-92910-9 |date=2016-03-04 }}</ref> and the book by Preuss<ref>Preuss, Mike (2015), [https://www.springer.com/de/book/9783319074061 Multimodal Optimization by Means of Evolutionary Algorithms]</ref> cover the topic in more detail.
 
== Motivation ==
Knowledge of multiple solutions to an optimization task is especially helpful in engineering, when, due to physical (and/or cost) constraints, the best results may not always be realizable. In such a scenario, if multiple solutions (locally and/or globally optimal) are known, the implementation can be quickly switched to another solution while still obtaining the best possible system performance. Multiple solutions can also be analyzed to discover hidden properties (or relationships) of the underlying optimization problem, which makes them important for obtaining [[___domain knowledge]]. In addition, algorithms for multimodal optimization usually not only locate multiple optima in a single run but also preserve population diversity, which gives them global optimization ability on multimodal functions. Moreover, techniques for multimodal optimization are often borrowed as diversity-maintenance techniques for other problems.<ref>Wong, K. C. et al. (2012), [https://dx.doi.org/10.1016/j.ins.2011.12.016 Evolutionary multimodal optimization using the principle of locality] Information Sciences</ref><ref>{{Cite journal |last=Jiang |first=Yi |last2=Zhan |first2=Zhi-Hui |last3=Tan |first3=Kay Chen |last4=Zhang |first4=Jun |date=April 2023 |title=Optimizing Niche Center for Multimodal Optimization Problems |url=https://ieeexplore.ieee.org/document/9655447/ |journal=IEEE Transactions on Cybernetics |volume=53 |issue=4 |pages=2544–2557 |doi=10.1109/TCYB.2021.3125362 |issn=2168-2267}}</ref>
 
== Background ==
 
Classical techniques of optimization would need multiple restart points and multiple runs in the hope that a different solution may be discovered every run, with no guarantee, however. [[Evolutionary algorithm]]s (EAs), due to their population-based approach, provide a natural advantage over classical optimization techniques. They maintain a population of candidate solutions, which are processed every generation; if multiple solutions can be preserved over all these generations, then at termination of the algorithm we will have multiple good solutions, rather than only the best one. This runs against the natural tendency of classical optimization techniques, which always converge to the best solution, or to a sub-optimal solution (in a rugged, “badly behaving” function). The challenge of using EAs for multimodal optimization lies in finding ''and'' maintaining multiple solutions. ''Niching''<ref>Mahfoud, S. W. (1995), "[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.30.8270&rep=rep1&type=pdf Niching methods for genetic algorithms]"</ref> is a generic term for techniques that find and preserve multiple stable ''niches'', or favorable parts of the solution space possibly around multiple solutions, so as to prevent convergence to a single solution.
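As a concrete illustration of niching, the sketch below implements Goldberg-style fitness sharing, one of the oldest niching techniques: each individual's raw fitness is divided by a niche count that grows with the number of neighbors inside a sharing radius, so crowded peaks become less attractive and selection pressure spreads the population across several optima. The bimodal test function, the sharing radius, and all parameter values here are illustrative choices, not taken from the cited literature.

```python
import math
import random

def fitness(x):
    # Illustrative bimodal objective (maximize): equal-height peaks near x = 1 and x = 4.
    return math.exp(-(x - 1) ** 2) + math.exp(-(x - 4) ** 2)

def shared_fitness(pop, sigma_share=1.0, alpha=1.0):
    """Goldberg-style sharing: divide raw fitness by the niche count,
    i.e. the summed similarity to all individuals within sigma_share."""
    shared = []
    for xi in pop:
        niche_count = sum(
            1.0 - (abs(xi - xj) / sigma_share) ** alpha
            for xj in pop
            if abs(xi - xj) < sigma_share  # self contributes 1.0, so no div-by-zero
        )
        shared.append(fitness(xi) / niche_count)
    return shared

def evolve(generations=200, pop_size=60, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 5.0) for _ in range(pop_size)]
    for _ in range(generations):
        shared = shared_fitness(pop)
        new_pop = []
        for _ in range(pop_size):
            # Binary tournament on *shared* fitness, then Gaussian mutation.
            a, b = rng.randrange(pop_size), rng.randrange(pop_size)
            parent = pop[a] if shared[a] > shared[b] else pop[b]
            new_pop.append(min(5.0, max(0.0, parent + rng.gauss(0.0, 0.1))))
        pop = new_pop
    return pop

pop = evolve()
near_first = sum(1 for x in pop if abs(x - 1) < 0.5)
near_second = sum(1 for x in pop if abs(x - 4) < 0.5)
```

Without the sharing step, tournament selection on raw fitness would let genetic drift collapse the whole population onto one of the two equally good peaks; with sharing, both basins keep a subpopulation.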
 
The field of [[evolutionary algorithm]]s encompasses [[genetic algorithm]]s (GAs), [[evolution strategy|evolution strategies]] (ES), [[differential evolution]] (DE), [[particle swarm optimization]] (PSO), and other methods. Attempts have been made to solve multimodal optimization in all these realms, and most, if not all, of the various methods implement niching in some form or other.
 
The application of multimodal optimization within ES was not explicit for many years, and has been explored only recently.
A niching framework utilizing derandomized ES was introduced by Shir,<ref>Shir, O.M. (2008), "[https://openaccess.leidenuniv.nl/handle/1887/12981 Niching in Derandomized Evolution Strategies and its Applications in Quantum Control]"</ref> proposing the [[CMA-ES]] as a niching optimizer for the first time. The underpinning of that framework was the selection of a peak individual per subpopulation in each generation, followed by sampling around it to produce the consecutive dispersion of search points. The ''biological analogy'' of this machinery is an ''alpha male'' winning all the imposed competitions and thereafter dominating its ''ecological niche'', thereby obtaining all the sexual resources therein to generate its offspring.
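The peak-per-subpopulation idea can be caricatured as running one comma-selection evolution strategy per niche: each generation, only the best ("alpha") offspring of each subpopulation survives, and the next generation is sampled around it. The sketch below is a deliberately stripped-down (1,λ)-ES with a fixed step size, not the derandomized/CMA-ES machinery of the cited framework; the test function, starting points, and parameters are all illustrative.

```python
import math
import random

def f(x):
    # Illustrative bimodal objective (maximize): peaks near x = 1 and x = 4.
    return math.exp(-(x - 1) ** 2) + math.exp(-(x - 4) ** 2)

def niche_es(centers, generations=100, lam=10, sigma=0.2, seed=1):
    """One (1, lambda)-ES per niche: the peak ('alpha') individual of each
    subpopulation is selected every generation and re-sampled around."""
    rng = random.Random(seed)
    peaks = list(centers)
    for _ in range(generations):
        for i, peak in enumerate(peaks):
            # Comma selection: the parent is always replaced by its best offspring.
            offspring = [peak + rng.gauss(0.0, sigma) for _ in range(lam)]
            peaks[i] = max(offspring, key=f)
    return peaks

# Two niches seeded at opposite ends of the interval; each climbs its own peak.
peaks = niche_es([0.0, 5.0])
```

Because each subpopulation samples only locally around its own peak, the two niches converge to different optima instead of collapsing onto one.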
 
Recently, an evolutionary [[multiobjective optimization]] (EMO) approach was proposed,<ref>Deb, K., Saha, A. (2010) "[https://dl.acm.org/doi/pdf/10.1145/1830483.1830568 Finding Multiple Solutions for Multimodal Optimization Problems Using a Multi-Objective Evolutionary Approach]" (GECCO 2010, in press)</ref> in which a suitable second objective is added to the originally single-objective multimodal optimization problem, so that the multiple solutions form a ''weak Pareto-optimal'' front; the multimodal optimization problem can then be solved for its multiple solutions using an EMO algorithm. Improving upon this work,<ref>Saha, A., Deb, K. (2010) "A Bi-criterion Approach to Multimodal Optimization: Self-adaptive Approach" (Lecture Notes in Computer Science, 2010, Volume 6457/2010, 95–104)</ref> the same authors made their algorithm self-adaptive, eliminating the need to pre-specify the parameters.
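One way such a second objective can be constructed (used here purely for illustration; the cited papers develop and refine the actual construction) is the gradient magnitude: every local optimum has a zero gradient, so if both the negated fitness and |f′(x)| are minimized, all optima land on the weak Pareto-optimal front. A toy sketch with an equal-height bimodal function and a handful of hand-picked sample points:

```python
import math

def f(x):
    # Equal-height bimodal objective (maximize) with optima at x = 1 and x = 4.
    return math.exp(-10 * (x - 1) ** 2) + math.exp(-10 * (x - 4) ** 2)

def grad(x):
    # Analytic derivative of f; zero exactly at the stationary points.
    return (-20 * (x - 1) * math.exp(-10 * (x - 1) ** 2)
            - 20 * (x - 4) * math.exp(-10 * (x - 4) ** 2))

def objectives(x):
    # Bi-objective reformulation: minimize both the negated fitness and the
    # gradient magnitude, so every local optimum becomes weakly Pareto-optimal.
    return (-f(x), abs(grad(x)))

def weakly_nondominated(xs):
    """Keep points that no other point beats strictly in BOTH objectives."""
    pts = {x: objectives(x) for x in xs}
    return [x for x, p in pts.items()
            if not any(q[0] < p[0] and q[1] < p[1] for q in pts.values())]

samples = [0.5, 1.0, 2.0, 3.0, 4.0, 4.5]
front = weakly_nondominated(samples)  # only the two optima survive
```

Note that any stationary point, including a local minimum, also has |f′| = 0 and would sit on this naive weak front; handling such cases (and non-differentiable objectives) is part of what the cited approaches address.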