The earliest idea of Bayesian optimization<ref>{{Cite book |last=Garnett |first=Roman |title=Bayesian Optimization |date=2023 |publisher=Cambridge University Press |isbn=978-1-108-42578-0}}</ref> dates back to 1964, to a paper by the American applied mathematician Harold J. Kushner,<ref>{{Cite web |url=https://vivo.brown.edu/display/hkushner |title=Kushner, Harold |website=vivo.brown.edu}}</ref> [https://asmedigitalcollection.asme.org/fluidsengineering/article/86/1/97/392213/A-New-Method-of-Locating-the-Maximum-Point-of-an "A New Method of Locating the Maximum Point of an Arbitrary Multipeak Curve in the Presence of Noise"]. Although the paper did not propose Bayesian optimization directly, it introduced a method for locating the maximum of an arbitrary multipeak curve in a noisy environment, and it provided an important theoretical foundation for later work on Bayesian optimization.
 
By the 1980s, the framework now used for Bayesian optimization had been explicitly established. In 1978, the Lithuanian scientist Jonas Mockus, born in 1931 in the then-independent Lithuanian state,<ref>{{Cite web |url=https://lt.wikipedia.org/wiki/Jonas_Mockus |title=Jonas Mockus |website=lt.wikipedia.org |language=lt}}</ref><ref>{{Cite web |title=Jonas Mockus |url=https://en.ktu.edu/people/jonas-mockus/ |access-date=2025-03-06 |website=Kaunas University of Technology |language=en}}</ref> in his paper "The Application of Bayesian Methods for Seeking the Extremum", discussed how Bayesian methods can be used to find the extremum of a function under various conditions of uncertainty. In this paper, Mockus first proposed the [https://schneppat.com/expected-improvement_ei.html expected improvement (EI)] principle, one of the core sampling strategies of Bayesian optimization. The criterion balances exploration and exploitation by choosing the next sample point where the expected improvement over the best value observed so far is largest. Because of the usefulness and lasting influence of this principle, Jonas Mockus is widely regarded as the founder of Bayesian optimization. Although expected improvement is one of the earliest acquisition strategies proposed for Bayesian optimization, it is not the only one; later alternatives include probability of improvement (PI) and the upper confidence bound (UCB).<ref>{{Cite journal |last=Kaufmann |first=Emilie |last2=Cappe |first2=Olivier |last3=Garivier |first3=Aurelien |date=2012-03-21 |title=On Bayesian Upper Confidence Bounds for Bandit Problems |url=https://proceedings.mlr.press/v22/kaufmann12.html |journal=Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics |language=en |publisher=PMLR |pages=592–600}}</ref>
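
In one common modern formulation (assuming a Gaussian process surrogate with posterior mean <math>\mu(x)</math> and posterior standard deviation <math>\sigma(x)</math>, and writing <math>f^{+}</math> for the best value observed so far; this notation is chosen here for illustration rather than taken from the original papers), the expected improvement at a candidate point <math>x</math> is
<math display="block">\operatorname{EI}(x) = \operatorname{E}\!\left[\max\!\left(f(x) - f^{+},\, 0\right)\right] = \left(\mu(x) - f^{+}\right)\Phi(Z) + \sigma(x)\,\varphi(Z), \qquad Z = \frac{\mu(x) - f^{+}}{\sigma(x)},</math>
where <math>\Phi</math> and <math>\varphi</math> denote the standard normal cumulative distribution function and probability density function, and the expected improvement is taken to be zero when <math>\sigma(x) = 0</math>. The next evaluation point is chosen by maximizing <math>\operatorname{EI}(x)</math>. By comparison, probability of improvement maximizes <math>\Pr\!\left(f(x) > f^{+}\right) = \Phi(Z)</math>, and the upper confidence bound selects the point maximizing <math>\mu(x) + \beta\,\sigma(x)</math> for some exploration parameter <math>\beta \ge 0</math>.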
 
==== From Theory to Practice ====