There are several methods used to define the prior/posterior distribution over the objective function. The most common approach uses [[Gaussian process]]es, in a method known as [[kriging]]. Another, less expensive approach uses the Tree-structured Parzen Estimator to construct two distributions, one for 'high' points and one for 'low' points, and then finds the ___location that maximizes the expected improvement.<ref>J. S. Bergstra, R. Bardenet, Y. Bengio, B. Kégl: [http://papers.nips.cc/paper/4443-algorithms-for-hyper-parameter-optimization.pdf Algorithms for Hyper-Parameter Optimization]. Advances in Neural Information Processing Systems: 2546–2554 (2011)</ref>
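
For illustration, the following is a minimal sketch of the Tree-structured Parzen Estimator idea, assuming a hypothetical one-dimensional objective; the function <code>objective</code>, the quantile <code>gamma</code>, and the search interval are illustrative choices, not part of any particular library. The sketch splits past observations at a quantile threshold, fits a kernel density estimate to each group, and proposes the candidate that maximizes the density ratio <math>l(x)/g(x)</math>, which under the TPE formulation corresponds to maximizing the expected improvement.

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical 1-D objective to minimize (illustrative only).
    return (x - 2.0) ** 2 + 0.1 * np.sin(5 * x)

# Start with a handful of random evaluations.
X = rng.uniform(-5.0, 5.0, size=20)
y = objective(X)

for _ in range(30):
    # Split observations into 'low' (good) and 'high' (bad) points
    # at the gamma-quantile of the observed objective values.
    gamma = 0.25
    tau = np.quantile(y, gamma)
    good, bad = X[y <= tau], X[y > tau]

    # Fit a kernel density estimate to each group:
    # l(x) models the good points, g(x) the bad points.
    l, g = gaussian_kde(good), gaussian_kde(bad)

    # Sample candidates from l and pick the one maximizing l(x)/g(x).
    cand = np.clip(l.resample(256).ravel(), -5.0, 5.0)
    x_next = cand[np.argmax(l(cand) / np.maximum(g(cand), 1e-12))]

    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print("best x:", X[np.argmin(y)], "best f:", y.min())
</syntaxhighlight>

Sampling candidates from <math>l(x)</math> concentrates the search in regions that have produced good values so far, while dividing by <math>g(x)</math> penalizes regions that are also well explained by the bad points.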
 
Standard Bayesian optimization relies upon each <math>x \in A</math> being easy to evaluate, and problems that deviate from this assumption are known as ''exotic Bayesian optimization'' problems. Optimization problems become exotic if the evaluations are noisy, if evaluations are performed in parallel, if the quality of an evaluation depends on a tradeoff between its difficulty and its accuracy, if random environmental conditions are present, or if the evaluation involves derivatives.<ref name=":0" />
 
==Acquisition functions==