{{Short description|Monte Carlo method for importance sampling and optimization}}
The '''cross-entropy''' ('''CE''') '''method''' is a [[Monte Carlo method|Monte Carlo]] method for [[importance sampling]] and [[Optimization (mathematics)|optimization]]. It is applicable to both [[Combinatorial optimization|combinatorial]] and [[Continuous optimization|continuous]] problems, with either a static or noisy objective.
The method approximates the optimal importance sampling estimator by repeating two phases:<ref>Rubinstein, R.Y. and Kroese, D.P. (2004), The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte-Carlo Simulation, and Machine Learning, Springer-Verlag, New York {{ISBN|978-0-387-21240-1}}.</ref>
#Draw a sample from a [[probability distribution]].
#Minimize the ''[[cross-entropy]]'' between this distribution and a target distribution to produce a better sample in the next iteration.
[[Reuven Rubinstein]] developed the method in the context of ''rare-event simulation'', where tiny probabilities must be estimated accurately, and later adapted it to optimization problems.
==Estimation via importance sampling==
# Choose initial parameter vector <math>\mathbf{v}^{(0)}</math>; set t = 1.
# Generate a random sample <math>\mathbf{X}_1,\dots,\mathbf{X}_N</math> from <math>f(\cdot;\mathbf{v}^{(t-1)})</math>
# Solve for <math>\mathbf{v}^{(t)}</math>, where<br><math>\mathbf{v}^{(t)} = \mathop{\textrm{argmax}}_{\mathbf{v}} \frac{1}{N} \sum_{i=1}^N H(\mathbf{X}_i) \frac{f(\mathbf{X}_i;\mathbf{u})}{f(\mathbf{X}_i;\mathbf{v}^{(t-1)})} \log f(\mathbf{X}_i;\mathbf{v})</math>
# If convergence is reached then '''stop'''; otherwise, increase t by 1 and reiterate from step 2.
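As an illustration (not part of the original description), take the concrete choice <math>H(x) = \mathrm{I}_{\{x \geq \gamma\}}</math> with the unit-variance Gaussian family <math>f(\cdot; v) = N(v, 1)</math>: the argmax in step 3 then has a closed-form solution, a likelihood-ratio-weighted sample mean. A Python sketch under these assumptions (all names are illustrative):

```python
import numpy as np

def ce_estimate(gamma=3.0, u=0.0, N=10_000, iters=10, seed=0):
    """Sketch of the CE iteration for estimating l = P(X >= gamma), X ~ N(u, 1).

    Assumptions (not from the article): H(x) = 1{x >= gamma} and the
    sampling family f(.; v) = N(v, 1).  For this family the argmax in
    step 3 reduces to a likelihood-ratio-weighted mean of the sample.
    """
    rng = np.random.default_rng(seed)
    v = u  # v^(0): start from the nominal parameter u
    for _ in range(iters):
        X = rng.normal(v, 1.0, N)               # step 2: sample from f(.; v^(t-1))
        H = (X >= gamma).astype(float)          # H(X_i)
        # likelihood ratios f(X_i; u) / f(X_i; v^(t-1)) for unit-variance Gaussians
        W = np.exp(-0.5 * ((X - u) ** 2 - (X - v) ** 2))
        if (H * W).sum() == 0.0:                # no hits yet: keep the current v
            continue
        v = (H * W * X).sum() / (H * W).sum()   # step 3: closed-form argmax
    # final importance-sampling estimate of l under the tilted density f(.; v)
    X = rng.normal(v, 1.0, N)
    W = np.exp(-0.5 * ((X - u) ** 2 - (X - v) ** 2))
    return ((X >= gamma) * W).mean(), v
```

With <math>\gamma = 3</math> the target probability is about <math>1.3 \times 10^{-3}</math>; the iteration shifts <math>v</math> toward the region <math>\{x \geq \gamma\}</math>, so the final weighted estimator should have far lower variance than crude Monte Carlo with the same <math>N</math>.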
==Continuous optimization—example==
The same CE algorithm can be used for optimization, rather than estimation.
Suppose the problem is to maximize some function <math>S(x)</math>, for example,
<math>S(x) = \textrm{e}^{-(x-2)^2} + 0.8\,\textrm{e}^{-(x+2)^2}</math>.
To apply CE, one considers first the ''associated stochastic problem'' of estimating <math>\mathbb{P}_{\boldsymbol\theta}(S(X)\geq\gamma)</math> for a given level <math>\gamma</math> and a parametric family <math>\left\{f(\cdot;\boldsymbol\theta)\right\}</math>, for example the one-dimensional Gaussian distribution parameterized by its mean <math>\mu</math> and variance <math>\sigma^2</math> (so <math>\boldsymbol\theta = (\mu,\sigma^2)</math> here). For a given <math>\gamma</math>, the goal is then to find <math>\boldsymbol\theta</math> minimizing the Kullback–Leibler divergence between the conditional density of <math>X</math> given <math>S(X)\geq\gamma</math> and <math>f(\cdot;\boldsymbol\theta)</math>.
This yields the following randomized algorithm that happens to coincide with the so-called Estimation of Multivariate Normal Algorithm (EMNA), an [[estimation of distribution algorithm]].
===Pseudocode===
 ''// Initialize parameters''
 μ := −6
 σ2 := 100
 t := 0
 maxits := 100
 N := 100
 Ne := 10
 ''// While maxits not exceeded and not converged''
 '''while''' t < maxits '''and''' σ2 > ε '''do'''
     ''// Obtain N samples from current sampling distribution''
     X := SampleGaussian(μ, σ2, N)
     ''// Evaluate objective function at sampled points''
     S := exp(−(X − 2)^2) + 0.8 exp(−(X + 2)^2)
     ''// Sort X by objective function values in descending order''
     X := sort(X, S)
     ''// Update parameters of sampling distribution via elite samples''
     μ := mean(X(1:Ne))
     σ2 := var(X(1:Ne))
     t := t + 1
 ''// Return mean of final sampling distribution as solution''
 '''return''' μ
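A direct Python/NumPy translation of the pseudocode above (a sketch; the iteration cap is set higher than the pseudocode's 100 only as a safety margin, since the loop normally exits much earlier once <math>\sigma^2</math> falls below <math>\epsilon</math>):

```python
import numpy as np

def cross_entropy_maximize(mu=-6.0, sigma2=100.0, N=100, Ne=10,
                           maxits=1000, eps=1e-8, seed=0):
    """CE maximization of S(x) = exp(-(x-2)^2) + 0.8 exp(-(x+2)^2),
    following the pseudocode (an EMNA-style mean/variance update)."""
    rng = np.random.default_rng(seed)
    t = 0
    while t < maxits and sigma2 > eps:
        # Obtain N samples from the current sampling distribution
        X = rng.normal(mu, np.sqrt(sigma2), N)
        # Evaluate the objective function at the sampled points
        S = np.exp(-(X - 2.0) ** 2) + 0.8 * np.exp(-(X + 2.0) ** 2)
        # Keep the Ne samples with the highest objective values (the elite set)
        elite = X[np.argsort(S)[::-1][:Ne]]
        # Update the parameters of the sampling distribution from the elite set
        mu, sigma2 = elite.mean(), elite.var()
        t += 1
    # Return the mean of the final sampling distribution as the solution
    return mu
```

Depending on the random seed, a run settles on either the global maximum near <math>x = 2</math> or the local maximum near <math>x = -2</math>; smoothing the parameter updates, as is common in CE and EMNA practice, reduces the risk of premature convergence.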
==Related methods==
* [[Simulated annealing]]
* [[Genetic algorithms]]
* [[Harmony search]]
* [[Estimation of distribution algorithm]]
* [[Tabu search]]
* [[Natural Evolution Strategy]]
* [[Ant colony optimization algorithms]]
==See also==
* [[Cross entropy]]
* [[Kullback–Leibler divergence]]
* [[Randomized algorithm]]
* [[Importance sampling]]
==Journal articles==
* De Boer, P.-T., Kroese, D.P., Mannor, S. and Rubinstein, R.Y. (2005). A Tutorial on the Cross-Entropy Method. ''Annals of Operations Research'', '''134''' (1), 19–67.[http://www.maths.uq.edu.au/~kroese/ps/aortut.pdf]
* Rubinstein, R.Y. (1997). Optimization of Computer Simulation Models with Rare Events, ''European Journal of Operational Research'', '''99''', 89–112.
==Software implementations==
* [https://ceopt.org '''CEopt''' Matlab package]
* [https://cran.r-project.org/web/packages/CEoptim/index.html '''CEoptim''' R package]
* [https://www.nuget.org/packages/Novacta.Analytics '''Novacta.Analytics''' .NET library]
==References==
{{Reflist}}