Estimation of distribution algorithm

===Univariate marginal distribution algorithm (UMDA)===
The UMDA<ref>{{cite journal|last1=Mühlenbein|first1=Heinz|title=The Equation for Response to Selection and Its Use for Prediction|journal=Evol. Computation|date=1 September 1997|volume=5|issue=3|pages=303–346|doi=10.1162/evco.1997.5.3.303|pmid=10021762|s2cid=2593514 |url=http://dl.acm.org/citation.cfm?id=1326756|issn=1063-6560|url-access=subscription}}</ref> is a simple EDA that uses an operator <math>\alpha_{UMDA}</math> to estimate marginal probabilities from a selected population <math>S(P(t))</math>. Assuming <math>S(P(t))</math> contains <math>\lambda</math> elements, <math>\alpha_{UMDA}</math> produces the probabilities:
 
<math>
p_{t+1}(x_i) = \frac{\sum_{x \in S(P(t))} x_i}{\lambda}, \quad \forall i \in \{1, 2, \dots, N\}
</math>
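
A minimal NumPy sketch of this operator for binary variables follows; the function names, the array layout, and the independent resampling step are illustrative assumptions, not part of the cited formulation.

<syntaxhighlight lang="python">
import numpy as np

def alpha_umda(selected):
    # selected: (lam, N) array of binary solutions, one row per element
    # of S(P(t)); the column means are the estimated marginals p(X_i = 1).
    return selected.mean(axis=0)

def beta_sample(p, size, rng):
    # Hypothetical sampling operator: draw each variable independently
    # from its estimated marginal probability.
    return (rng.random((size, len(p))) < p).astype(int)

rng = np.random.default_rng(42)
selected = rng.integers(0, 2, size=(20, 8))  # stand-in for S(P(t))
p = alpha_umda(selected)                     # estimated marginals
offspring = beta_sample(p, 50, rng)          # candidate next population
</syntaxhighlight>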
 
===Mutual information maximizing input clustering (MIMIC)===
The MIMIC<ref>{{cite journal|last1=Bonet|first1=Jeremy S. De|last2=Isbell|first2=Charles L.|last3=Viola|first3=Paul|title=MIMIC: Finding Optima by Estimating Probability Densities|journal=Advances in Neural Information Processing Systems|date=1 January 1996|pages=424|citeseerx=10.1.1.47.6497}}</ref> factorizes the [[joint probability distribution]] into a chain-like model representing successive dependencies between variables. It finds a permutation of the decision variables, <math>r : i \mapsto j</math>, such that <math>x_{r(1)}, x_{r(2)}, \dots, x_{r(N)}</math> minimizes the [[Kullback–Leibler divergence]] in relation to the true probability distribution, i.e. <math>\pi_{r(i+1)} = \{X_{r(i)}\}</math>. MIMIC models a distribution
 
<math>
p_{t+1}(X) = p_{t+1}(x_{r(1)}) \cdot \prod_{i=1}^{N-1} p_{t+1}\left(x_{r(i+1)} \mid x_{r(i)}\right)
</math>
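
A rough sketch of the greedy chain construction for binary variables, assuming the permutation is grown from a lowest-entropy root by repeatedly appending the variable with the smallest conditional entropy given its predecessor (the helper names and the empirical entropy estimates are illustrative):

<syntaxhighlight lang="python">
import numpy as np

def entropy(counts):
    # Shannon entropy (in bits) of a discrete distribution given as counts.
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def h(col):
    # Marginal entropy of one binary column.
    return entropy(np.bincount(col, minlength=2))

def h_cond(a, b):
    # Conditional entropy H(A | B) = H(A, B) - H(B) for binary columns.
    return entropy(np.bincount(2 * b + a, minlength=4)) - h(b)

def mimic_permutation(selected):
    # Greedy ordering r: start at the lowest-entropy variable, then pick
    # each next variable to minimize its conditional entropy given the
    # previously chosen one.
    remaining = set(range(selected.shape[1]))
    order = [min(remaining, key=lambda i: h(selected[:, i]))]
    remaining.discard(order[0])
    while remaining:
        prev = order[-1]
        best = min(remaining,
                   key=lambda i: h_cond(selected[:, i], selected[:, prev]))
        order.append(best)
        remaining.discard(best)
    return order
</syntaxhighlight>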
 
The CPC, on the other hand, quantifies the data compression in terms of entropy of the [[marginal distribution]] over all partitions, where <math>\lambda</math> is the selected population size, <math>|\tau|</math> is the number of decision variables in the linkage set <math>\tau</math> and <math>H(\tau)</math> is the [[joint entropy]] of the variables in <math>\tau</math>:
 
<math>
CPC = \lambda \sum_{\tau \in T_{\text{eCGA}}} H(\tau)
</math>
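
The joint-entropy term can be estimated directly from the selected population; in the sketch below a partition is assumed to be given as a list of index tuples over binary variables (both the data layout and the function names are illustrative):

<syntaxhighlight lang="python">
import numpy as np

def joint_entropy(selected, tau):
    # H(tau): empirical joint entropy of the columns indexed by tau,
    # obtained by counting how often each joint configuration occurs.
    _, counts = np.unique(selected[:, list(tau)], axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def cpc(selected, partition):
    # Compressed population complexity: lambda times the summed joint
    # entropies of the linkage sets in the current partition.
    lam = selected.shape[0]
    return lam * sum(joint_entropy(selected, tau) for tau in partition)

# Example: score one partition of 6 variables into three linkage sets.
rng = np.random.default_rng(0)
selected = rng.integers(0, 2, size=(30, 6))
print(cpc(selected, [(0, 1), (2, 3, 4), (5,)]))
</syntaxhighlight>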
 
The Bayesian network structure, on the other hand, must be built iteratively (linkage-learning). It starts with a network without edges and, at each step, adds the edge that best improves some scoring metric (e.g. the [[Bayesian information criterion]] (BIC) or the Bayesian–Dirichlet metric with likelihood equivalence (BDe)).<ref>{{cite journal|last1=Larrañaga|first1=Pedro|last2=Karshenas|first2=Hossein|last3=Bielza|first3=Concha|last4=Santana|first4=Roberto|title=A review on probabilistic graphical models in evolutionary computation|journal=Journal of Heuristics|date=21 August 2012|volume=18|issue=5|pages=795–819|doi=10.1007/s10732-012-9208-4|s2cid=9734434 |url=http://oa.upm.es/15826/}}</ref> The scoring metric evaluates the network structure according to its accuracy in modeling the selected population. From the built network, BOA samples new promising solutions as follows: (1) it computes the ancestral ordering for each variable, each node being preceded by its parents; (2) each variable is sampled conditionally on its parents. In this scheme, every BOA step can be defined as
 
<math>
P(t+1) = \beta_\lambda\left(\alpha_{\text{BOA}} \circ S(P(t))\right)
</math>

where <math>\alpha_{\text{BOA}}</math> builds the Bayesian network from the selected population and <math>\beta_\lambda</math> samples <math>\lambda</math> new solutions from it by the ancestral procedure above.
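
A minimal sketch of the sampling half of this step, assuming the network structure is already given as a parent map and the conditional probability tables have been estimated from <math>S(P(t))</math> (the data layout and function names here are illustrative):

<syntaxhighlight lang="python">
import numpy as np
from graphlib import TopologicalSorter

def ancestral_order(parents):
    # parents: dict mapping each variable to the tuple of its parents.
    # TopologicalSorter takes a node -> predecessors mapping, so the
    # returned order places every parent before its children.
    return list(TopologicalSorter(parents).static_order())

def sample_solution(parents, cpt, rng):
    # cpt[v] maps a tuple of parent values to p(X_v = 1 | parents);
    # each variable is drawn conditionally on its already-sampled parents.
    x = {}
    for v in ancestral_order(parents):
        x[v] = int(rng.random() < cpt[v][tuple(x[p] for p in parents[v])])
    return x

# Toy chain network 0 -> 1 -> 2 with hand-picked conditional tables.
parents = {0: (), 1: (0,), 2: (1,)}
cpt = {0: {(): 0.6},
       1: {(0,): 0.2, (1,): 0.9},
       2: {(0,): 0.5, (1,): 0.1}}
rng = np.random.default_rng(1)
print([sample_solution(parents, cpt, rng) for _ in range(3)])
</syntaxhighlight>

===Others===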
* Real-coded PBIL{{Citation needed|date=June 2018}}
* Selfish Gene Algorithm (SG)<ref>{{Cite book|last1=Corno|first1=Fulvio|last2=Reorda|first2=Matteo Sonza|last3=Squillero|first3=Giovanni|date=1998-02-27|title=The selfish gene algorithm: a new evolutionary optimization strategy|publisher=ACM|pages=349–355|doi=10.1145/330560.330838|isbn=978-0897919692|s2cid=9125252 }}</ref>
* Compact Differential Evolution (cDE)<ref>{{Cite journal|last1=Mininno|first1=Ernesto|last2=Neri|first2=Ferrante|last3=Cupertino|first3=Francesco|last4=Naso|first4=David|date=2011|title=Compact Differential Evolution|journal=IEEE Transactions on Evolutionary Computation|language=en-US|volume=15|issue=1|pages=32–54|doi=10.1109/tevc.2010.2058120|s2cid=20582233 |issn=1089-778X}}</ref> and its variants<ref>{{Cite journal|last1=Iacca|first1=Giovanni|last2=Caraffini|first2=Fabio|last3=Neri|first3=Ferrante|date=2012|title=Compact Differential Evolution Light: High Performance Despite Limited Memory Requirement and Modest Computational Overhead|journal=Journal of Computer Science and Technology|language=en|volume=27|issue=5|pages=1056–1076|doi=10.1007/s11390-012-1284-2|s2cid=3184035 |issn=1000-9000|hdl=2086/11740|hdl-access=free}}</ref><ref>{{Citation|last1=Iacca|first1=Giovanni|title=Opposition-Based Learning in Compact Differential Evolution|date=2011|last2=Neri|first2=Ferrante|last3=Mininno|first3=Ernesto|work=Applications of Evolutionary Computation|pages=264–273|publisher=Springer Berlin Heidelberg|language=en|doi=10.1007/978-3-642-20525-5_27|isbn=9783642205248|hdl=11572/196440|hdl-access=free}}</ref><ref>{{Cite book|last1=Mallipeddi|first1=Rammohan|last2=Iacca|first2=Giovanni|last3=Suganthan|first3=Ponnuthurai Nagaratnam|last4=Neri|first4=Ferrante|last5=Mininno|first5=Ernesto|title=2011 IEEE Congress of Evolutionary Computation (CEC) |chapter=Ensemble strategies in Compact Differential Evolution |date=2011|pages=1972–1977 |language=en-US|publisher=IEEE|doi=10.1109/cec.2011.5949857|isbn=9781424478347|s2cid=11781300 }}</ref><ref>{{Cite journal|last1=Neri|first1=Ferrante|last2=Iacca|first2=Giovanni|last3=Mininno|first3=Ernesto|date=2011|title=Disturbed Exploitation compact Differential Evolution for limited memory optimization problems|journal=Information Sciences|volume=181|issue=12|pages=2469–2487|doi=10.1016/j.ins.2011.02.004|issn=0020-0255}}</ref><ref>{{Cite book|last1=Iacca|first1=Giovanni|last2=Mallipeddi|first2=Rammohan|last3=Mininno|first3=Ernesto|last4=Neri|first4=Ferrante|last5=Suganthan|first5=Pannuthurai Nagaratnam|title=2011 IEEE Symposium on Differential Evolution (SDE) |chapter=Global supervision for compact Differential Evolution |date=2011|pages=1–8 |language=en-US|publisher=IEEE|doi=10.1109/sde.2011.5952051|isbn=9781612840710|s2cid=8874851 }}</ref><ref>{{Cite book|last1=Iacca|first1=Giovanni|last2=Mallipeddi|first2=Rammohan|last3=Mininno|first3=Ernesto|last4=Neri|first4=Ferrante|last5=Suganthan|first5=Pannuthurai Nagaratnam|title=2011 IEEE Workshop on Memetic Computing (MC) |chapter=Super-fit and population size reduction in compact Differential Evolution |date=2011|pages=1–8 |language=en-US|publisher=IEEE|doi=10.1109/mc.2011.5953633|isbn=9781612840659|s2cid=5692951 }}</ref>
* Compact Particle Swarm Optimization (cPSO)<ref>{{Cite journal|last1=Neri|first1=Ferrante|last2=Mininno|first2=Ernesto|last3=Iacca|first3=Giovanni|date=2013|title=Compact Particle Swarm Optimization|journal=Information Sciences|volume=239|pages=96–121|doi=10.1016/j.ins.2013.03.026|issn=0020-0255}}</ref>
* Compact Bacterial Foraging Optimization (cBFO)<ref>{{Citation|last1=Iacca|first1=Giovanni|title=Compact Bacterial Foraging Optimization|date=2012|last2=Neri|first2=Ferrante|last3=Mininno|first3=Ernesto|work=Swarm and Evolutionary Computation|pages=84–92|publisher=Springer Berlin Heidelberg|language=en|doi=10.1007/978-3-642-29353-5_10|hdl=11572/196442 |isbn=9783642293528|hdl-access=free}}</ref>
* Probabilistic incremental program evolution (PIPE)<ref>{{Cite journal|last1=Salustowicz|last2=Schmidhuber|date=1997|title=Probabilistic incremental program evolution|journal=Evolutionary Computation|volume=5|issue=2|pages=123–141|issn=1530-9304|pmid=10021756|doi=10.1162/evco.1997.5.2.123|s2cid=10759266 |url=http://depositonce.tu-berlin.de/handle/11303/1046}}</ref>
* Estimation of Gaussian networks algorithm (EGNA){{Citation needed|date=June 2018}}
==See also==
* [[CMA-ES]]
* [[Cross-entropy method]]
* [[Ant colony optimization algorithms]]
 
==References==