===Univariate marginal distribution algorithm (UMDA)===
The UMDA<ref>{{cite journal|last1=Mühlenbein|first1=Heinz|title=The Equation for Response to Selection and Its Use for Prediction|journal=Evol. Computation|date=1 September 1997|volume=5|issue=3|pages=303–346|doi=10.1162/evco.1997.5.3.303|pmid=10021762|s2cid=2593514 |url=http://dl.acm.org/citation.cfm?id=1326756|issn=1063-6560}}</ref> is a simple EDA that uses an operator <math>\alpha_{UMDA}</math> to estimate marginal probabilities from a selected population <math>S(P(t))</math>. Assuming <math>S(P(t))</math> contains <math>\lambda</math> elements, <math>\alpha_{UMDA}</math> produces the probabilities:
<math>
p_{t+1}(X_i)=\dfrac{1}{\lambda}\sum_{x\in S(P(t))} x_i,\quad \forall i\in 1,2,\dots,N.
</math>
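As an illustration, the marginal estimate above can be sketched in Python for bit-string individuals; the function names below (`umda_step`, `sample`) are illustrative, not part of the original formulation:

```python
import random

def umda_step(selected, n_vars):
    """The alpha_UMDA operator: estimate each marginal probability as
    the frequency of 1s among the lambda selected individuals."""
    lam = len(selected)
    return [sum(ind[i] for ind in selected) / lam for i in range(n_vars)]

def sample(probs, rng=random):
    """Sample a new individual from the product of univariate marginals."""
    return [1 if rng.random() < p else 0 for p in probs]
```

Because each variable is sampled independently, UMDA cannot represent dependencies between decision variables; that limitation motivates the multivariate models discussed next.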
===Extended compact genetic algorithm (eCGA)===
The ECGA<ref>{{cite thesis|last1=Harik|first1=Georges Raif|title=Learning Gene Linkage to Efficiently Solve Problems of Bounded Difficulty Using Genetic Algorithms|publisher=University of Michigan|url=http://dl.acm.org/citation.cfm?id=269517|year=1997|type=PhD }}</ref> was one of the first EDAs to employ multivariate factorizations, in which high-order dependencies among decision variables can be modeled. Its approach factorizes the joint probability distribution into a product of multivariate marginal distributions. Assume <math>T_\text{eCGA}=\{\tau_1,\dots,\tau_\Psi\}</math> is a set of subsets, in which every <math>\tau\in T_\text{eCGA}</math> is a linkage set containing <math>|\tau|\leq K</math> variables. The factorized joint probability distribution is represented as follows:
<math>
p(X_1,X_2,\dots,X_N)=\prod_{\tau\in T_\text{eCGA}}p(\tau).
</math>
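The factorization can be sketched in Python for bit-string individuals: one multivariate marginal table is estimated per linkage set, and the joint probability of a solution is the product of its sub-vector frequencies. The helper names (`marginal_tables`, `joint_probability`) are illustrative, not part of the eCGA literature:

```python
from collections import Counter

def marginal_tables(selected, linkage_sets):
    """Estimate one multivariate marginal table per linkage set tau,
    as frequencies of the sub-vectors x_tau in the selected population."""
    lam = len(selected)
    tables = []
    for tau in linkage_sets:
        counts = Counter(tuple(ind[i] for i in tau) for ind in selected)
        tables.append({k: v / lam for k, v in counts.items()})
    return tables

def joint_probability(x, linkage_sets, tables):
    """p(x) = product over tau of p(x_tau), per the eCGA factorization."""
    p = 1.0
    for tau, table in zip(linkage_sets, tables):
        p *= table.get(tuple(x[i] for i in tau), 0.0)
    return p
```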
The Bayesian network structure, on the other hand, must be built iteratively (linkage-learning). It starts with a network without edges and, at each step, adds the edge that most improves some scoring metric (e.g. Bayesian information criterion (BIC) or Bayesian-Dirichlet metric with likelihood equivalence (BDe)).<ref>{{cite journal|last1=Larrañaga|first1=Pedro|last2=Karshenas|first2=Hossein|last3=Bielza|first3=Concha|last4=Santana|first4=Roberto|title=A review on probabilistic graphical models in evolutionary computation|journal=Journal of Heuristics|date=21 August 2012|volume=18|issue=5|pages=795–819|doi=10.1007/s10732-012-9208-4|s2cid=9734434 |url=http://oa.upm.es/15826/}}</ref> The scoring metric evaluates the network structure according to its accuracy in modeling the selected population. From the built network, BOA samples new promising solutions as follows: (1) it computes the ancestral ordering of the variables, each node being preceded by its parents; (2) each variable is sampled conditionally on its parents. Given this procedure, every BOA step can be defined as
<math>
D_{t+1}=\alpha_\text{BOA}\circ S(P(t)).
</math>
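The two-step sampling procedure can be sketched in Python for binary variables; the network is a map from each variable to its parents, and `cpt[v]` maps parent values to the probability that `v` is 1. The names (`ancestral_order`, `sample_network`, `cpt`) are illustrative, not part of the BOA literature:

```python
import random

def ancestral_order(parents):
    """Topological order of the DAG: every node appears after its parents."""
    order, seen = [], set()
    def visit(v):
        if v in seen:
            return
        for p in parents[v]:
            visit(p)
        seen.add(v)
        order.append(v)
    for v in parents:
        visit(v)
    return order

def sample_network(parents, cpt, rng=random):
    """Sample each variable conditionally on its (already sampled) parents."""
    x = {}
    for v in ancestral_order(parents):
        p1 = cpt[v][tuple(x[p] for p in parents[v])]
        x[v] = 1 if rng.random() < p1 else 0
    return x
```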
===Linkage-tree Genetic Algorithm (LTGA)===
The LTGA<ref>{{cite book|last1=Thierens|first1=Dirk|chapter=The Linkage Tree Genetic Algorithm|title=Parallel Problem Solving from Nature, PPSN XI|date=11 September 2010|pages=264–273|doi=10.1007/978-3-642-15844-5_27|isbn=978-3-642-15843-8}}</ref> differs from most EDAs in that it does not explicitly model a probability distribution but only a linkage model, called a linkage-tree. A linkage model <math>T</math> is a set of linkage sets with no associated probability distribution; therefore, there is no way to sample new solutions directly from <math>T</math>. The linkage model is a linkage-tree stored as a [[family of sets]] (FOS):
<math>
T=\{\tau_1,\tau_2,\dots,\tau_\Psi\}.
</math>
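A minimal sketch of the FOS structure in Python, assuming the linkage-tree is built bottom-up by a hierarchical clustering whose merge sequence is given: the FOS collects every leaf (singleton) plus every internal cluster. The name `linkage_tree_fos` and the `merge_order` encoding are illustrative, not the authors' implementation:

```python
def linkage_tree_fos(merge_order, n_vars):
    """Build the family of sets (FOS) of a linkage tree: every leaf
    {x_i} plus every internal cluster produced by the given sequence
    of merges. merge_order lists pairs of cluster indices to merge,
    as a hierarchical clustering would produce them."""
    clusters = [frozenset([i]) for i in range(n_vars)]
    fos = list(clusters)
    for a, b in merge_order:
        merged = clusters[a] | clusters[b]
        clusters.append(merged)
        fos.append(merged)
    return fos
```

For <math>N</math> variables a full linkage-tree contains <math>2N-1</math> linkage sets, since each of the <math>N-1</math> merges adds one internal cluster to the <math>N</math> leaves.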
==Other==
* Probability collectives (PC)<ref>{{cite journal|last1=Wolpert|first1=David H.|last2=Strauss|first2=Charlie E. M.|last3=Rajnarayan|first3=Dev|title=Advances in Distributed Optimization Using Probability Collectives|journal=Advances in Complex Systems|date=December 2006|volume=09|issue=4|pages=383–436|doi=10.1142/S0219525906000884|citeseerx=10.1.1.154.6395}}</ref><ref>{{cite journal|last1=Pelikan|first1=Martin|last2=Goldberg|first2=David E.|last3=Lobo|first3=Fernando G.|title=A Survey of Optimization by Building and Using Probabilistic Models|journal=Computational Optimization and Applications|date=2002|volume=21|issue=1|pages=5–20|doi=10.1023/A:1013500812258}}</ref>
* Hill climbing with learning (HCwL)<ref>{{Cite journal|last1=Rudlof|first1=Stephan|last2=Köppen|first2=Mario|date=1997|title=Stochastic Hill Climbing with Learning by Vectors of Normal Distributions|pages=60–70 |citeseerx=10.1.1.19.3536|language=en}}</ref>
* Estimation of multivariate normal algorithm (EMNA){{Citation needed|date=June 2018}}
* Estimation of Bayesian networks algorithm (EBNA){{Citation needed|date=June 2018}}
* Stochastic hill climbing with learning by vectors of normal distributions (SHCLVND)<ref>{{Cite journal|last1=Rudlof|first1=Stephan|last2=Köppen|first2=Mario|date=1997|title=Stochastic Hill Climbing with Learning by Vectors of Normal Distributions|pages=60–70|citeseerx=10.1.1.19.3536}}</ref>
* Real-coded PBIL{{Citation needed|date=June 2018}}
* Selfish Gene Algorithm (SG)<ref>{{Cite book|last1=Corno|first1=Fulvio|last2=Reorda|first2=Matteo Sonza|last3=Squillero|first3=Giovanni|date=1998-02-27|title=The selfish gene algorithm: a new evolutionary optimization strategy|publisher=ACM|pages=349–355|doi=10.1145/330560.330838|isbn=978-0897919692|s2cid=9125252 }}</ref>
* Compact Differential Evolution (cDE)<ref>{{Cite journal|last1=Mininno|first1=Ernesto|last2=Neri|first2=Ferrante|last3=Cupertino|first3=Francesco|last4=Naso|first4=David|date=2011|title=Compact Differential Evolution|journal=IEEE Transactions on Evolutionary Computation|language=en-US|volume=15|issue=1|pages=32–54|doi=10.1109/tevc.2010.2058120|s2cid=20582233 |issn=1089-778X}}</ref> and its variants<ref>{{Cite journal|last1=Iacca|first1=Giovanni|last2=Caraffini|first2=Fabio|last3=Neri|first3=Ferrante|date=2012|title=Compact Differential Evolution Light: High Performance Despite Limited Memory Requirement and Modest Computational Overhead|journal=Journal of Computer Science and Technology|language=en|volume=27|issue=5|pages=1056–1076|doi=10.1007/s11390-012-1284-2|s2cid=3184035 |issn=1000-9000}}</ref><ref>{{Citation|last1=Iacca|first1=Giovanni|title=Opposition-Based Learning in Compact Differential Evolution|date=2011|last2=Neri|first2=Ferrante|last3=Mininno|first3=Ernesto|work=Applications of Evolutionary Computation|pages=264–273|publisher=Springer Berlin Heidelberg|language=en|doi=10.1007/978-3-642-20525-5_27|isbn=9783642205248}}</ref><ref>{{Cite book|last1=Mallipeddi|first1=Rammohan|last2=Iacca|first2=Giovanni|last3=Suganthan|first3=Ponnuthurai Nagaratnam|last4=Neri|first4=Ferrante|last5=Mininno|first5=Ernesto|date=2011|title=Ensemble strategies in Compact Differential Evolution|journal=2011 IEEE Congress of Evolutionary Computation (CEC)|language=en-US|publisher=IEEE|doi=10.1109/cec.2011.5949857|isbn=9781424478347|s2cid=11781300 }}</ref><ref>{{Cite journal|last1=Neri|first1=Ferrante|last2=Iacca|first2=Giovanni|last3=Mininno|first3=Ernesto|date=2011|title=Disturbed Exploitation compact Differential Evolution for limited memory optimization problems|journal=Information Sciences|volume=181|issue=12|pages=2469–2487|doi=10.1016/j.ins.2011.02.004|issn=0020-0255}}</ref><ref>{{Cite book|last1=Iacca|first1=Giovanni|last2=Mallipeddi|first2=Rammohan|last3=Mininno|first3=Ernesto|last4=Neri|first4=Ferrante|last5=Suganthan|first5=Ponnuthurai Nagaratnam|date=2011|title=Global supervision for compact Differential Evolution|journal=2011 IEEE Symposium on Differential Evolution (SDE)|language=en-US|publisher=IEEE|doi=10.1109/sde.2011.5952051|isbn=9781612840710|s2cid=8874851 }}</ref><ref>{{Cite book|last1=Iacca|first1=Giovanni|last2=Mallipeddi|first2=Rammohan|last3=Mininno|first3=Ernesto|last4=Neri|first4=Ferrante|last5=Suganthan|first5=Ponnuthurai Nagaratnam|date=2011|title=Super-fit and population size reduction in compact Differential Evolution|journal=2011 IEEE Workshop on Memetic Computing (MC)|language=en-US|publisher=IEEE|doi=10.1109/mc.2011.5953633|isbn=9781612840659|s2cid=5692951 }}</ref>
* Compact Particle Swarm Optimization (cPSO)<ref>{{Cite journal|last1=Neri|first1=Ferrante|last2=Mininno|first2=Ernesto|last3=Iacca|first3=Giovanni|date=2013|title=Compact Particle Swarm Optimization|journal=Information Sciences|volume=239|pages=96–121|doi=10.1016/j.ins.2013.03.026|issn=0020-0255}}</ref>
* Compact Bacterial Foraging Optimization (cBFO)<ref>{{Citation|last1=Iacca|first1=Giovanni|title=Compact Bacterial Foraging Optimization|date=2012|last2=Neri|first2=Ferrante|last3=Mininno|first3=Ernesto|work=Swarm and Evolutionary Computation|pages=84–92|publisher=Springer Berlin Heidelberg|language=en|doi=10.1007/978-3-642-29353-5_10|isbn=9783642293528}}</ref>
* Probabilistic incremental program evolution (PIPE)<ref>{{Cite journal|last1=Salustowicz|first1=Rafał|last2=Schmidhuber|first2=Jürgen|date=1997|title=Probabilistic incremental program evolution|journal=Evolutionary Computation|volume=5|issue=2|pages=123–141|issn=1530-9304|pmid=10021756|doi=10.1162/evco.1997.5.2.123|s2cid=10759266 |url=http://depositonce.tu-berlin.de/handle/11303/1046}}</ref>
* Estimation of Gaussian networks algorithm (EGNA){{Citation needed|date=June 2018}}
* Estimation multivariate normal algorithm with thresheld convergence<ref>{{Cite book|last1=Tamayo-Vera|first1=Dania|last2=Bolufe-Rohler|first2=Antonio|last3=Chen|first3=Stephen|date=2016|title=Estimation multivariate normal algorithm with thresheld convergence|journal=2016 IEEE Congress on Evolutionary Computation (CEC)|language=en-US|publisher=IEEE|doi=10.1109/cec.2016.7744223|isbn=9781509006236|s2cid=33114730 }}</ref>
*Dependency Structure Matrix Genetic Algorithm (DSMGA)<ref>{{Citation|last1=Yu|first1=Tian-Li|title=Genetic Algorithm Design Inspired by Organizational Theory: Pilot Study of a Dependency Structure Matrix Driven Genetic Algorithm|date=2003|work=Genetic and Evolutionary Computation — GECCO 2003|pages=1620–1621|publisher=Springer Berlin Heidelberg|language=en|doi=10.1007/3-540-45110-2_54|isbn=9783540406037|last2=Goldberg|first2=David E.|last3=Yassine|first3=Ali|last4=Chen|first4=Ying-Ping}}</ref><ref>{{Cite book|last1=Hsu|first1=Shih-Huan|last2=Yu|first2=Tian-Li|date=2015-07-11|title=Optimization by Pairwise Linkage Detection, Incremental Linkage Set, and Restricted / Back Mixing: DSMGA-II|publisher=ACM|pages=519–526|doi=10.1145/2739480.2754737|isbn=9781450334723|arxiv=1807.11669|s2cid=17031156 }}</ref>
==Related==