These models extend the widely used [[position weight matrix]] (PWM) models, [[Markov model]]s, and Bayesian network (BN) models.
In contrast to BN models, in which each random variable depends on a fixed subset of random variables, in VOBN models these subsets may vary with the specific realization of the observed variables. The observed realizations are often called the context, and hence VOBN models are also known as context-specific Bayesian networks.<ref name="Boutilier">{{cite conference |title=Context-specific independence in Bayesian networks |last1=Boutilier |first1=C. |last2=Friedman |first2=N. |author-link2=Nir Friedman |last3=Goldszmidt |first3=M. |last4=Koller |first4=D.|author-link4=Daphne Koller |date=1996 |pages=115–123 |___location=Reed College, Portland, Oregon, USA |conference=12th Conference on Uncertainty in Artificial Intelligence (August 1–4, 1996) |url = http://www.informatik.uni-trier.de/~ley/db/conf/uai/uai1996.html | arxiv = 1302.3562 }} </ref>
This flexibility in defining the conditioning subsets of variables is a real advantage in classification and analysis applications, as statistical dependencies between random variables in a sequence (not necessarily adjacent ones) can be taken into account efficiently, and in a position-specific and context-specific manner.
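The idea of a context-dependent parent set can be sketched in a few lines of code. The following toy example (variable names, states, and probabilities are all invented for illustration) shows a conditional distribution for a variable X3 whose effective parents depend on the realization of X2: when X2 takes the value 'a', X3 depends on both X1 and X2, but in every other context X1 becomes irrelevant and X3 depends on X2 alone.

```python
def p_x3(x3, x1, x2):
    """Context-specific conditional P(X3 | X1, X2) over states {'a', 'b'}.

    All probability values are hypothetical, chosen only to
    illustrate how the conditioning subset varies with the context.
    """
    if x2 == 'a':
        # Context x2 == 'a': X3 depends on the full parent set (X1, X2).
        table = {('a', 'a'): {'a': 0.7, 'b': 0.3},
                 ('b', 'a'): {'a': 0.2, 'b': 0.8}}
        return table[(x1, x2)][x3]
    else:
        # Any other context: X1 drops out and X3 depends on X2 alone.
        table = {'b': {'a': 0.5, 'b': 0.5}}
        return table[x2][x3]
```

In an ordinary BN, the table for X3 would have to enumerate every joint realization of (X1, X2); the context-specific representation stores fewer parameters and makes the independence of X3 from X1 given X2 = 'b' explicit.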