'''Variable-order Bayesian network (VOBN)''' models provide an important extension of both the [[Bayesian network]] models and the [[variable-order Markov models]]. VOBN models are used in [[machine learning]] in general and have shown great potential in [[bioinformatics]] applications.<ref name="Ben-Gal">{{cite journal|last = Ben-Gal|first = I.|author2 = Shani A.|author3 = Gohr A.|author4 = Grau J.|author5 = Arviv S.|author6 = Shmilovici A.|author7 = Posch S.|author8 = Grosse I.|title = Identification of Transcription Factor Binding Sites with Variable-order Bayesian Networks|journal = Bioinformatics|volume = 21|issue = 11|date = 2005|pages = 2657–2666|url = http://bioinformatics.oxfordjournals.org/cgi/reprint/bti410?ijkey=KkxNhRdTSfvtvXY&keytype=ref|doi = 10.1093/bioinformatics/bti410|pmid = 15797905|url-access = subscription}}</ref><ref name="Grau">{{cite journal|last = Grau|first = J.|author2 = Ben-Gal I.|author3 = Posch S.|author4 = Grosse I.|title = VOMBAT: Prediction of Transcription Factor Binding Sites using Variable Order Bayesian Trees|journal = Nucleic Acids Research|volume = 34|date = 2006|pages = 529–533|url = http://www.eng.tau.ac.il/~bengal/VOMBAT.pdf|doi = 10.1093/nar/gkl212|pmid = 16845064|issue = Web Server issue|pmc = 1538886|archive-date = 2018-09-30|access-date = 2007-05-14|archive-url = https://web.archive.org/web/20180930084306/http://www.eng.tau.ac.il/~bengal/VOMBAT.pdf|url-status = dead}}</ref>
These models extend the widely used [[position weight matrix]] (PWM) models, [[Markov model]]s, and Bayesian network (BN) models.
In contrast to the BN models, where each random variable depends on a fixed subset of random variables, in VOBN models these subsets may vary based on the specific realization of observed variables. The observed realizations are often called the context and, hence, VOBN models are also known as context-specific Bayesian networks.<ref name="Boutilier">{{cite conference |title=Context-specific independence in Bayesian networks |last1=Boutilier |first1=C. |last2=Friedman |first2=N. |author-link2=Nir Friedman |last3=Goldszmidt |first3=M. |last4=Koller |first4=D. |author-link4=Daphne Koller |date=1996 |book-title=Proceedings of the 12th Conference on Uncertainty in Artificial Intelligence |conference=12th Conference on Uncertainty in Artificial Intelligence |___location=Reed College, Portland, Oregon, USA |pages=115–123 |url=http://www.informatik.uni-trier.de/~ley/db/conf/uai/uai1996.html |arxiv=1302.3562}}</ref>
The flexibility in the definition of conditioning subsets of variables turns out to be a real advantage in classification and analysis applications, as the statistical dependencies between random variables in a sequence (which need not be adjacent) can be taken into account efficiently, in a position-specific and context-specific manner.
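The idea of a variable-length, context-specific conditioning set can be illustrated with a small sketch. The following Python snippet is a hypothetical example (not taken from the cited papers): for one position in a DNA sequence, the conditional distribution over {A, C, G, T} is stored in a context tree, where some contexts are resolved at order 2 while others fall back to order 1 or order 0. All contexts and probability values are invented for illustration.

```python
# Hypothetical context tree for a single sequence position in a VOBN-style model.
# Keys are contexts of *varying* length (most recent preceding symbol last);
# values are conditional distributions P(next symbol | context).
context_tree = {
    "CG": {"A": 0.1, "C": 0.2, "G": 0.2, "T": 0.5},      # order-2 context
    "G":  {"A": 0.3, "C": 0.2, "G": 0.3, "T": 0.2},      # order-1 fallback
    "":   {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25},  # order-0 fallback
}

def conditional(context: str) -> dict:
    """Return P(next symbol | context), using the longest suffix of
    `context` that is stored in the tree (context-specific order)."""
    for k in range(len(context), -1, -1):
        suffix = context[len(context) - k:]
        if suffix in context_tree:
            return context_tree[suffix]
    raise KeyError("no matching context")

print(conditional("ACG"))  # matched by the order-2 context "CG"
print(conditional("TAG"))  # "AG" is absent, so the order-1 context "G" is used
print(conditional("TTT"))  # no stored suffix, so the order-0 context "" is used
```

Here the effective conditioning set varies with the observed realization: a context ending in "CG" is modeled at order 2, while other contexts are modeled at a lower order, which is the distinguishing feature of variable-order, context-specific models.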