Variable-order Markov model

In the mathematical theory of [[stochastic processes]], '''variable-order Markov (VOM) models''' are an important class of models that extend the well-known [[Markov chain]] models. In contrast to Markov chain models, where each [[random variable]] in a sequence with the [[Markov property]] depends on a fixed number of preceding random variables, in VOM models the number of conditioning random variables may vary with the specific observed realization.
 
This realization sequence is often called the ''context''; therefore the VOM models are also called ''context trees''.<ref name="Rissanen">{{cite journal|last=Rissanen|first=J.|title=A Universal Data Compression System|journal=IEEE Transactions on Information Theory|volume=29|issue=5|date=Sep 1983|pages=656–664|doi=10.1109/TIT.1983.1056741}}</ref> VOM models can be rendered graphically as colorized probabilistic suffix trees (PSTs).<ref name=":0">{{Cite journal|last1=Gabadinho|first1=Alexis|last2=Ritschard|first2=Gilbert|date=2016|title=Analyzing State Sequences with Probabilistic Suffix Trees: The PST R Package|url=http://www.jstatsoft.org/v72/i03/|journal=Journal of Statistical Software|language=en|volume=72|issue=3|doi=10.18637/jss.v072.i03|s2cid=63681202|issn=1548-7660}}</ref> The flexibility in the number of conditioning random variables is a practical advantage in many applications, such as [[statistical analysis]], [[Statistical classification|classification]] and [[prediction]].<ref name="Shmilovici">{{cite journal|last=Shmilovici|first=A.|author2=Ben-Gal, I.|title=Using a VOM Model for Reconstructing Potential Coding Regions in EST Sequences|journal=Computational Statistics|volume=22|issue=1|year=2007|pages=49–69|doi=10.1007/s00180-007-0021-8|s2cid=2737235}}</ref><ref name="Begleiter">{{cite journal|last=Begleiter|first=R.|author2=El-Yaniv, R.|author3=Yona, G.|title=On Prediction Using Variable Order Markov Models|journal=Journal of Artificial Intelligence Research|volume=22|year=2004|pages=385–421|url=http://www.jair.org/media/1491/live-1491-2335-jair.pdf|doi=10.1613/jair.1491|access-date=2007-04-22|archive-url=https://web.archive.org/web/20070928175244/http://www.jair.org/media/1491/live-1491-2335-jair.pdf|archive-date=2007-09-28|url-status=dead|doi-access=free}}</ref><ref name="Ben-Gal">{{cite journal|last=Ben-Gal|first=I.|author2=Morag, G.|author3=Shmilovici, A.|year=2003|title=Context-Based Statistical Process Control: A Monitoring Procedure for State-Dependent Processes|url=http://www.eng.tau.ac.il/~bengal/Technometrics_final.pdf|journal=Technometrics|volume=45|issue=4|pages=293–311|doi=10.1198/004017003000000122|s2cid=5227793|issn=0040-1706}}</ref>
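
As an illustrative comparison (the symbols <math>m</math>, <math>k</math> and <math>D</math> below are generic notation rather than the notation of any particular reference), a fixed-order Markov chain of order <math>m</math> always conditions on the <math>m</math> most recent symbols,
:<math>\Pr(x_t \mid x_{t-m}, \ldots, x_{t-1}),</math>
whereas a VOM model conditions on a context whose length <math>k</math> may differ from one observed realization to another, up to some maximal order <math>D</math>:
:<math>\Pr(x_t \mid x_{t-k}, \ldots, x_{t-1}), \qquad 0 \le k \le D.</math>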
 
==Example==
==Application areas==
Various efficient algorithms have been devised for estimating the parameters of the VOM model.<ref name="Begleiter"/>
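
A minimal sketch of such an estimator, assuming a simple counting scheme with add-one smoothing and longest-suffix back-off (the function names, the smoothing and the default maximal order below are illustrative choices, not drawn from the cited algorithms):

<syntaxhighlight lang="python">
from collections import defaultdict

def train_vom(sequence, max_order=3):
    """Count how often each symbol follows each context of length 0..max_order."""
    counts = defaultdict(lambda: defaultdict(int))
    for t, symbol in enumerate(sequence):
        for k in range(min(t, max_order) + 1):
            context = sequence[t - k:t]  # the k symbols immediately preceding position t
            counts[context][symbol] += 1
    return counts

def predict(counts, history, alphabet, max_order=3):
    """Estimate P(next symbol | history) by backing off to the longest observed context,
    using add-one (Laplace) smoothing over the alphabet."""
    for k in range(min(len(history), max_order), -1, -1):
        context = history[len(history) - k:]
        if context in counts:
            ctx_counts = counts[context]
            total = sum(ctx_counts.values()) + len(alphabet)
            return {a: (ctx_counts.get(a, 0) + 1) / total for a in alphabet}
    return {a: 1 / len(alphabet) for a in alphabet}  # reached only if the model is untrained

sequence = "abracadabra"
model = train_vom(sequence, max_order=2)
print(predict(model, "ab", alphabet=sorted(set(sequence)), max_order=2))
</syntaxhighlight>

Published estimators such as context tree weighting and prediction by partial matching replace this naive counting with principled smoothing and context-selection rules.<ref name="Begleiter"/>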
 
VOM models have been successfully applied to areas such as [[machine learning]], [[information theory]] and [[bioinformatics]], including specific applications such as [[code|coding]] and [[data compression]],<ref name="Rissanen"/> document compression,<ref name="Begleiter"/> classification and identification of [[DNA]] and [[protein|protein sequences]],<ref>{{cite journal|url=http://www.eng.tau.ac.il/~bengal/VOMBAT.pdf|title=VOMBAT: Prediction of Transcription Factor Binding Sites using Variable Order Bayesian Trees|author1=Grau J.|author2=Ben-Gal I.|author3=Posch S.|author4=Grosse I.|journal=Nucleic Acids Research|volume=34|pages=W529–W533|year=2006}}</ref><ref name="Shmilovici"/> [[statistical process control]],<ref name="Ben-Gal"/> [[spam filtering]],<ref name="Bratko">{{cite journal|last=Bratko|first=A.|author2=Cormack, G. V.|author3=Filipic, B.|author4=Lynam, T.|author5=Zupan, B.|title=Spam Filtering Using Statistical Data Compression Models|journal=Journal of Machine Learning Research|volume=7|year=2006|pages=2673–2698|url=http://www.jmlr.org/papers/volume7/bratko06a/bratko06a.pdf}}</ref> [[haplotyping]],<ref>{{cite journal|last=Browning|first=Sharon R.|author-link=Sharon R. Browning|year=2006|title=Multilocus association mapping using variable-length Markov chains|journal=The American Journal of Human Genetics|volume=78|issue=6|pages=903–913}}</ref> [[speech recognition]],<ref>{{Cite journal|last1=Smith|first1=A.|last2=Denenberg|first2=J.|last3=Slack|first3=T.|last4=Tan|first4=C.|last5=Wohlford|first5=R.|date=1985|title=Application of a sequential pattern learning system to connected speech recognition|url=https://ieeexplore.ieee.org/document/1168282/|journal=ICASSP '85. IEEE International Conference on Acoustics, Speech, and Signal Processing|___location=Tampa, FL, USA|publisher=Institute of Electrical and Electronics Engineers|volume=10|pages=1201–1204|doi=10.1109/ICASSP.1985.1168282|s2cid=60991068}}</ref> [[sequence analysis in social sciences]],<ref name=":0" /> and others.
 
==See also==