'''Variable-order Markov (VOM) models''' are an important class of models that extend the well-known [[Markov chain]] models. In contrast to Markov chain models, where each [[random variable]] in a sequence with the [[Markov property]] depends on a fixed number of random variables, in VOM models this number of conditioning random variables may vary based on the specific observed realization.
This realization sequence is often called the ''context''; therefore VOM models are also called ''context trees''.<ref name="Rissanen">{{cite journal|last = Rissanen|first = J.|title = A Universal Data Compression System|journal = IEEE Transactions on Information Theory|volume = 29|issue = 5|date = Sep 1983|pages = }}</ref>
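The idea can be illustrated with a short Python sketch (the maximum order, the training string and the helper names below are illustrative choices, not part of any standard implementation): counts are kept for every context up to a maximum length, and prediction falls back from the longest context observed in training to shorter ones.

<syntaxhighlight lang="python">
from collections import defaultdict

# Illustrative sketch of a variable-order Markov predictor over strings.
# Counts are stored for every context of length 0..max_order; prediction
# uses the longest context that was actually observed in training.

def train_counts(sequence, max_order):
    """Count next-symbol occurrences for all contexts up to max_order."""
    counts = defaultdict(lambda: defaultdict(int))
    for i, symbol in enumerate(sequence):
        for k in range(min(i, max_order) + 1):
            context = sequence[i - k:i]          # the k most recent symbols
            counts[context][symbol] += 1
    return counts

def predict(counts, history, max_order):
    """Return the most likely next symbol, backing off to shorter contexts."""
    for k in range(min(len(history), max_order), -1, -1):
        context = history[len(history) - k:]
        if context in counts:                    # longest observed context wins
            successors = counts[context]
            return max(successors, key=successors.get)
    return None

counts = train_counts("abracadabra", max_order=3)
print(predict(counts, "abr", max_order=3))       # -> 'a'
</syntaxhighlight>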
==Application areas==
Various efficient algorithms have been devised for estimating the parameters of the VOM model.<ref name="Begleiter"/>
VOM models have been successfully applied to areas such as [[machine learning]], [[information theory]] and [[bioinformatics]], including specific applications such as [[code|coding]] and [[data compression]],<ref name="Rissanen"/> document compression,<ref name="Begleiter"/> classification and identification of [[DNA]] and [[protein|protein sequences]],<ref name="Shmilovici"/> [[statistical process control]],<ref name="Ben-Gal"/> and [[spam filtering]].<ref name="Bratko">{{cite journal|last = Bratko|first = A.|coauthors = Cormack, G. V., Filipic, B., Lynam, T. and Zupan, B.|title = Spam Filtering Using Statistical Data Compression Models|journal = Journal of Machine Learning Research|volume = 7|date = 2006|pages = }}</ref>
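As an illustration of how such a model can be estimated from data and then used for classification, the following Python sketch fits one context-count model per class and assigns a query sequence to the class with the higher smoothed log-likelihood. The alphabet, the toy training strings and the add-one smoothing are illustrative assumptions, not a specific published estimation algorithm.

<syntaxhighlight lang="python">
import math
from collections import defaultdict

# Illustrative sketch: two VOM models used as sequence classifiers.
# Each model is estimated by counting contexts up to max_order; a query
# is assigned to the class whose model gives the higher log-likelihood.

def fit(sequence, max_order):
    """Count next-symbol occurrences for all contexts up to max_order."""
    counts = defaultdict(lambda: defaultdict(int))
    for i, symbol in enumerate(sequence):
        for k in range(min(i, max_order) + 1):
            counts[sequence[i - k:i]][symbol] += 1
    return counts

def log_likelihood(counts, sequence, max_order, alphabet):
    """Score a sequence under the model with add-one (Laplace) smoothing."""
    score = 0.0
    for i, symbol in enumerate(sequence):
        # variable order: back off to the longest context seen in training
        for k in range(min(i, max_order), -1, -1):
            context = sequence[i - k:i]
            if context in counts:
                break
        successors = counts.get(context, {})
        total = sum(successors.values())
        p = (successors.get(symbol, 0) + 1) / (total + len(alphabet))
        score += math.log(p)
    return score

alphabet = "acgt"
model_a = fit("acgtacgtacgt", max_order=2)   # toy "class A" training sequence
model_b = fit("aaccggttaacc", max_order=2)   # toy "class B" training sequence
query = "acgtacg"
score_a = log_likelihood(model_a, query, 2, alphabet)
score_b = log_likelihood(model_b, query, 2, alphabet)
print("A" if score_a > score_b else "B")     # the query follows class A's pattern
</syntaxhighlight>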
==See also==