Given an HMM (the same one used in the [[Viterbi algorithm]]) represented in the [[Python (programming language)|Python programming language]]:
<syntaxhighlight lang="python">
states = ("Healthy", "Fever")
end_state = "E"
observations = ("normal", "cold", "dizzy")
start_probability = {"Healthy": 0.6, "Fever": 0.4}
transition_probability = {
    "Healthy": {"Healthy": 0.69, "Fever": 0.3, "E": 0.01},
    "Fever": {"Healthy": 0.4, "Fever": 0.59, "E": 0.01},
}
emission_probability = {
    "Healthy": {"normal": 0.5, "cold": 0.4, "dizzy": 0.1},
    "Fever": {"normal": 0.1, "cold": 0.3, "dizzy": 0.6},
}
</syntaxhighlight>
<syntaxhighlight lang="python">
def example():
    return fwd_bkw(observations, states, start_probability,
                   transition_probability, emission_probability, end_state)
</syntaxhighlight>
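The <code>fwd_bkw</code> function invoked above is defined in a part of the article not shown in this excerpt. As an illustration only, a minimal self-contained forward–backward implementation matching this call signature might look like the following (variable names and structure are assumptions, not necessarily the article's exact code):

<syntaxhighlight lang="python">
def fwd_bkw(observations, states, start_prob, trans_prob, emm_prob, end_st):
    """Hypothetical sketch of the forward-backward algorithm.

    observations is expected to be a tuple; probabilities are dicts keyed
    by state name, as in the definitions above.
    """
    # Forward pass: f_curr[st] = P(observations[0..i], state_i = st)
    fwd = []
    f_prev = {}
    for i, obs in enumerate(observations):
        f_curr = {}
        for st in states:
            if i == 0:
                # Base case: the start probability
                prev_sum = start_prob[st]
            else:
                prev_sum = sum(f_prev[k] * trans_prob[k][st] for k in states)
            f_curr[st] = emm_prob[st][obs] * prev_sum
        fwd.append(f_curr)
        f_prev = f_curr
    p_fwd = sum(f_curr[k] * trans_prob[k][end_st] for k in states)

    # Backward pass: b_curr[st] = P(observations[i+1..] | state_i = st)
    bkw = []
    b_prev = {}
    for i, obs_next in enumerate(reversed(observations[1:] + (None,))):
        b_curr = {}
        for st in states:
            if i == 0:
                # Base case: transition into the end state
                b_curr[st] = trans_prob[st][end_st]
            else:
                b_curr[st] = sum(trans_prob[st][nxt] * emm_prob[nxt][obs_next] * b_prev[nxt]
                                 for nxt in states)
        bkw.insert(0, b_curr)
        b_prev = b_curr
    p_bkw = sum(start_prob[st] * emm_prob[st][observations[0]] * b_curr[st] for st in states)

    # Smoothing: posterior marginal of each hidden state given all observations
    posterior = [{st: fwd[i][st] * bkw[i][st] / p_fwd for st in states}
                 for i in range(len(observations))]
    assert abs(p_fwd - p_bkw) < 1e-12  # both passes compute P(observations)
    return fwd, bkw, posterior
</syntaxhighlight>

With the definitions above, calling <code>example()</code> then returns the forward messages, backward messages, and smoothed posterior marginals for the three observations.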
== References ==
{{reflist}}
* {{cite journal |author=Lawrence R. Rabiner |author-link=Lawrence Rabiner |title=A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition |journal=Proceedings of the [[IEEE]] |volume=77 |issue=2 |pages=257–286 |date=February 1989 |doi=10.1109/5.18626}}
* {{cite journal |author=Lawrence R. Rabiner, B. H. Juang|title=An introduction to hidden Markov models|journal=IEEE ASSP Magazine |date=January 1986 |pages=4–15}}
* {{cite book | author = Eugene Charniak|title = Statistical Language Learning|publisher = MIT Press| ___location=Cambridge, Massachusetts|year = 1993|isbn=978-0-262-53141-2}}
* <cite id = RussellNorvig10>{{cite book |author=Stuart Russell and Peter Norvig |title=Artificial Intelligence: A Modern Approach |edition=3rd |publisher=Pearson Education/Prentice-Hall |___location=Upper Saddle River, New Jersey |year=2010 |isbn=978-0-13-604259-4}}</cite>
== External links ==
{{DEFAULTSORT:Forward-backward algorithm}}
[[Category:Articles with example Python (programming language) code]]
[[Category:Dynamic programming]]
[[Category:Error detection and correction]]