Forward–backward algorithm
 
==Forward probabilities==
The following description will use matrices of probability values instead of probability distributions. Note, however, that the forward-backward algorithm can generally be applied to both continuous and discrete probability models.
 
We transform the probability distributions related to a given [[hidden Markov model]] into matrix notation as follows.
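For illustration, here is a sketch of this transformation for a hypothetical two-state model, using [[NumPy]] (an assumption; the article's example further below uses plain Python dictionaries instead). The transition model becomes a row-stochastic matrix, and each possible observation becomes a diagonal matrix of emission probabilities:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical two-state model: T[i, j] = P(X_t = j | X_{t-1} = i); rows sum to 1.
T = np.array([[0.7, 0.3],
              [0.3, 0.7]])

# One diagonal observation matrix per symbol: the diagonal holds
# P(observation | state), one entry per state.
O = {
    "yes": np.diag([0.9, 0.2]),
    "no":  np.diag([0.1, 0.8]),
}
</syntaxhighlight>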
 
The result of the forward recursion is the unnormalized forward [[probability vector]]. The ''i''-th entry of this vector provides:
 
:<math>
\mathbf{f}_{0:t}(i) = \mathbf{P}(o_1, o_2, \dots, o_t, X_t = x_i \mid \boldsymbol{\pi})
</math>
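Each forward step is thus a row-vector update: multiply by the transition matrix, then by the observation matrix for the symbol just observed. A minimal sketch continuing the hypothetical NumPy matrices above:

<syntaxhighlight lang="python">
def forward(pi, T, O_seq):
    """Unnormalized forward row vectors: f_{0:t} = f_{0:t-1} T O_t."""
    f = pi  # prior state distribution as a row vector
    history = []
    for O_t in O_seq:
        f = f @ T @ O_t  # advance one step, then weight by the observation
        history.append(f)
    return history

# e.g. forward(np.array([0.5, 0.5]), T, [O["yes"], O["yes"], O["no"]])
</syntaxhighlight>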
 
Notice that we are now using a [[Row and column vectors|column vector]] while the forward probabilities used row vectors. We can then work backwards using:
 
:<math>
\mathbf{b}_{t:T} = \mathbf{T} \mathbf{O}_t \mathbf{b}_{t+1:T}
</math>
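In the same hypothetical NumPy sketch, the backward pass starts from a column vector of ones and applies this recursion from the last observation to the first:

<syntaxhighlight lang="python">
def backward(T, O_seq):
    """Backward column vectors: b_{t:T} = T O_t b_{t+1:T}, right to left."""
    b = np.ones((T.shape[0], 1))  # base case: a column vector of ones
    history = [b]
    for O_t in reversed(O_seq):
        b = T @ O_t @ b
        history.insert(0, b)
    return history
</syntaxhighlight>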
 
Notice that the [[transformation matrix]] is also transposed, but in our example the transpose is equal to the original matrix. Performing these calculations and normalizing the results yields the backward probabilities for each observation time.
 
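Combining the two passes yields the smoothed posterior marginals: the entrywise product of each forward row vector with the matching backward column vector, normalized to sum to one. A sketch continuing the hypothetical helpers above:

<syntaxhighlight lang="python">
def smooth(pi, T, O_seq):
    """Posterior marginal distribution over states at each observation time."""
    fs = forward(pi, T, O_seq)
    bs = backward(T, O_seq)
    posteriors = []
    for f, b in zip(fs, bs[1:]):  # pair f_{0:t} with b_{t+1:T}
        g = f * b.flatten()       # entrywise product of the two passes
        posteriors.append(g / g.sum())  # normalize to a distribution
    return posteriors
</syntaxhighlight>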
Given an HMM (just as in the [[Viterbi algorithm]]) represented in the [[Python programming language]]:
<syntaxhighlight lang="python">
states = ('"Healthy'", '"Fever'")
end_state = '"E'"
 
observations = ('"normal'", '"cold'", '"dizzy'")
 
start_probability = {'"Healthy'": 0.6, '"Fever'": 0.4}
 
transition_probability = {
'Healthy' "Healthy": {'"Healthy'": 0.69, '"Fever'": 0.3, '"E'": 0.01},
'Fever' "Fever": {'"Healthy'": 0.4, '"Fever'": 0.59, '"E'": 0.01},
}
 
emission_probability = {
'Healthy' "Healthy": {'"normal'": 0.5, '"cold'": 0.4, '"dizzy'": 0.1},
'Fever' "Fever": {'"normal'": 0.1, '"cold'": 0.3, '"dizzy'": 0.6},
}
</syntaxhighlight>
 
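A minimal implementation of <code>fwd_bkw</code> consistent with this model might look as follows (an illustrative sketch, not necessarily optimal; it assumes an explicit end state and tuple-valued observations, as defined above):

<syntaxhighlight lang="python">
def fwd_bkw(observations, states, start_prob, trans_prob, emm_prob, end_st):
    """Sketch of the forward-backward passes over the model defined above."""
    # Forward pass: f_curr[st] = P(o_1, ..., o_t, X_t = st)
    fwd = []
    f_prev = {}
    for i, obs in enumerate(observations):
        f_curr = {}
        for st in states:
            if i == 0:
                prev_sum = start_prob[st]  # base case: prior distribution
            else:
                prev_sum = sum(f_prev[k] * trans_prob[k][st] for k in states)
            f_curr[st] = emm_prob[st][obs] * prev_sum
        fwd.append(f_curr)
        f_prev = f_curr
    # Probability of the whole sequence, ending in the explicit end state.
    p_fwd = sum(f_prev[k] * trans_prob[k][end_st] for k in states)

    # Backward pass: b_curr[st] = P(o_{t+1}, ..., o_T, end | X_t = st)
    bkw = []
    b_prev = {}
    for i, obs_next in enumerate(reversed(observations[1:] + (None,))):
        b_curr = {}
        for st in states:
            if i == 0:
                b_curr[st] = trans_prob[st][end_st]  # base case: reach the end state
            else:
                b_curr[st] = sum(trans_prob[st][k] * emm_prob[k][obs_next] * b_prev[k]
                                 for k in states)
        bkw.insert(0, b_curr)
        b_prev = b_curr

    # Smoothed posterior marginals: forward times backward, normalized.
    posterior = [{st: fwd[t][st] * bkw[t][st] / p_fwd for st in states}
                 for t in range(len(observations))]
    return fwd, bkw, posterior
</syntaxhighlight>

It can then be run on the observation sequence defined above: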
<syntaxhighlight lang="python">
def example():
    return fwd_bkw(
        observations,
        states,
        start_probability,
        transition_probability,
        emission_probability,
        end_state,
    )
</syntaxhighlight>
<syntaxhighlight lang="pycon">
Line 419 ⟶ 421:
==References==
{{reflist}}
* {{cite journal |author=Lawrence R. Rabiner |author-link=Lawrence Rabiner |title=A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition |journal=Proceedings of the [[IEEE]] |volume=77 |issue=2 |pages=257–286 |date=February 1989 |doi=10.1109/5.18626}}
* {{cite journal |author1=Lawrence R. Rabiner |author2=B. H. Juang |title=An introduction to hidden Markov models |journal=IEEE ASSP Magazine |date=January 1986 |pages=4–15}}
* {{cite book |author=Eugene Charniak |title=Statistical Language Learning |publisher=MIT Press |___location=Cambridge, Massachusetts |year=1993 |isbn=978-0-262-53141-2}}
* <cite id=RussellNorvig10>{{cite book |author1=Stuart Russell |author2=Peter Norvig |title=Artificial Intelligence: A Modern Approach |edition=3rd |publisher=Pearson Education/Prentice-Hall |___location=Upper Saddle River, New Jersey |year=2010 |isbn=978-0-13-604259-4}}</cite>
 
==External links==
* [http://www.cs.jhu.edu/~jason/papers/#eisner-2002-tnlp An interactive spreadsheet for teaching the forward–backward algorithm] (spreadsheet and article with step-by-step walk-through)
* [https://www.cs.brown.edu/research/ai/dynamics/tutorial/Documents/HiddenMarkovModels.html Tutorial of hidden Markov models including the forward–backward algorithm]
* [http://code.google.com/p/aima-java/ Collection of AI algorithms implemented in Java] (including HMM and the forward–backward algorithm)
 
{{DEFAULTSORT:Forward-backward algorithm}}
[[Category:Articles with example Python (programming language) code]]
[[Category:Dynamic programming]]
[[Category:Error detection and correction]]