Forward–backward algorithm: Difference between revisions

Line 24:
 
==Forward probabilities==
The following description will use matrices of probability values rather than probability distributions, although the forward–backward algorithm can be applied to continuous as well as discrete probability models.
 
We transform the probability distributions related to a given [[hidden Markov model]] into matrix notation as follows.
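As a rough illustration of this matrix notation (the concrete matrices of the article's worked example are not reproduced in this excerpt), a hypothetical two-state model with two observation symbols could be encoded as follows; all names and numbers below are illustrative assumptions, not the article's own:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical two-state hidden Markov model with two observation symbols.
# The values are illustrative only, not the article's worked example.

# T[i, j] = P(X_t = j | X_{t-1} = i): state-transition probabilities.
T = np.array([[0.7, 0.3],
              [0.3, 0.7]])

# B[i, k] = P(o_t = k | X_t = i): observation (emission) probabilities.
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# For each observation symbol k, a diagonal matrix O[k] holding the
# probability of observing k in each state; these diagonal matrices
# appear in the matrix form of the forward and backward recursions.
O = [np.diag(B[:, k]) for k in range(B.shape[1])]

# pi[i] = P(X_0 = x_i): initial state distribution, as a row vector.
pi = np.array([0.5, 0.5])
</syntaxhighlight>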
Line 73:
</math>
 
This value is the unnormalized forward [[probability vector]]. The ''i''-th entry of this vector provides:
 
:<math>
Line 115:
</math>
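A minimal sketch of the forward pass, assuming the row-vector convention described in this section and the hypothetical matrices defined above (the recursion f<sub>0:t</sub> = f<sub>0:t-1</sub> T O<sub>o<sub>t</sub></sub>, renormalized at each step; the function name and scaling details are assumptions, not the article's):

<syntaxhighlight lang="python">
import numpy as np

def forward(pi, T, O, observations):
    """Return normalized forward row vectors f_{0:0}, ..., f_{0:T}.

    pi           -- initial state distribution as a 1-D array (row vector)
    T            -- transition matrix, T[i, j] = P(X_t = j | X_{t-1} = i)
    O            -- list of diagonal observation matrices, one per symbol
    observations -- sequence of observation symbols (integer indices)
    """
    f = np.asarray(pi, dtype=float)
    forward_vectors = [f]
    for o in observations:
        f = f @ T @ O[o]    # unnormalized f_{0:t} = f_{0:t-1} T O_{o_t}
        f = f / f.sum()     # normalize so the entries sum to one
        forward_vectors.append(f)
    return forward_vectors
</syntaxhighlight>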
 
Notice that we are now using a [[Row and column vectors|column vector]] while the forward probabilities used row vectors. We can then work backwards using:
 
:<math>
Line 191:
</math>
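A corresponding sketch of the backward pass, working from the end of the observation sequence toward the beginning with column vectors. Whether the transition matrix appears transposed here depends on its orientation convention; with T[i, j] = P(X_t = j | X_{t-1} = i) as assumed in the earlier sketch, it enters untransposed. The rescaling at each step is optional and only for numerical stability:

<syntaxhighlight lang="python">
import numpy as np

def backward(T, O, observations):
    """Return backward vectors b_{0:T}, ..., b_{T:T}, aligned with the forward vectors."""
    n_states = T.shape[0]
    b = np.ones(n_states)          # b_{T:T} is a vector of ones
    backward_vectors = [b]
    for o in reversed(observations):
        b = T @ O[o] @ b           # b_{t:T} = T O_{o_{t+1}} b_{t+1:T}
        b = b / b.sum()            # optional rescaling for numerical stability
        backward_vectors.append(b)
    backward_vectors.reverse()     # index t now matches the forward vectors
    return backward_vectors
</syntaxhighlight>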
 
Notice that the [[transformation matrix]] is also transposed, but in our example the transpose is equal to the original matrix. Performing these calculations and normalizing the results provides:
 
:<math>
Line 428:
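Combining the two passes, the smoothed posterior at each step is the entry-wise product of the corresponding forward and backward vectors, renormalized to sum to one. A possible end-to-end use of the sketches above, on a made-up observation sequence and the hypothetical model defined earlier (not the article's example):

<syntaxhighlight lang="python">
def smooth(pi, T, O, observations):
    """Smoothed posteriors P(X_t | o_1, ..., o_T) for t = 0, ..., T."""
    fs = forward(pi, T, O, observations)
    bs = backward(T, O, observations)
    posteriors = []
    for f, b in zip(fs, bs):
        gamma = f * b                      # entry-wise product
        posteriors.append(gamma / gamma.sum())
    return posteriors

# Made-up observation sequence of symbol indices 0 and 1.
observations = [0, 0, 1, 0]
for t, gamma in enumerate(smooth(pi, T, O, observations)):
    print(t, gamma)
</syntaxhighlight>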
==External links==
* [http://www.cs.jhu.edu/~jason/papers/#eisner-2002-tnlp An interactive spreadsheet for teaching the forward–backward algorithm] (spreadsheet and article with step-by-step walk-through)
* [https://www.cs.brown.edu/research/ai/dynamics/tutorial/Documents/HiddenMarkovModels.html Tutorial of hidden Markov models including the forward–backward algorithm]
* [http://code.google.com/p/aima-java/ Collection of AI algorithms implemented in Java] (including HMM and the forward–backward algorithm)