The '''Viterbi algorithm''', named after its developer [[Andrew Viterbi]], is a [[dynamic programming]] [[algorithm]] for finding the most [[likelihood|likely]] sequence of hidden states – known as the '''Viterbi path''' – that results in a sequence of observed events, especially in the context of [[hidden Markov model]]s. The '''forward algorithm''' is a closely related algorithm for computing the probability of a sequence of observed events.
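In symbols (the notation here is added for concreteness and is not part of the original text): given observations <math>y_1, \ldots, y_T</math> and hidden states <math>x_1, \ldots, x_T</math>, the Viterbi algorithm computes the most likely state sequence

:<math>\hat{x}_{1:T} = \arg\max_{x_{1:T}} P(x_{1:T} \mid y_{1:T}),</math>

while the forward algorithm computes the total probability of the observations,

:<math>P(y_{1:T}) = \sum_{x_{1:T}} P(x_{1:T}, y_{1:T}).</math>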
The Viterbi algorithm was originally conceived as an [[error-correction]] scheme for noisy digital communication links, and has found universal application in decoding the [[convolutional code]]s used in [[CDMA]] and [[GSM]] digital cellular, dial-up modems, satellite and deep-space communications, and [[802.11]] wireless LANs. It is now also commonly used in [[information theory]], [[speech recognition]], [[keyword spotting]], [[computational linguistics]], and [[bioinformatics]]. For example, in speech-to-text speech recognition, the acoustic signal is treated as the observed sequence of events, and a string of text is considered to be the "hidden cause" of the acoustic signal. The Viterbi algorithm finds the most likely string of text given the acoustic signal.
The algorithm is not general; it makes a number of assumptions. First, both the observed events and the hidden events must form a sequence, which often corresponds to time. Second, the two sequences must be aligned: each observed event must correspond to exactly one hidden event. Third, computing the most likely hidden sequence up to a certain point ''t'' must depend only on the observed event at point ''t'' and the most likely sequence at point ''t'' − 1. These assumptions are all satisfied in a first-order hidden Markov model.
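Under these assumptions, the most likely path can be built up one observation at a time. One common way to write the recurrence (the symbols below are introduced here for illustration): let <math>V_{t,k}</math> be the probability of the most likely hidden sequence ending in state <math>k</math> that accounts for the first <math>t</math> observations. Then

:<math>V_{1,k} = P(y_1 \mid k) \cdot \pi_k</math>
:<math>V_{t,k} = P(y_t \mid k) \cdot \max_{x} \left( a_{x,k} \cdot V_{t-1,x} \right),</math>

where <math>\pi_k</math> is the initial probability of state <math>k</math> and <math>a_{x,k}</math> is the probability of a transition from state <math>x</math> to state <math>k</math>. The Viterbi path itself is recovered by remembering which state <math>x</math> achieved each maximum.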
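As an illustration, consider the classic example in which the hidden states are the weather ('Sunny' or 'Rainy') and the observations are a friend's daily activities. The program below is a sketch of such an example; the code this excerpt originally referred to is not shown, and the model parameters here (start, transition, and emission probabilities) are assumed for illustration. In this formulation the emission is applied while transitioning out of a state, so the reported path also includes the most likely state ''after'' the last observation. With these assumed values the program reproduces the figures quoted below.

<pre>
states = ('Sunny', 'Rainy')
observations = ('walk', 'shop', 'clean')

# Assumed model parameters (not given in this excerpt).
start_probability = {'Sunny': 0.4, 'Rainy': 0.6}
transition_probability = {
    'Sunny': {'Sunny': 0.6, 'Rainy': 0.4},
    'Rainy': {'Sunny': 0.3, 'Rainy': 0.7},
}
emission_probability = {
    'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1},
    'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
}

def forward_viterbi(obs, states, start_p, trans_p, emit_p):
    # T[state] = (total forward probability of paths ending in state,
    #             most likely path ending in state, its probability)
    T = {}
    for state in states:
        T[state] = (start_p[state], [state], start_p[state])
    for output in obs:
        U = {}
        for next_state in states:
            total = 0
            argmax = None
            valmax = 0
            for state in states:
                (prob, v_path, v_prob) = T[state]
                # Emit the current output from `state`, then move to `next_state`.
                p = emit_p[state][output] * trans_p[state][next_state]
                prob *= p
                v_prob *= p
                total += prob
                if v_prob > valmax:
                    argmax = v_path + [next_state]
                    valmax = v_prob
            U[next_state] = (total, argmax, valmax)
        T = U
    # Sum forward probabilities and pick the best path over the final states.
    total = 0
    argmax = None
    valmax = 0
    for state in states:
        (prob, v_path, v_prob) = T[state]
        total += prob
        if v_prob > valmax:
            argmax = v_path
            valmax = v_prob
    return (total, argmax, valmax)

print(forward_viterbi(observations, states, start_probability,
                      transition_probability, emission_probability))
# Prints approximately:
# (0.033612, ['Sunny', 'Rainy', 'Rainy', 'Rainy'], 0.009408)
</pre>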
This reveals that the total probability of <code>['walk', 'shop', 'clean']</code> is 0.033612 and that the Viterbi path is <code>['Sunny', 'Rainy', 'Rainy', 'Rainy']</code>. The Viterbi path contains four states because, in this formulation, the third observation is generated by the third state together with a transition into a fourth state. In other words, given the observed activities, it was most likely sunny on the day your friend went for a walk, and it then started to rain the next day and kept on raining.
==Extensions==
With [[Iterative Viterbi Decoding]] one can find the subsequence of an observation that best matches (on average) a given HMM. The algorithm, developed by M. C. Silaghi (1998), works by iterating calls to a modified Viterbi algorithm, re-estimating the score for a filler until convergence.
==References==