Layered hidden Markov model

== Background ==
 
It is sometimes useful to use HMMs in specific structures because they can facilitate learning and generalization. For example, even though a fully connected HMM could always be used if enough training data were available, it is often useful to constrain the model by not allowing arbitrary state transitions. In the same way, it can be beneficial to embed the HMM in a layered structure which, theoretically, may not be able to solve any problems the basic HMM cannot, but can solve some problems more efficiently because fewer training data are needed.
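As an illustration, such a constraint can be expressed by fixing forbidden transition probabilities at zero in the transition matrix. The following sketch is illustrative only: the four-state, left-to-right topology is an arbitrary assumption used for the example, not part of the LHMM definition.

<syntaxhighlight lang="python">
import numpy as np

n_states = 4

# Fully connected HMM: every state may transition to every other state.
fully_connected = np.full((n_states, n_states), 1.0 / n_states)

# Constrained (left-to-right) HMM: each state may only remain where it is
# or advance to the next state, so all other transitions are fixed at zero.
left_to_right = np.zeros((n_states, n_states))
for i in range(n_states - 1):
    left_to_right[i, i] = 0.5      # probability of staying in state i
    left_to_right[i, i + 1] = 0.5  # probability of advancing to state i + 1
left_to_right[-1, -1] = 1.0        # final state is absorbing

print(fully_connected)
print(left_to_right)
</syntaxhighlight>

Entries fixed at zero reduce the number of free parameters to estimate, and transitions initialized at zero remain zero under Baum–Welch re-estimation, which is one reason such constrained structures require less training data.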
 
== The Layered Hidden Markov Model ==