{{DISPLAYTITLE:word ''n''-gram language model}}
A '''word ''n''-gram language model''' is a statistical [[language model]] of natural language that assigns probabilities to sequences of words, based on the (over-simplified) assumption that the probability of the next word in a sequence depends only on a fixed-size window of previous words. If only one previous word is considered, it is called a bigram model; if two words, a trigram model; if ''n''&nbsp;−&nbsp;1 words, an ''n''-gram model.<ref name=jm/>
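The conditional probabilities of such a model are typically estimated by maximum likelihood, i.e. as relative frequencies of word pairs in a training corpus. The following is a minimal sketch (not from the cited source) of a bigram model trained this way; the function name and the <code><nowiki><s></nowiki></code>/<code><nowiki></s></nowiki></code> sentence-boundary markers are illustrative assumptions:

```python
from collections import defaultdict

def train_bigram_model(corpus):
    """Estimate bigram probabilities P(w_i | w_{i-1}) by maximum likelihood.

    Illustrative sketch: probabilities are relative frequencies of word
    pairs, with assumed <s> / </s> sentence-boundary markers.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        # Count each adjacent word pair (previous word, current word).
        for prev, curr in zip(tokens, tokens[1:]):
            counts[prev][curr] += 1
    # Normalize counts per previous word to obtain P(curr | prev).
    probs = {}
    for prev, following in counts.items():
        total = sum(following.values())
        probs[prev] = {w: c / total for w, c in following.items()}
    return probs

model = train_bigram_model(["I saw the red house", "I saw the dog"])
# In this toy corpus "I" is always followed by "saw",
# while "the" is followed by "red" or "dog" equally often.
```

Such unsmoothed estimates assign probability zero to any word pair not seen in training, which is why practical ''n''-gram models add a smoothing step.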
 
For example, a bigram language model would assign the probabilities of words in the sentence ''I saw the red house'' as: