A '''word n-gram language model''' is a [[language model]] that models sequences of words as a [[Markov process]]. It makes use of the simplifying assumption that the probability of the next word in a sequence depends only on a fixed-size window of previous words. A bigram model considers one previous word, a trigram model considers two, and in general, an ''n''-gram model considers ''n''-1 words of previous context.<ref name=jm/>
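Under this assumption, the conditional probability of a word given its full history is approximated by truncating the history to the most recent ''n''-1 words:

<math display="block">P(w_i \mid w_1, \ldots, w_{i-1}) \approx P(w_i \mid w_{i-(n-1)}, \ldots, w_{i-1})</math>

Here <math>w_i</math> denotes the ''i''-th word of the sequence; for a bigram model (''n''&nbsp;=&nbsp;2) the right-hand side reduces to <math>P(w_i \mid w_{i-1})</math>.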
For example, a bigram language model models the probability of the sentence ''I saw the red house'' as: