Word n-gram language model: Difference between revisions
{{DISPLAYTITLE:''n''-gram language model}}
An '''''n''-gram language model''' is a [[language model]] that models sequences of words as a [[Markov process]]. It relies on the simplifying assumption that the probability of the next word in a sequence depends only on a fixed-size window of previous words. A bigram model conditions on one previous word, a trigram model on two, and in general an ''n''-gram model conditions on ''n'' − 1 words of preceding context.<ref name=jm/>
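The idea above can be sketched with a minimal bigram model: count how often each word follows each other word in a corpus, then normalize the counts into conditional probabilities. The corpus, function name, and boundary markers below are illustrative assumptions, not part of any particular implementation.

```python
from collections import defaultdict, Counter

def train_bigram_model(corpus):
    """Estimate P(next word | previous word) by maximum likelihood
    from a list of tokenized sentences (illustrative sketch)."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        # Add sentence-boundary markers so the model can also
        # estimate which words begin and end sentences.
        tokens = ["<s>"] + sentence + ["</s>"]
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    # Normalize raw counts into conditional probabilities.
    return {
        prev: {w: c / sum(followers.values()) for w, c in followers.items()}
        for prev, followers in counts.items()
    }

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
model = train_bigram_model(corpus)
print(model["the"]["cat"])  # 0.5: "the" is followed by "cat" in 1 of 2 cases
```

A trigram model would condition on the two previous words instead, using pairs of tokens as keys; real systems also apply smoothing so that unseen word sequences do not receive zero probability.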