== Neural period ==
[[File:A_development_of_natural_language_processing_tools.png|thumb|Timeline of natural language processing models]]
Neural [[Language Model|language models]] were developed in the 1990s. In 1990, the [[Recurrent neural network#Elman networks and Jordan networks|Elman network]], a [[recurrent neural network]], encoded each word in a training set as a vector, called a [[word embedding]], and the whole vocabulary as a [[vector database]], allowing it to perform sequence-prediction tasks that are beyond the power of a simple [[multilayer perceptron]]. A shortcoming of these static embeddings was that they did not differentiate between the multiple meanings of [[Homonym|homonyms]].<ref name="1990_ElmanPaper">{{cite journal |last=Elman |first=Jeffrey L. |date=March 1990 |title=Finding Structure in Time |url=http://doi.wiley.com/10.1207/s15516709cog1402_1 |journal=Cognitive Science |volume=14 |issue=2 |pages=179–211 |doi=10.1207/s15516709cog1402_1 |s2cid=2763403 |url-access=subscription}}</ref>
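The recurrence that lets such a network carry context across a sequence can be illustrated with a minimal sketch. The following toy Python example is an assumption-laden illustration rather than Elman's original implementation: the five-word vocabulary, the dimensions and the untrained random weights are all hypothetical. It looks up a static embedding for each word, updates a hidden state via <math>h_t = \tanh(x_t W_{xh} + h_{t-1} W_{hh})</math>, and emits a next-word distribution at every step.

<syntaxhighlight lang="python">
import numpy as np

# Toy Elman-style recurrent network (hypothetical example, untrained weights).
# Each word is mapped to a static embedding; the hidden state carries context.
rng = np.random.default_rng(0)

vocab = ["the", "cat", "sat", "on", "mat"]
V, E, H = len(vocab), 8, 16            # vocabulary, embedding, hidden sizes

W_emb = rng.normal(0, 0.1, (V, E))     # static word embeddings (one row per word)
W_xh  = rng.normal(0, 0.1, (E, H))     # input-to-hidden weights
W_hh  = rng.normal(0, 0.1, (H, H))     # hidden-to-hidden (recurrent) weights
W_hy  = rng.normal(0, 0.1, (H, V))     # hidden-to-output weights

def forward(word_ids):
    """Run h_t = tanh(x_t W_xh + h_{t-1} W_hh) over the sequence and return
    a next-word probability distribution for every position."""
    h = np.zeros(H)
    outputs = []
    for i in word_ids:
        x = W_emb[i]                          # look up the word's embedding
        h = np.tanh(x @ W_xh + h @ W_hh)      # recurrent state update
        logits = h @ W_hy
        p = np.exp(logits - logits.max())
        outputs.append(p / p.sum())           # softmax over the vocabulary
    return outputs

sentence = [vocab.index(w) for w in ["the", "cat", "sat"]]
probs = forward(sentence)
print(dict(zip(vocab, probs[-1].round(3))))   # P(next word | "the cat sat")
</syntaxhighlight>

Because the embedding matrix assigns a single fixed vector per word, the sketch also shows why such static representations cannot separate the different senses of a homonym.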
 
Yoshua Bengio and his co-authors developed the first neural probabilistic language model in 2000; the journal version appeared in 2003.<ref>{{cite journal |last=Bengio |first=Yoshua |author-link=Yoshua Bengio |title=A Neural Probabilistic Language Model |journal=Journal of Machine Learning Research |volume=3 |date=2003 |pages=1137–1155 |doi=10.1162/153244303322533223}}</ref>
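A feed-forward language model of this kind predicts the next word from a fixed window of preceding words. The sketch below is only an assumed, simplified illustration in the spirit of that architecture, not the authors' code: the vocabulary, layer sizes and untrained weights are hypothetical. It concatenates the embeddings of the previous words, passes them through a tanh hidden layer, and applies a softmax over the whole vocabulary.

<syntaxhighlight lang="python">
import numpy as np

# Toy feed-forward neural probabilistic language model (hypothetical sizes,
# untrained weights). Context = the n-1 previous words.
rng = np.random.default_rng(1)

vocab = ["the", "cat", "sat", "on", "mat"]
V, E, H, context = len(vocab), 8, 32, 3       # vocab, embedding, hidden, window

C   = rng.normal(0, 0.1, (V, E))              # shared word-embedding matrix
W_h = rng.normal(0, 0.1, (context * E, H))    # hidden-layer weights
b_h = np.zeros(H)
W_o = rng.normal(0, 0.1, (H, V))              # output-layer weights
b_o = np.zeros(V)

def next_word_distribution(prev_ids):
    """Return a probability over the vocabulary given the previous word ids."""
    x = np.concatenate([C[i] for i in prev_ids])   # concatenated context embeddings
    h = np.tanh(x @ W_h + b_h)                     # hidden layer
    logits = h @ W_o + b_o
    p = np.exp(logits - logits.max())
    return p / p.sum()                             # softmax over the vocabulary

p = next_word_distribution([vocab.index(w) for w in ["the", "cat", "sat"]])
print({w: round(float(pi), 3) for w, pi in zip(vocab, p)})
</syntaxhighlight>

In a trained model of this form, the embedding matrix and the network weights are learned jointly so that words appearing in similar contexts receive similar vectors.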
 
== Software ==