Sentence processing

==Ambiguity==
Sentence comprehension has to deal with ambiguity<ref>{{cite journal|last=Altmann|first=Gerry|title=Ambiguity in sentence processing|journal=Trends in Cognitive Sciences|date=April 1998|volume=2|issue=4|pages=146–151|doi=10.1016/s1364-6613(98)01153-x|pmid=21227111}}</ref> in spoken and written utterances, for example [[Ambiguity|lexical]], [[Syntactic ambiguity|structural]], and [[semantic ambiguity|semantic ambiguities]]. Ambiguity is ubiquitous, but people usually resolve it so effortlessly that they do not even notice it. For example, the sentence ''[[Time flies like an arrow]]'' has (at least) the interpretations ''Time moves as quickly as an arrow'', ''A special kind of fly, called a time fly, likes arrows'', and ''Measure the speed of flies as you would measure the speed of an arrow''. Usually, readers are aware of only the first interpretation. Educated readers, though, may spontaneously think of the [[arrow of time]] but inhibit that interpretation because it deviates from the original phrase, the temporal lobe acting as a switch.
Instances of ambiguity can be classified as '''local''' or '''global''' ambiguities. A sentence is globally ambiguous if it retains two distinct interpretations even after it has been read or heard in full, whereas a local ambiguity is resolved by the end of the sentence. Examples of global ambiguity are sentences like ''Someone shot the servant of the actress who was on the balcony'' (was it the servant or the actress who was on the balcony?) or ''The cop chased the criminal with a fast car'' (did the cop or the criminal have a fast car?). Comprehenders may have a preferred interpretation in either case, but syntactically and semantically, neither of the possible readings can be ruled out.
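Such global ambiguities can be made explicit by enumerating parses under a grammar. The following sketch is an editorial illustration rather than material from the cited literature: the toy grammar rules and the use of the NLTK chart parser are assumptions of the example, chosen to show the two attachment readings of ''The cop chased the criminal with a fast car''.

<syntaxhighlight lang="python">
# Minimal illustrative sketch (not from the cited sources): a toy
# context-free grammar in which "with a fast car" can attach either to the
# verb phrase (the cop used the car) or to the noun phrase (the criminal
# had the car). Requires the NLTK library (pip install nltk).
import nltk

grammar = nltk.CFG.fromstring("""
S   -> NP VP
NP  -> Det N | Det Adj N | NP PP
VP  -> V NP | VP PP
PP  -> P NP
Det -> 'the' | 'a'
Adj -> 'fast'
N   -> 'cop' | 'criminal' | 'car'
V   -> 'chased'
P   -> 'with'
""")

parser = nltk.ChartParser(grammar)
tokens = "the cop chased the criminal with a fast car".split()

# Two distinct trees are produced, one per attachment site: the sentence is
# globally ambiguous because neither reading can be ruled out syntactically.
for tree in parser.parse(tokens):
    tree.pretty_print()
</syntaxhighlight>

Running the sketch prints two complete parse trees, one per attachment site; a merely locally ambiguous sentence would, by contrast, yield only a single complete parse once all of its words have been consumed.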
 
===Computational modeling===
Computational modeling is another means of exploring language comprehension. Models, such as those instantiated in [[neural networks]], are particularly useful because they require theorists to be explicit in their hypotheses and because they can be used to generate accurate predictions for theoretical models that are so complex that they render [[discursive psychology|discursive analysis]] unreliable. A classic example of computational modeling in language research is [[James McClelland (psychologist)|McClelland]] and [[Jeff Elman|Elman's]] [[Trace (psycholinguistics)|TRACE]] model of speech perception.<ref>{{cite journal|last1=McClelland|first1=J. L.|last2=Elman|first2=J. L.|year=1986|title=The TRACE model of speech perception|journal=Cognitive Psychology|volume=18|pages=1–86}}</ref> A model of sentence processing can be found in Hale's (2011) 'rational' Generalized Left Corner parser.<ref>{{Cite journal|last1=Hale|first1=John T.|year=2011|title=What a Rational Parser Would Do|journal=Cognitive Science|volume=35|issue=3|pages=399–443|doi=10.1111/j.1551-6709.2010.01145.x}}</ref> This model derives garden-path effects as well as local-coherence phenomena.
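To give a flavour of how probabilistic models relate improbability to processing difficulty, the sketch below is an editorial illustration, not Hale's actual parser: it computes word-by-word surprisal from a small, hand-specified table of conditional probabilities (all of the numbers are invented for the example) for the classic garden-path sentence ''The horse raced past the barn fell''.

<syntaxhighlight lang="python">
# Minimal illustrative sketch (not Hale's parser): word-by-word surprisal,
# -log2 P(word | preceding words), from a hand-specified table of invented
# conditional probabilities for a classic garden-path sentence.
import math

# P(next word | context so far); all values are illustrative assumptions.
cond_prob = {
    ("the",):                                 {"horse": 0.10},
    ("the", "horse"):                         {"raced": 0.05},
    ("the", "horse", "raced"):                {"past": 0.40},
    ("the", "horse", "raced", "past"):        {"the": 0.60},
    ("the", "horse", "raced", "past", "the"): {"barn": 0.20},
    # After "the horse raced past the barn" the main-verb reading is strongly
    # preferred, so the disambiguating word "fell" gets a low probability.
    ("the", "horse", "raced", "past", "the", "barn"): {"fell": 0.01},
}

sentence = "the horse raced past the barn fell".split()

for i in range(1, len(sentence)):
    context, word = tuple(sentence[:i]), sentence[i]
    surprisal = -math.log2(cond_prob[context][word])
    print(f"{word:>6}: {surprisal:.2f} bits")

# The spike in surprisal at "fell" mirrors the processing difficulty readers
# show at the disambiguating word of a garden-path sentence.
</syntaxhighlight>

In actual models of this kind the conditional probabilities are derived from a probabilistic grammar or a corpus rather than written by hand, but the logic of tying a word's improbability in context to its processing cost is the same.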
 
Another example of computational modeling in language research is Mathieu Guidère's model of predictive linguistics.<ref>Guidère, M. (2015). ''La linguistique prédictive''. [https://www.amazon.com/linguistique-prédictive-Mathieu-Guidère/dp/2343055122]</ref> This model applies to linguistic phenomena related to action and emotion. It has been applied to the prediction of suicidal intention expressed online in French and English (see CMHA 4th Congress in Toronto, 23–25 September 2019, Session H6).