Automatic summarization
 
===Recent approaches===
The rise of [[Transformer (machine learning model)|Transformer models]], which have largely replaced more traditional [[Rnn (software)|RNN]]s ([[LSTM]]s), has provided flexibility in mapping text sequences to text sequences of a different type, which is well suited to automatic summarization. Such models include T5<ref>{{Cite web |title=Exploring Transfer Learning with T5: the Text-To-Text Transfer Transformer |url=http://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html |access-date=2022-04-03 |website=Google AI Blog |language=en}}</ref> and Pegasus.<ref>Zhang, J., Zhao, Y., Saleh, M., & Liu, P. (2020, November). Pegasus: Pre-training with extracted gap-sentences for abstractive summarization. In International Conference on Machine Learning (pp. 11328-11339). PMLR.</ref>
 
==See also==