{{Distinguish|recursive neural network}}
{{Machine learning|Artificial neural network}}
A '''recurrent neural network''' ('''RNN''') is a class of [[artificial neural network]]s where connections between nodes form a [[directed graph|directed]] or [[Graph (discrete mathematics)|undirected graph]] along a temporal sequence. This allows them to exhibit temporal dynamic behavior. Derived from [[feedforward neural networks]], RNNs can use their internal state (memory) to process variable-length sequences of inputs.<ref>{{cite journal |last1=Dixit |first1=P. |last2=Silakari |first2=S. |date=2021 |title=Deep Learning Algorithms for Cybersecurity Applications: A Technological and Status Review |url=https://doi.org/10.1016/j.cosrev.2020.100317 |journal=Computer Science Review |volume=39 |pages= | doi=10.1016/j.cosrev.2020.100317}}</ref><ref>{{Cite journal|last=Dupond|first=Samuel|date=2019|title=A thorough review on the current advance of neural network structures.|url=https://www.sciencedirect.com/journal/annual-reviews-in-control|journal=Annual Reviews in Control|volume=14|pages=200–230}}</ref><ref>{{Cite journal|date=2018-11-01|title=State-of-the-art in artificial neural network applications: A survey|journal=Heliyon|language=en|volume=4|issue=11|pages=e00938|doi=10.1016/j.heliyon.2018.e00938|issn=2405-8440|doi-access=free|last1=Abiodun|first1=Oludare Isaac|last2=Jantan|first2=Aman|last3=Omolara|first3=Abiodun Esther|last4=Dada|first4=Kemi Victoria|last5=Mohamed|first5=Nachaat Abdelatif|last6=Arshad|first6=Humaira|pmid=30519653|pmc=6260436}}</ref><ref>{{Cite journal|date=2018-12-01|title=Time series forecasting using artificial neural networks methodologies: A systematic review|journal=Future Computing and Informatics Journal|language=en|volume=3|issue=2|pages=334–340|doi=10.1016/j.fcij.2018.10.003|issn=2314-7288|doi-access=free|last1=Tealab|first1=Ahmed}}</ref> This makes them applicable to tasks such as unsegmented, 
connected [[handwriting recognition]]<ref>{{cite journal |last1=Graves |first1=Alex |author-link1=Alex Graves (computer scientist) |last2=Liwicki |first2=Marcus |last3=Fernandez |first3=Santiago |last4=Bertolami |first4=Roman |last5=Bunke |first5=Horst |last6=Schmidhuber |first6=Jürgen |author-link6=Jürgen Schmidhuber |title=A Novel Connectionist System for Improved Unconstrained Handwriting Recognition |url=http://www.idsia.ch/~juergen/tpami_2008.pdf |journal=IEEE Transactions on Pattern Analysis and Machine Intelligence |volume=31 |issue=5 |pages=855–868 |year=2009 |doi=10.1109/tpami.2008.137 |pmid=19299860 |citeseerx=10.1.1.139.4502 |s2cid=14635907 }}</ref> or [[speech recognition]].<ref name="sak2014">{{Cite web |url=https://research.google.com/pubs/archive/43905.pdf |title=Long Short-Term Memory recurrent neural network architectures for large scale acoustic modeling |last1=Sak |first1=Haşim |last2=Senior |first2=Andrew |last3=Beaufays | first3=Françoise |year=2014 }}</ref><ref name="liwu2015">{{cite arXiv |last1=Li |first1=Xiangang |last2=Wu |first2=Xihong |date=2014-10-15 |title=Constructing Long Short-Term Memory based Deep Recurrent Neural Networks for Large Vocabulary Speech Recognition |eprint=1410.4281 |class=cs.CL }}</ref> Recurrent neural networks are theoretically [[Turing complete]] and can run arbitrary programs to process arbitrary sequences of inputs.<ref>{{cite journal|last1=Hyötyniemi|first1=Heikki|date=1996|title=Turing machines are recurrent neural networks|journal=Proceedings of STeP '96/Publications of the Finnish Artificial Intelligence Society|pages=13–24}}</ref>
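The "internal state" mentioned above can be illustrated with a minimal Elman-style recurrent step. The sketch below is not from the article and uses illustrative names and weights; it shows how a single set of parameters, reapplied at each time step, processes input sequences of any length because the hidden state <code>h</code> carries information forward:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
W_x = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input-to-hidden weights
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden-to-hidden (the recurrent connection)
b = np.zeros(n_hidden)

def rnn_forward(xs):
    """Process a variable-length sequence, returning the final hidden state."""
    h = np.zeros(n_hidden)  # internal state ("memory"), updated at every step
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h + b)
    return h

# The same parameters handle sequences of different lengths.
short_seq = rng.normal(size=(2, n_in))
long_seq = rng.normal(size=(7, n_in))
print(rnn_forward(short_seq).shape)  # (4,)
print(rnn_forward(long_seq).shape)   # (4,)
```

Because the weights are shared across time steps, the parameter count is independent of the sequence length, which is what makes RNNs suitable for unsegmented inputs such as handwriting or speech.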
The term "recurrent neural network" is used to refer to the class of networks with an [[infinite impulse response]], whereas "[[convolutional neural network]]" refers to the class of networks with a [[finite impulse response]]. Both classes of networks exhibit temporal [[dynamic system|dynamic behavior]].<ref>{{Cite journal |last=Miljanovic |first=Milos |date=Feb–Mar 2012 |title=Comparative analysis of Recurrent and Finite Impulse Response Neural Networks in Time Series Prediction |url=http://www.ijcse.com/docs/INDJCSE12-03-01-028.pdf |journal=Indian Journal of Computer Science and Engineering |volume=3 |issue=1 }}</ref> A finite impulse recurrent network is a [[directed acyclic graph]] that can be unrolled and replaced with a strictly feedforward neural network, while an infinite impulse recurrent network is a [[directed cyclic graph]] that cannot be unrolled.
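The unrolling described above can be made concrete with a short sketch (illustrative code, not from the article): a recurrence applied for a fixed number of steps is the same computation as a feedforward network of that depth with shared weights.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
W_x = rng.normal(size=(n, n))  # input weights (shared across steps)
W_h = rng.normal(size=(n, n))  # recurrent weights (shared across steps)

def recurrent(xs):
    """Recurrent view: one step function applied in a loop."""
    h = np.zeros(n)
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h)
    return h

def unrolled_3(x0, x1, x2):
    """Unrolled view: the same computation as three explicit feedforward layers."""
    h1 = np.tanh(W_x @ x0 + W_h @ np.zeros(n))
    h2 = np.tanh(W_x @ x1 + W_h @ h1)
    h3 = np.tanh(W_x @ x2 + W_h @ h2)
    return h3

xs = rng.normal(size=(3, n))
print(np.allclose(recurrent(xs), unrolled_3(*xs)))  # True
```

This unrolling is only possible for a fixed, finite number of steps; a network whose cycle must run for an unbounded number of steps (an infinite impulse response) has no finite feedforward equivalent.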