Recurrent neural network: Difference between revisions

 
===Second order RNNs===
Second-order RNNs use higher-order weights <math>w_{ijk}</math> in place of the standard <math>w_{ij}</math> weights, so that the next state is computed from products of state and input activations. This permits a direct mapping to a [[finite-state machine]] in training, stability, and representation.<ref>{{cite journal |first1=C. Lee |last1=Giles |first2=Clifford B. |last2=Miller |first3=Dong |last3=Chen |first4=Hsing-Hen |last4=Chen |first5=Guo-Zheng |last5=Sun |first6=Yee-Chun |last6=Lee |url=https://clgiles.ist.psu.edu/pubs/NC1992-recurrent-NN.pdf<!-- https://www.semanticscholar.org/paper/Learning-and-Extracting-Finite-State-Automata-with-Giles-Miller/872cdc269f3cb59f8a227818f35041415091545f --> |title=Learning and Extracting Finite State Automata with Second-Order Recurrent Neural Networks |journal=Neural Computation |volume=4 |issue=3 |pages=393–405 |year=1992 |doi=10.1162/neco.1992.4.3.393 |s2cid=19666035 }}</ref><ref>{{cite journal |first1=Christian W. |last1=Omlin |first2=C. Lee |last2=Giles |title=Constructing Deterministic Finite-State Automata in Recurrent Neural Networks |journal=Journal of the ACM |volume=45 |issue=6 |pages=937–972 |year=1996 |doi=10.1145/235809.235811 |citeseerx=10.1.1.32.2364 |s2cid=228941 }}</ref> Long short-term memory is an example of this, but it has no such formal mappings or proof of stability.
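The multiplicative update can be sketched as follows (a minimal NumPy illustration, not from the cited papers; the tensor shapes, random initialization, and sigmoid activation are assumptions). A third-order weight tensor <math>w_{ijk}</math> couples each pair of current state <math>s_i</math> and input <math>x_j</math> to the next state <math>s_k</math>:

```python
import numpy as np

def second_order_step(W, state, x):
    """One second-order RNN update:
    next_k = sigmoid( sum_ij W[i, j, k] * state[i] * x[j] )
    so each product of a state unit and an input unit contributes
    to the next state, rather than a weighted sum of either alone."""
    pre = np.einsum('ijk,i,j->k', W, state, x)
    return 1.0 / (1.0 + np.exp(-pre))  # sigmoid activation

rng = np.random.default_rng(0)
n_states, n_inputs = 4, 3  # illustrative sizes
W = rng.normal(scale=0.1, size=(n_states, n_inputs, n_states))

state = np.zeros(n_states)
for x in rng.normal(size=(5, n_inputs)):  # a short input sequence
    state = second_order_step(W, state, x)
```

Because each transition is selected jointly by the current state and the current input symbol, the tensor <math>W</math> plays the role of a finite-state machine's transition table, which is what makes the direct FSM mapping possible.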
 
===Hierarchical recurrent neural network===