Bidirectional recurrent neural networks

{{Orphan|date=March 2016}}
 
'''Bidirectional Recurrent Neural Networks''' ('''BRNN''') connect two hidden layers of opposite directions to the same output. With this form of [[Generative model|generative deep learning]], the output layer can get information from past (backwards) and future (forward) states simultaneously.<ref>{{Cite web|url=https://deepai.org/machine-learning-glossary-and-terms/bidirectional-recurrent-neural-networks|title=What are Bidirectional Recurrent Neural Networks?|website=deepai.org}}</ref> Invented in 1997 by Schuster and Paliwal,<ref name="Schuster">Schuster, Mike, and Kuldip K. Paliwal. "Bidirectional recurrent neural networks." Signal Processing, IEEE Transactions on 45.11 (1997): 2673-2681.</ref> BRNNs were introduced to increase the amount of input information available to the network. For example, [[multilayer perceptron]]s (MLPs) and [[time delay neural network]]s (TDNNs) have limitations on input data flexibility, as they require their input data to be fixed. Standard [[recurrent neural network]]s (RNNs) also have restrictions, as the future input information cannot be reached from the current state. On the contrary, BRNNs do not require their input data to be fixed, and their future input information is reachable from the current state.<ref>{{Cite web|url=https://arxiv.org/pdf/1801.01078.pdf|title=Recent Advances in Recurrent Neural Networks|website=arxiv.org}}</ref>
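The two-directional structure described above can be sketched as a minimal forward pass in plain numpy. This is an illustrative sketch, not the architecture from Schuster and Paliwal's paper verbatim: the function names, the tanh activation, and the choice to concatenate the two hidden states per time step are assumptions for demonstration.

```python
import numpy as np

def rnn_pass(xs, Wx, Wh, b):
    """Run a simple tanh RNN over a sequence; return the hidden state at each step."""
    h = np.zeros(Wh.shape[0])
    states = []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h + b)
        states.append(h)
    return states

def brnn_forward(xs, params_f, params_b):
    """One forward layer reads left-to-right, one backward layer reads right-to-left."""
    hs_f = rnn_pass(xs, *params_f)
    # Reverse the input, run the backward RNN, then re-align its states in time.
    hs_b = rnn_pass(xs[::-1], *params_b)[::-1]
    # Each output step now sees past (forward) and future (backward) context.
    return [np.concatenate([hf, hb]) for hf, hb in zip(hs_f, hs_b)]

# Toy example with hypothetical sizes: input dim 3, hidden dim 4, 5 time steps.
rng = np.random.default_rng(0)
d_in, d_h, T = 3, 4, 5
mk = lambda *shape: rng.standard_normal(shape) * 0.1
params_f = (mk(d_h, d_in), mk(d_h, d_h), np.zeros(d_h))
params_b = (mk(d_h, d_in), mk(d_h, d_h), np.zeros(d_h))
xs = [rng.standard_normal(d_in) for _ in range(T)]

outs = brnn_forward(xs, params_f, params_b)
print(len(outs), outs[0].shape)  # 5 (8,)
```

Each output vector has twice the hidden dimension because the forward and backward states are concatenated, which is how the output layer receives information from both past and future states simultaneously.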
 
BRNNs are especially useful when the context of the input is needed. For example, in [[handwriting recognition]], performance can be enhanced by knowledge of the letters located before and after the current letter.