Time delay neural network

== Common libraries ==
* TDNNs can be implemented in virtually all machine-learning frameworks using one-dimensional [[convolutional neural network]]s, due to the equivalence of the methods.
* [[MATLAB]]: The neural network toolbox has explicit functionality designed to produce a time delay neural network given the step size of the time delays and an optional training function. The default training algorithm is a supervised back-propagation algorithm that updates filter weights based on Levenberg–Marquardt optimization. The function timedelaynet(delays, hidden_layers, train_fnc) returns a time-delay neural network architecture that a user can train and provide inputs to.<ref>''"[https://www.mathworks.com/help/deeplearning/time-series-and-dynamic-systems.html Time Series and Dynamic Systems - MATLAB & Simulink]".'' mathworks.com. Retrieved 21 June 2016.</ref>
* The [[Kaldi (software)|Kaldi ASR Toolkit]] has an implementation of TDNNs with several optimizations for speech recognition.<ref>{{cite book |doi=10.1109/ASRU.2015.7404842 |chapter-url=http://danielpovey.com/files/2015_asru_aspire.pdf |chapter=JHU ASpIRE system: Robust LVCSR with TDNNS, iVector adaptation and RNN-LMS |title=2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU) |date=2015 |last1=Peddinti |first1=Vijayaditya |last2=Chen |first2=Guoguo |last3=Manohar |first3=Vimal |last4=Ko |first4=Tom |last5=Povey |first5=Daniel |last6=Khudanpur |first6=Sanjeev |pages=539–546 |isbn=978-1-4799-7291-3 }}</ref>
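The equivalence noted above can be illustrated with a minimal sketch: a single TDNN unit computes, at each time step, a weighted sum over a fixed window of delayed input frames, which is exactly a one-dimensional convolution with stride 1. The function name and parameters below are illustrative, not from any particular library.

```python
# Minimal sketch: one TDNN unit over scalar input frames.
# Each output frame is a weighted sum over a sliding window of
# delayed inputs, i.e. a 1-D convolution (cross-correlation)
# with stride 1 and no padding.

def tdnn_layer(frames, weights, bias=0.0):
    """Apply one time-delay unit to a sequence of scalar frames.

    frames  : list of floats, one value per time step
    weights : list of floats, one weight per time delay (the kernel)
    Returns one output value per valid window position.
    """
    k = len(weights)
    return [
        sum(w * frames[t + d] for d, w in enumerate(weights)) + bias
        for t in range(len(frames) - k + 1)
    ]

# Example: a 3-delay difference filter (compares a frame with the
# frame two steps later), applied to a short ramp signal.
signal = [1.0, 2.0, 3.0, 4.0, 5.0]
out = tdnn_layer(signal, [1.0, 0.0, -1.0])  # → [-2.0, -2.0, -2.0]
```

A multi-layer TDNN stacks such units, so each successive layer sees a progressively wider temporal context, which is why any framework with 1-D convolutions can express it.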
 
== See also ==