{{primary sources|date=August 2019}}
A '''large memory storage and retrieval''' ('''LAMSTAR''') neural network is a type of [[artificial neural network]].
A LAMSTAR neural network may serve as a dynamic neural network in the spatial domain, the time domain, or both. Its speed is provided by [[Hebbian]] link-weights<ref name="book2013a">D. Graupe, "[https://books.google.com/books?id=W6W6CgAAQBAJ&printsec=frontcover#v=onepage&q&f=false Principles of Artificial Neural Networks].3rd Edition", World Scientific Publishers, 2013, pp. 203–274.</ref> that integrate the various, usually different, filters (preprocessing functions) into its many layers and dynamically rank the significance of those layers and functions relative to a given learning task. This vaguely imitates biological learning, which integrates various preprocessors ([[cochlea]], [[retina]], ''etc.'') and cortexes ([[Auditory cortex|auditory]], [[Visual cortex|visual]], ''etc.'') and their various regions. Its deep learning capability is further enhanced by the use of inhibition and correlation, and by its ability to cope with incomplete data or with "lost" neurons or layers even in the middle of a task. It is fully transparent due to its link weights, which allow dynamic determination of innovation and redundancy and facilitate the ranking of layers, filters, or individual neurons relative to a task.
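The reward/punishment update of Hebbian link weights described above can be illustrated with a minimal sketch. This is not the published LAMSTAR algorithm; the function name, learning rate, and layer sizes are illustrative assumptions. It shows only the core idea: link weights between layers are incremented when they contribute to a correct decision and decremented otherwise, and the accumulated weights then rank neurons (and, by aggregation, layers) by significance to the task.

```python
import numpy as np

def update_link_weight(w, reward, eta=0.05):
    # Hypothetical reward/punishment rule: reinforce a link that
    # co-fired with a correct output, weaken one that did not.
    return w + eta if reward else w - eta

# Link weights from 3 input-layer neurons to one output-layer neuron
# (all names and sizes are illustrative, not from the LAMSTAR paper).
links = np.zeros(3)

# Suppose neuron 1 repeatedly co-fires with correct decisions,
# while neuron 0 co-fires once with an incorrect one.
for _ in range(10):
    links[1] = update_link_weight(links[1], reward=True)
links[0] = update_link_weight(links[0], reward=False)

# The accumulated link weights provide a transparent significance
# ranking of neurons relative to the task (largest weight first).
ranking = np.argsort(-links)
```

Because the link weights are plain numbers, ranking significance (here via `argsort`) requires no extra machinery, which is the sense in which the network is described as transparent.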
== External links ==
[[Category:Neural networks]]