{{primary sources|date=August 2019}}
A LAMSTAR neural network may serve as a dynamic neural network in the spatial or time domain, or both. Its speed comes from [[Hebbian]] link-weights<ref name="book2013a">D. Graupe, "[https://books.google.com/books?id=W6W6CgAAQBAJ&printsec=frontcover#v=onepage&q&f=false Principles of Artificial Neural Networks].3rd Edition", World Scientific Publishers, 2013, pp. 203–274.</ref> that integrate the various, usually different, filters (preprocessing functions) across its many layers and dynamically rank the significance of those layers and filters relative to a given learning task. This loosely imitates biological learning, which integrates various preprocessors ([[cochlea]], [[retina]], ''etc.'') and cortices ([[Auditory cortex|auditory]], [[Visual cortex|visual]], ''etc.'') and their various regions. Its deep learning capability is further enhanced by the use of inhibition and correlation, and by its ability to cope with incomplete data or with "lost" neurons or layers even in the midst of a task. It is fully transparent due to its link weights: these allow dynamic determination of innovation and redundancy, and facilitate the ranking of layers, filters, or individual neurons relative to a task.
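The role of link weights described above can be illustrated with a toy sketch. The function names (<code>update_link_weights</code>, <code>rank_layers</code>), the reward values, and the exact update rule below are illustrative assumptions, not the published LAMSTAR algorithm; the sketch only shows the general idea of reward-modulated Hebbian link weights being used to rank layers by task relevance.

```python
import numpy as np

# Illustrative sketch only (not the published LAMSTAR algorithm):
# each preprocessing layer has a scalar link weight toward the output
# decision; a Hebbian-style, reward-modulated update strengthens links
# to layers whose winning neurons supported correct decisions.

def update_link_weights(link_weights, rewards, lr=0.1):
    """Strengthen links with positive reward, weaken those with negative
    reward (a simplified reward-modulated Hebbian rule)."""
    return link_weights + lr * rewards

def rank_layers(link_weights):
    """Rank layers by link-weight magnitude, most significant first,
    mirroring how link weights can rank layers/filters for a task."""
    return list(np.argsort(-np.abs(link_weights)))

# Example: three layers; layer 1 repeatedly earns the largest reward
# (reward values are made up for this illustration).
L = np.zeros(3)
for _ in range(5):
    L = update_link_weights(L, rewards=np.array([-0.2, 1.0, 0.5]))
print(rank_layers(L))  # layer 1 ranked most significant
```

Because the ranking is read directly off the link weights, the network's preference for one layer or filter over another is inspectable at any time, which is the transparency property noted above.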