==Probabilistic interpretation==
An [[associative memory (psychology)|associative memory]] system using [[Hierarchical temporal memory#Sparse distributed representations|sparse, distributed representations]] can be reinterpreted as an [[Importance sampling|importance sampler]], a [[Monte Carlo method|Monte Carlo]] method of approximating [[Bayesian inference]].<ref>Abbott, Joshua T., Jessica B. Hamrick, and Thomas L. Griffiths. "[https://web.archive.org/web/20170911115555/https://pdfs.semanticscholar.org/7f50/8bb0bf0010884a4be72f2774635514fc58ec.pdf Approximating Bayesian inference with a sparse distributed memory system]." Proceedings of the 35th Annual Conference of the Cognitive Science Society. 2013.</ref> The SDM can be considered a Monte Carlo approximation of a multidimensional [[conditional probability]] integral. It produces acceptable responses from a training set when this approximation is valid, that is, when the training set contains enough data to provide good estimates of the underlying [[Joint probability distribution|joint probabilities]], and there are enough Monte Carlo samples to obtain an accurate estimate of the integral.<ref>{{cite book|doi=10.1109/ijcnn.1989.118597|chapter=A conditional probability interpretation of Kanerva's sparse distributed memory|title=International Joint Conference on Neural Networks|pages=415–417|volume=1|year=1989|last1=Anderson|s2cid=13935339}}</ref>
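The correspondence can be made concrete with a small simulation. The following is a minimal NumPy sketch of a Kanerva-style binary SDM; the parameter choices (dimension <code>N</code>, number of hard locations <code>M</code>, activation <code>radius</code>) are illustrative assumptions, not values from the cited papers. The sum of counters pooled over randomly placed hard locations plays the role of the Monte Carlo samples described above: it estimates the conditional expectation of each output bit given the cue address, and thresholding the estimate at zero returns the most probable bit.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

N = 256        # dimension of addresses and stored words
M = 2000       # number of hard locations (the "Monte Carlo samples")
radius = 111   # Hamming-distance activation radius

# Fixed, randomly placed hard locations and their content counters.
hard_addresses = rng.integers(0, 2, size=(M, N))
counters = np.zeros((M, N), dtype=int)

def activated(address):
    """Boolean mask of hard locations within the Hamming radius of a cue."""
    return np.sum(hard_addresses != address, axis=1) <= radius

def write(address, word):
    """Nudge the counters of every activated location toward the word."""
    counters[activated(address)] += 2 * word - 1   # +1 for a 1 bit, -1 for a 0 bit

def read(address):
    """Pool counters over activated locations and threshold at zero.

    The pooled sum is a Monte Carlo estimate of the conditional
    expectation of each output bit given the cue address; the
    threshold picks the more probable value of each bit.
    """
    return (counters[activated(address)].sum(axis=0) > 0).astype(int)

# Usage: store one pattern autoassociatively, then recall it from a noisy cue.
pattern = rng.integers(0, 2, size=N)
write(pattern, pattern)

noisy = pattern.copy()
noisy[rng.choice(N, size=20, replace=False)] ^= 1   # flip 20 of the 256 bits

print(np.mean(read(noisy) == pattern))              # fraction of bits recovered
</syntaxhighlight>

In this sketch, reading with the noisy cue activates a slightly different set of hard locations than the original write did, but the two sets overlap heavily, so the pooled estimate still recovers the stored word; with many stored patterns, contributions from unrelated writes average out, which is the sampling behaviour the probabilistic interpretation describes.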
 
==See also==
* [[Memory-prediction framework]]
* [[Neural coding]]
* [[Neural Turing machine]]
* [[Random indexing]]
* [[Self-organizing map]]
* [[Semantic memory]]
* [[Semantic network]]
* [[Hierarchical temporal memory#Sparse distributed representations|Sparse distributed representations]]
* Stacked [[autoencoder]]s
* [[Vector space model]]