{{Short description|Computational approach}}
{{orphan|date=April 2023}}
'''Hyperdimensional computing''' ('''HDC''') is an approach to computation, particularly [[artificial intelligence]], in which information is represented as a hyperdimensional (long) [[Vector (mathematics and physics)|vector]] called a hypervector. A hypervector could comprise thousands of numbers that represent a point in a space of thousands of dimensions.<ref name=":0">{{Cite web |last=Ananthaswamy |first=Anil |date=April 13, 2023 |title=A New Approach to Computation Reimagines Artificial Intelligence |url=https://www.quantamagazine.org/a-new-approach-to-computation-reimagines-artificial-intelligence-20230413/?mc_cid=ad9a93c472&mc_eid=506130a407 |website=Quanta Magazine}}</ref> HDC is motivated by the observation that the [[cerebellum]] operates on high-dimensional data representations.<ref>{{Citation |last1=Zou |first1=Zhuowen |title=Spiking Hyperdimensional Network: Neuromorphic Models Integrated with Memory-Inspired Framework |date=2021-10-01 |arxiv=2110.00214 |last2=Alimohamadi |first2=Haleh |last3=Imani |first3=Farhad |last4=Kim |first4=Yeseong |last5=Imani |first5=Mohsen}}</ref> Vector symbolic architectures is an older name for the same broad approach.<ref name=":0" />
 
 
{{Toclimit}}
 
== Process ==
Data is mapped from the input space to sparse HD space under an encoding function φ : X → H. HD representations are stored in data structures that are subject to corruption by noise or hardware failures. Noisy or corrupted HD representations can still serve as input for learning, classification, etc. They can also be decoded to recover the input data. H is typically restricted to range-limited integers (−v to v).<ref name=":1">{{Cite journal |last1=Thomas |first1=Anthony |last2=Dasgupta |first2=Sanjoy |last3=Rosing |first3=Tajana |date=2021-10-05 |title=A Theoretical Perspective on Hyperdimensional Computing |url=https://redwood.berkeley.edu/wp-content/uploads/2021/08/Thomas2021.pdf |journal=Journal of Artificial Intelligence Research |language=en |volume=72 |pages=215–249 |doi=10.1613/jair.1.12664 |s2cid=239007517 |issn=1076-9757}}</ref>
 
This is analogous to the learning process conducted by the [[Drosophila|fruit fly]] olfactory system: the input is a roughly 50-dimensional vector corresponding to odor receptor neuron types, while the HD representation uses ~2,000 dimensions.<ref name=":1" />
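The encoding step φ : X → H can be illustrated with a random-projection encoder, one common choice in the HDC literature. This is a minimal sketch, not the method of any particular paper; the dimensions echo the fruit-fly example above, and the sign thresholding to a bipolar hypervector is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

D_IN, D_HD = 50, 2000  # ~50 receptor types -> ~2,000-dimensional HD space

# A fixed random projection followed by sign thresholding maps each
# input vector to a bipolar (+1/-1) hypervector.
projection = rng.standard_normal((D_HD, D_IN))

def encode(x: np.ndarray) -> np.ndarray:
    """Map an input vector from X to a bipolar hypervector in H."""
    return np.sign(projection @ x)

x = rng.standard_normal(D_IN)
h = encode(x)
```

Because the projection is fixed, similar inputs map to similar hypervectors, which is what allows downstream learning to tolerate noise in the representation.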
 
== Transparency ==
HDC algebra reveals the logic of how and why systems make decisions, unlike [[artificial neural network]]s. Physical-world objects can be mapped to hypervectors, to be processed by the algebra.<ref name=":0" />
 
== Performance ==
HDC is suitable for "in-memory computing" systems, which compute and hold data on a single chip, avoiding data transfer delays. Analog devices operate at low voltages; they are energy-efficient but prone to error-generating noise. HDC can tolerate such errors.<ref name=":0" />
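The error tolerance can be demonstrated directly: flipping a fraction of a hypervector's components, as noisy analog hardware might, degrades its similarity to the original only gradually. A minimal numpy sketch, not tied to any particular hardware; the 10% corruption rate is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000

h = rng.choice([-1, 1], size=D)

# Corrupt 10% of the components by flipping their signs.
noisy = h.copy()
flip = rng.choice(D, size=D // 10, replace=False)
noisy[flip] *= -1

# Cosine similarity drops from 1.0 only to 0.8, so the corrupted
# vector is still easily matched to the original, while an unrelated
# random hypervector stays near 0.
similarity = (h @ noisy) / D
unrelated = (h @ rng.choice([-1, 1], size=D)) / D
```

With 10% of components flipped, the similarity is exactly (D − 2·D/10)/D = 0.8, far above the near-zero similarity of unrelated random hypervectors.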
 
Various teams have developed low-power HDC hardware accelerators.<ref name=":1" />
 
Nanoscale [[Memristor|memristive]] devices can be exploited to perform computation. An in-memory hyperdimensional computing system can implement operations on two memristive crossbar engines together with peripheral digital [[CMOS]] circuits. Experiments using 760,000 phase-change memory devices performing analog in-memory computing achieved accuracy comparable to software implementations.<ref name=":2">{{Cite journal |last1=Karunaratne |first1=Geethan |last2=Le Gallo |first2=Manuel |last3=Cherubini |first3=Giovanni |last4=Benini |first4=Luca |last5=Rahimi |first5=Abbas |last6=Sebastian |first6=Abu |date=June 2020 |title=In-memory hyperdimensional computing |url=https://www.nature.com/articles/s41928-020-0410-3 |journal=Nature Electronics |language=en |volume=3 |issue=6 |pages=327–337 |doi=10.1038/s41928-020-0410-3 |arxiv=1906.01548 |s2cid=174797921 |issn=2520-1131}}</ref>
 
== Errors ==
 
== Operations ==
HDC can combine hypervectors into new hypervectors using well-defined [[vector space]] operations.
 
[[Group (mathematics)|Groups]], [[Ring (mathematics)|rings]], and [[Field (mathematics)|fields]] over hypervectors become the underlying computing structures with addition, multiplication, permutation, mapping, and inverse as primitive computing operations.<ref name=":2" /> All computational tasks are performed in high-dimensional space using simple operations like element-wise additions and [[dot product]]s.<ref name=":1" />
 
Binding creates ordered point tuples and is also a function ⊗ : H × H → H. The input is two points in {{Var|H}}, while the output is a dissimilar point. Multiplying the SHAPE vector with CIRCLE ''binds'' the two, representing the idea "SHAPE is CIRCLE". This vector is "nearly orthogonal" to SHAPE and CIRCLE. The components are recoverable from the vector (e.g., answer the question "is the shape a circle?").<ref name=":1" />
 
Addition creates a vector that combines concepts. For example, adding "SHAPE is CIRCLE" to "COLOR is RED" creates a vector that represents a red circle.
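The binding and bundling operations above can be sketched with bipolar hypervectors, where binding is element-wise multiplication and bundling is addition. This is a minimal illustration under those common conventions; the names SHAPE, CIRCLE, COLOR, and RED follow the running example in the text.

```python
import numpy as np

rng = np.random.default_rng(42)
D = 10_000

def hv():
    """Random bipolar hypervector; random hypervectors are nearly orthogonal."""
    return rng.choice([-1, 1], size=D)

def sim(a, b):
    """Cosine similarity between two bipolar hypervectors."""
    return a @ b / D

SHAPE, CIRCLE, COLOR, RED = hv(), hv(), hv(), hv()

# Binding: element-wise multiplication yields "SHAPE is CIRCLE",
# a vector nearly orthogonal to both of its inputs.
shape_is_circle = SHAPE * CIRCLE
assert abs(sim(shape_is_circle, SHAPE)) < 0.2

# For bipolar vectors binding is its own inverse: multiplying by
# SHAPE again recovers CIRCLE exactly.
assert np.array_equal(shape_is_circle * SHAPE, CIRCLE)

# Bundling: addition combines the bound pairs into "a red circle".
red_circle = shape_is_circle + COLOR * RED

# Each component is still recoverable (up to noise) from the bundle:
# unbinding with SHAPE gives a vector far closer to CIRCLE than to RED.
assert sim(red_circle * SHAPE, CIRCLE) > 0.9
assert abs(sim(red_circle * SHAPE, RED)) < 0.2
```

The near-orthogonality of independent random hypervectors is what makes the noisy cross-terms negligible during recovery: in 10,000 dimensions their similarity concentrates around 0 with a standard deviation of about 0.01.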
 
== History ==
Vector symbolic architectures (VSA) provided a systematic approach to high-dimensional symbol representations to support operations such as establishing relationships. Early examples include holographic reduced representations, binary spatter codes, and matrix binding of additive terms. HD computing advanced these models, particularly emphasizing hardware efficiency.<ref name=":1" />
 
In 2018, Eric Weiss showed how to fully represent an image as a hypervector. A vector could contain information about all the objects in the image, including properties such as color, position, and size.<ref name=":0" />
 
In 2023, Abbas Rahimi et al. used HDC with neural networks to solve [[Raven's Progressive Matrices|Raven's progressive matrices]].<ref name=":0" />
 
In 2023, Mike Heddes et al., under the supervision of Professors Givargis, Nicolau, and Veidenbaum, created Torchhd, a [https://torchhd.readthedocs.io/en/stable/index.html# hyperdimensional computing library]<ref>{{Cite arXiv|last1=Heddes |first1=Mike |last2=Nunes |first2=Igor |last3=Vergés |first3=Pere |last4=Kleyko |first4=Denis |last5=Abraham |first5=Danny |last6=Givargis |first6=Tony |last7=Nicolau |first7=Alexandru |last8=Veidenbaum |first8=Alexander |date=2022-05-18 |title=Torchhd: An Open Source Python Library to Support Research on Hyperdimensional Computing and Vector Symbolic Architectures |class=cs.LG |language=en |eprint=2205.09208}}</ref> built on top of [[PyTorch]].
 
== Applications ==
Hypervectors can also be used for reasoning. Raven's progressive matrices present images of objects in a grid, with one position left blank. The test is to choose from candidate images the one that best fits.<ref name=":0" />
 
A dictionary of hypervectors represents individual objects. Each hypervector represents an object concept with its attributes. For each test image a neural network generates a binary hypervector (values are +1 or −1) that is as close as possible to some set of dictionary hypervectors. The generated hypervector thus describes all the objects and their attributes in the image.<ref name=":0" />
 
Another algorithm creates probability distributions for the number of objects in each image and their characteristics. These probability distributions describe the likely characteristics of both the context and candidate images. They too are transformed into hypervectors, then algebra predicts the most likely candidate image to fill the slot.<ref name=":0" />
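Matching a generated hypervector against the dictionary amounts to a nearest-neighbor search under a similarity measure. A minimal sketch with made-up object names; in the system described above the query vector comes from a neural network, whereas here it is simulated by corrupting a stored entry.

```python
import numpy as np

rng = np.random.default_rng(7)
D = 10_000

# Hypothetical item memory: one random bipolar hypervector per concept.
dictionary = {name: rng.choice([-1, 1], size=D)
              for name in ["circle", "square", "triangle"]}

def cleanup(query):
    """Return the dictionary key whose hypervector is most similar to the query."""
    return max(dictionary, key=lambda name: query @ dictionary[name])

# Even a heavily corrupted query (30% of components flipped) still
# resolves to the correct dictionary entry.
noisy = dictionary["square"].copy()
flip = rng.choice(D, size=3 * D // 10, replace=False)
noisy[flip] *= -1
assert cleanup(noisy) == "square"
```

This "cleanup memory" step is standard across vector symbolic architectures: because unrelated random hypervectors are nearly orthogonal, even a badly degraded query keeps a large similarity margin over the wrong entries.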
 
=== Other ===
Other applications include bio-signal processing, natural language processing, and robotics.<ref name=":1" />
 
== See also ==
 
* [[Support vector machine]]
 
== References ==
{{Reflist}}

* {{Cite journal |last1=Kleyko |first1=Denis |last2=Rachkovskij |first2=Dmitri A. |last3=Osipov |first3=Evgeny |last4=Rahimi |first4=Abbas |date=2023-07-31 |title=A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations |url=https://dl.acm.org/doi/10.1145/3538531 |journal=ACM Computing Surveys |language=en |volume=55 |issue=6 |pages=1–40 |doi=10.1145/3538531 |issn=0360-0300|arxiv=2111.06077 }}
* {{Cite journal |last1=Kleyko |first1=Denis |last2=Rachkovskij |first2=Dmitri |last3=Osipov |first3=Evgeny |last4=Rahimi |first4=Abbas |date=2023-09-30 |title=A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges |url=https://dl.acm.org/doi/10.1145/3558000 |journal=ACM Computing Surveys |language=en |volume=55 |issue=9 |pages=1–52 |doi=10.1145/3558000 |issn=0360-0300|arxiv=2112.15424 }}
 
== External links ==
 
* {{Citation | vauthors=((Stock, M.)), ((Van Criekinge, W.)), ((Boeckaerts, D.)), ((Taelman, S.)), ((Van Haeverbeke, M.)), ((Dewulf, P.)), ((De Baets, B.)) | veditors=((Dutt, V.)) | year=2024 | title=Hyperdimensional computing: a fast, robust, and interpretable paradigm for biological data | publisher=Public Library of Science (PLOS) | journal = PLOS Computational Biology| volume=20 | issue=9 | pages=e1012426 | doi=10.1371/journal.pcbi.1012426 | doi-access=free | pmid=39316621 | arxiv=2402.17572 }}
 
* {{Citation | vauthors=((Cumbo, F.)), ((Chicco, D.)) | year=2025 | title=Hyperdimensional computing in biomedical sciences: a brief review | journal=PeerJ Computer Science | volume=11 | issue=e2885 | pages=e2885 | doi=10.7717/peerj-cs.2885 | doi-access=free | pmc=12192801 | language=en}}
 
* {{Cite journal |last=Kanerva |first=Pentti |date=2009-06-01 |title=Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors |url=https://doi.org/10.1007/s12559-009-9009-8 |journal=Cognitive Computation |language=en |volume=1 |issue=2 |pages=139–159 |doi=10.1007/s12559-009-9009-8 |s2cid=733980 |issn=1866-9964|url-access=subscription }}
 
* {{Cite journal |lastlast1=Neubert |firstfirst1=Peer |last2=Schubert |first2=Stefan |last3=Protzel |first3=Peter |date=2019-12-01 |title=An Introduction to Hyperdimensional Computing for Robotics |url=https://doi.org/10.1007/s13218-019-00623-z |journal=KI - Künstliche Intelligenz |language=en |volume=33 |issue=4 |pages=319–330 |doi=10.1007/s13218-019-00623-z |s2cid=202642163 |issn=1610-1987|url-access=subscription }}
 
* {{Cite arXiv |last1=Neubert |first1=Peer |last2=Schubert |first2=Stefan |date=2021-01-19 |title=Hyperdimensional computing as a framework for systematic aggregation of image descriptors |class=cs.CV |eprint=2101.07720v1 |language=en}}
 
* {{cite web
| url = https://michielstock.github.io/posts/2022/2022-10-04-HDVtutorial/
| title = Tutorial on Hyperdimensional Computing
| last = Stock
| first = Michiel
| date = 2022-10-04
| website =
| publisher =
| access-date = 2023-07-29
| quote = }}
 
* {{Cite web |title=HD/VSA |url=https://www.hd-computing.com/ |access-date=2023-04-15 |website=www.hd-computing.com |language=en-US | date = 2023-03-13}}
 
* {{Cite magazine
|last=Ananthaswamy
|first=Anil
|title=A New Approach to Computation Reimagines Artificial Intelligence
|language=en-US
|magazine=Quanta Magazine
|url=https://www.quantamagazine.org/a-new-approach-to-computation-reimagines-artificial-intelligence-20230413/
|access-date=2023-06-13
|date = 2023-04-13}}
 
[[Category:Artificial neural networks]]