{{Short description|Computational approach}}
'''Hyperdimensional computing''' ('''HDC''') is an approach to computation, particularly [[artificial general intelligence]]. In HDC, information is represented as a ''hypervector'': a very long vector (often thousands of dimensions) in which meaning is carried by the overall pattern of values rather than by any single element.
{{Toclimit}}
== Process ==
Data is mapped from the input space to sparse HD space under an encoding function φ : ''X'' → ''H''. HD representations are stored in data structures that are subject to corruption by noise and hardware failures. Noisy or corrupted HD representations can still serve as input for learning, classification, and other tasks; they can also be decoded to recover the input data. ''H'' is typically restricted to range-limited integers (−''v'', ''v'').<ref name=":1">{{Cite journal |
This is analogous to the learning process conducted by the [[Drosophila|fruit fly]] olfactory system. The input is a roughly 50-dimensional vector corresponding to odor receptor neuron types. The HD representation uses ~2,000 dimensions.<ref name=":1" />
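A minimal sketch of such an encoding, in Python with NumPy, using a random projection into a bipolar hypervector; the dimensions, projection matrix, and noise model below are illustrative assumptions rather than any particular published scheme:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

d_in, d_hd = 50, 2000                       # e.g. ~50 receptor types mapped into a ~2,000-dimensional HD space
projection = rng.standard_normal((d_hd, d_in))

def encode(x):
    """Encoding function phi: map an input vector to a bipolar (+1/-1) hypervector."""
    return np.where(projection @ x >= 0, 1, -1)

x = rng.standard_normal(d_in)
h = encode(x)

# Corrupt 10% of the components, as noise or hardware faults might.
noisy = h.copy()
flipped = rng.choice(d_hd, size=d_hd // 10, replace=False)
noisy[flipped] *= -1

# The corrupted hypervector remains far more similar to the original than chance (~0),
# so it can still serve as input for learning or classification.
print(np.dot(h, noisy) / d_hd)              # about 0.8
</syntaxhighlight>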
== Transparency ==
HDC algebra reveals the logic of how and why systems make decisions, unlike [[artificial neural network]]s.
== Performance ==
HDC is suitable for low-power and fault-tolerant hardware, including [[in-memory processing|in-memory computing]] systems that perform computation where the data is stored.
Various teams have developed low-power HDC hardware accelerators.<ref name=":1" />
Nanoscale [[Memristor|memristive]] devices can be exploited to perform computation. An in-memory hyperdimensional computing system can implement operations on two memristive crossbar engines together with peripheral digital [[CMOS]] circuits. Experiments using 760,000 phase-change memory devices performing analog in-memory computing achieved accuracy comparable to software implementations.<ref name=":2">{{Cite journal |
== Errors ==
== Operations ==
HDC can combine hypervectors into new hypervectors using well-defined [[vector space]] operations.
[[Group (mathematics)|Groups]], [[Ring (mathematics)|rings]], and [[Field (mathematics)|fields]] over hypervectors become the underlying computing structures with addition, multiplication, permutation, mapping, and inverse as primitive computing operations.<ref name=":2" /> All computational tasks are performed in high-dimensional space using simple operations like element-wise additions and [[dot product]]s.<ref name=":1" />
Binding creates ordered point tuples and is also a function ⊗ : {{Var|H}} × {{Var|H}} → {{Var|H}}. The input is two points in {{Var|H}}, while the output is a dissimilar point. Multiplying the SHAPE vector with CIRCLE ''binds'' the two, representing the idea "SHAPE is CIRCLE". This vector is "nearly orthogonal" to SHAPE and CIRCLE. The components are recoverable from the vector (e.g., answer the question "is the shape a circle?").<ref name=":1" />
Addition creates a vector that combines concepts. For example, adding "SHAPE is CIRCLE" to "COLOR is RED" creates a vector that represents a red circle.
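The following sketch illustrates binding and addition, assuming bipolar hypervectors with element-wise multiplication as the binding operation and the dot product as the similarity measure; the dimensionality and role names are arbitrary:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
d = 10_000

def hypervector():
    """Random bipolar hypervector; two independent hypervectors are nearly orthogonal."""
    return rng.choice([-1, 1], size=d)

def similarity(a, b):
    return np.dot(a, b) / d

SHAPE, CIRCLE, COLOR, RED, SQUARE = [hypervector() for _ in range(5)]

# Binding: element-wise multiplication yields "SHAPE is CIRCLE",
# which is nearly orthogonal to both SHAPE and CIRCLE.
shape_is_circle = SHAPE * CIRCLE
color_is_red = COLOR * RED

# Addition bundles the bound pairs into one record representing a red circle.
red_circle = shape_is_circle + color_is_red

# Recovering a component: multiplying by SHAPE again (its own inverse for +/-1 values)
# gives a vector most similar to CIRCLE, answering "is the shape a circle?".
probe = red_circle * SHAPE
print(similarity(probe, CIRCLE) > similarity(probe, SQUARE))    # True
</syntaxhighlight>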
== History ==
Vector symbolic architectures (VSA) provided a systematic approach to high-dimensional symbol representations to support operations such as establishing relationships. Early examples include holographic reduced representations, binary spatter codes, and matrix binding of additive terms. HD computing advanced these models, particularly emphasizing hardware efficiency.<ref name=":1" />
In 2018, Eric Weiss showed how to fully represent an image as a hypervector. A single hypervector could contain information about all the objects in the image, including properties such as colors, positions, and sizes.
In 2023, Abbas Rahimi et al. used HDC with neural networks to solve [[Raven's Progressive Matrices|Raven's progressive matrices]].<ref name=":0" />
In 2023, Mike Heddes et al., under the supervision of Professors Givargis, Nicolau and Veidenbaum, created a [https://torchhd.readthedocs.io/en/stable/index.html# hyperdimensional computing library]<ref>{{Cite arXiv|last1=Heddes |first1=Mike |last2=Nunes |first2=Igor |last3=Vergés |first3=Pere |last4=Kleyko |first4=Denis |last5=Abraham |first5=Danny |last6=Givargis |first6=Tony |last7=Nicolau |first7=Alexandru |last8=Veidenbaum |first8=Alexander |date=2022-05-18 |title=Torchhd: An Open Source Python Library to Support Research on Hyperdimensional Computing and Vector Symbolic Architectures |class=cs.LG |language=en |eprint=2205.09208}}</ref> that is built on top of [[PyTorch]].
== Applications ==
Hypervectors can also be used for reasoning. Raven's progressive matrices presents images of objects in a grid. One position in the grid is blank. The test is to choose from candidate images the one that best fits.<ref name=":0" />
A dictionary of hypervectors represents individual objects. Each hypervector represents an object concept with its attributes. For each test image, a neural network generates a binary hypervector (with elements of +1 or −1) that is as close as possible to a combination of the dictionary hypervectors.
Another algorithm creates probability distributions for the number of objects in each image and their characteristics. These probability distributions describe the likely characteristics of both the context and candidate images. They too are transformed into hypervectors, then algebra predicts the most likely candidate image to fill the slot.<ref name=":0" />
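A highly simplified sketch of the final selection step only, with the neural network front end and the probabilistic reasoning omitted; the query and candidate hypervectors here are assumed to be given:

<syntaxhighlight lang="python">
import numpy as np

def best_candidate(query, candidates):
    """Return the index of the candidate hypervector most similar to the query."""
    sims = [np.dot(query, c) / (np.linalg.norm(query) * np.linalg.norm(c))
            for c in candidates]
    return int(np.argmax(sims))

# Hypothetical data: eight candidate images already encoded as hypervectors,
# and a query hypervector describing what should fill the blank grid position.
rng = np.random.default_rng(1)
d = 10_000
answer = rng.choice([-1, 1], size=d)
candidates = [rng.choice([-1, 1], size=d) for _ in range(7)] + [answer]
query = answer + rng.choice([-1, 1], size=d)    # a noisy estimate of the correct answer
print(best_candidate(query, candidates))        # 7
</syntaxhighlight>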
=== Other ===
== See also ==
* [[Support vector machine]]
== References ==
{{Reflist}}
* {{Cite journal |last1=Kleyko |first1=Denis |last2=Rachkovskij |first2=Dmitri A. |last3=Osipov |first3=Evgeny |last4=Rahimi |first4=Abbas |date=2023-07-31 |title=A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations |url=https://dl.acm.org/doi/10.1145/3538531 |journal=ACM Computing Surveys |language=en |volume=55 |issue=6 |pages=1–40 |doi=10.1145/3538531 |issn=0360-0300|arxiv=2111.06077 }}
* {{Cite journal |last1=Kleyko |first1=Denis |last2=Rachkovskij |first2=Dmitri |last3=Osipov |first3=Evgeny |last4=Rahimi |first4=Abbas |date=2023-09-30 |title=A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges |url=https://dl.acm.org/doi/10.1145/3558000 |journal=ACM Computing Surveys |language=en |volume=55 |issue=9 |pages=1–52 |doi=10.1145/3558000 |issn=0360-0300|arxiv=2112.15424 }}
== External links ==
* {{Citation | vauthors=((Stock, M.)), ((Van Criekinge, W.)), ((Boeckaerts, D.)), ((Taelman, S.)), ((Van Haeverbeke, M.)), ((Dewulf, P.)), ((De Baets, B.)) | veditors=((Dutt, V.)) | year=2024 | title=Hyperdimensional computing: a fast, robust, and interpretable paradigm for biological data | publisher=Public Library of Science (PLOS) | journal = PLOS Computational Biology| volume=20 | issue=9 | pages=e1012426 | doi=10.1371/journal.pcbi.1012426 | doi-access=free | pmid=39316621 | arxiv=2402.17572 }}
* {{Cite web |title=HD/VSA |url=https://www.hd-computing.com/ |date=2023-03-13 |access-date=2023-04-15 |website=www.hd-computing.com |language=en-US}}
* {{Cite journal |last1=Neubert |first1=Peer |last2=Schubert |first2=Stefan |last3=Protzel |first3=Peter |date=2019-12-01 |title=An Introduction to Hyperdimensional Computing for Robotics |url=https://doi.org/10.1007/s13218-019-00623-z |journal=KI - Künstliche Intelligenz |language=en |volume=33 |issue=4 |pages=319–330 |doi=10.1007/s13218-019-00623-z |issn=1610-1987}}
* {{Cite journal |last=Kanerva |first=Pentti |date=2009-06-01 |title=Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors |url=https://doi.org/10.1007/s12559-009-9009-8 |journal=Cognitive Computation |language=en |volume=1 |issue=2 |pages=139–159 |doi=10.1007/s12559-009-9009-8 |s2cid=733980 |issn=1866-9964|url-access=subscription }}
* {{Cite arXiv |last1=Neubert |first1=Peer |last2=Schubert |first2=Stefan |date=2021-01-19 |title=Hyperdimensional computing as a framework for systematic aggregation of image descriptors |class=cs.CV |eprint=2101.07720v1 |language=en}}
* {{cite web |url=https://michielstock.github.io/posts/2022/2022-10-04-HDVtutorial/ |title=Tutorial on Hyperdimensional Computing |last=Stock |first=Michiel |date=2022-10-04 |access-date=2023-07-29}}
* {{Cite magazine
|last=Ananthaswamy
|first=Anil
|title=A New Approach to Computation Reimagines Artificial Intelligence
|language=en-US
|magazine=Quanta Magazine
|url=https://www.quantamagazine.org/a-new-approach-to-computation-reimagines-artificial-intelligence-20230413/
|access-date=2023-06-13
|date = 2023-04-13}}
[[Category:Artificial neural networks]]