=== Quantum perceptrons ===
Many proposals attempt to find a quantum equivalent for the [[perceptron]] unit from which neural nets are constructed. A key problem is that nonlinear activation functions do not immediately correspond to the mathematical structure of quantum theory, since quantum evolution is described by linear operations and leads to probabilistic observation. Ideas for imitating the perceptron activation function with a quantum mechanical formalism range from special measurements<ref>{{cite journal |first=M. |last=Perus |title=Neural Networks as a basis for quantum associative memory |journal=Neural Network World |volume=10 |issue=6 |pages=1001 |year=2000 |citeseerx=10.1.1.106.4583 }}</ref><ref>{{cite journal |first1=M. |last1=Zak |first2=C. P. |last2=Williams |title=Quantum Neural Nets |journal=International Journal of Theoretical Physics |volume=37 |issue=2 |pages=651–684 |year=1998 |doi=10.1023/A:1026656110699 |s2cid=55783801 }}</ref> to postulating non-linear quantum operators (a mathematical framework that is disputed).<ref>{{Cite journal | doi=10.1006/jcss.2001.1769| title=Quantum Neural Networks| journal=Journal of Computer and System Sciences| volume=63| issue=3| pages=355–383| year=2001| last1=Gupta| first1=Sanjay| last2=Zia| first2=R.K.P.| arxiv=quant-ph/0201144| s2cid=206569020}}</ref><ref>{{cite web |first1=J. |last1=Faber |first2=G. A. |last2=Giraldi |title=Quantum Models for Artificial Neural
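The tension described above can be made concrete with a small numerical sketch (illustrative only; the gate and amplitudes are arbitrary choices, not drawn from any of the cited proposals): a unitary gate acts linearly on superpositions, while a typical perceptron activation such as the sigmoid does not.

```python
import numpy as np

# Hadamard gate: a simple unitary (hence linear) quantum operation
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

x = np.array([1.0, 0.0])   # basis state |0>
y = np.array([0.0, 1.0])   # basis state |1>
a, b = 0.6, 0.8            # amplitudes with |a|^2 + |b|^2 = 1

# Linearity: U(a|x> + b|y>) equals a*U|x> + b*U|y>
lhs = H @ (a * x + b * y)
rhs = a * (H @ x) + b * (H @ y)
print(np.allclose(lhs, rhs))   # True: unitary evolution is linear

# A sigmoid activation fails the same check, so no single unitary
# can reproduce it directly
sigmoid = lambda z: 1 / (1 + np.exp(-z))
print(np.allclose(sigmoid(a * x + b * y),
                  a * sigmoid(x) + b * sigmoid(y)))  # False: nonlinear
```

This is why the cited proposals resort either to measurement (which introduces nonlinearity probabilistically) or to non-standard nonlinear operators.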
=== Quantum networks ===
The first quantum associative memory algorithm was introduced by Dan Ventura and Tony Martinez in 1999.<ref>{{cite book |first1=D. |last1=Ventura |first2=T. |last2=Martinez |title=Artificial Neural Nets and Genetic Algorithms |chapter=A Quantum Associative Memory Based on Grover's Algorithm |chapter-url=https://pdfs.semanticscholar.org/d46f/e04b57b75a7f9c57f25d03d1c56b480ab755.pdf |archive-url=https://web.archive.org/web/20170911115617/https://pdfs.semanticscholar.org/d46f/e04b57b75a7f9c57f25d03d1c56b480ab755.pdf |url-status=dead |archive-date=2017-09-11 |pages=22–27 |year=1999 |doi=10.1007/978-3-7091-6384-9_5 |isbn=978-3-211-83364-3 |s2cid=3258510 }}</ref> The authors do not attempt to translate the structure of artificial neural network models into quantum theory, but instead propose an algorithm for a [[quantum circuit|circuit-based quantum computer]] that simulates [[associative memory (psychology)|associative memory]]. The memory states (in [[Hopfield neural network]]s saved in the weights of the neural connections) are written into a superposition, and a [[Grover search algorithm|Grover-like quantum search algorithm]] retrieves the memory state closest to a given input. As such, this is not a fully content-addressable memory, since patterns can be retrieved only from incomplete inputs, not from corrupted ones.
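The retrieval step can be sketched as amplitude amplification on a state vector (a toy simulation, not the Ventura–Martinez circuit itself; the register size and target pattern are arbitrary assumptions): a Grover iteration repeatedly flips the phase of the matching pattern and inverts all amplitudes about their mean, concentrating probability on it.

```python
import numpy as np

n = 3                      # 3 qubits -> 8 basis states (candidate patterns)
N = 2 ** n
target = 5                 # hypothetical stored pattern matching the input

# Uniform superposition over all basis states
state = np.full(N, 1 / np.sqrt(N))

# Standard Grover schedule: about (pi/4) * sqrt(N) iterations
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state[target] *= -1               # oracle: phase-flip the target pattern
    state = 2 * state.mean() - state  # diffusion: inversion about the mean

probs = np.abs(state) ** 2
print(np.argmax(probs))   # measurement now most likely yields the target
```

After two iterations on eight states, the target pattern carries over 90% of the measurement probability, illustrating how a Grover-like search singles out the closest memory state.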
The first truly content-addressable quantum memory, which can retrieve patterns also from corrupted inputs, was proposed by Carlo A. Trugenberger.<ref>{{Cite journal |last=Trugenberger |first=C. A. |date=2001-07-18 |title=Probabilistic Quantum Memories |url=http://dx.doi.org/10.1103/physrevlett.87.067901 |journal=Physical Review Letters |volume=87 |issue=6 |
Trugenberger,<ref name=":2" /> however, has shown that his probabilistic model of quantum associative memory can be efficiently implemented and re-used multiple times for any polynomial number of stored patterns, a significant advantage over classical associative memories.
== Training ==
Quantum neural networks can, in theory, be trained much like classical artificial neural networks. A key difference lies in communication between the layers of the network. In a classical neural network, at the end of a given operation, the current [[perceptron]] copies its output to the next layer of perceptron(s) in the network. In a quantum neural network, however, where each perceptron is a qubit, this copying would violate the [[no-cloning theorem]].<ref name=":0" /><ref>{{Cite book|last1=Nielsen|first1=Michael A
Using this quantum feed-forward network, deep neural networks can be executed and trained efficiently. A deep neural network is essentially a network with many hidden layers, as seen in the sample model neural network above. Since the quantum neural network being discussed uses fan-out unitary operators, and each operator acts only on its respective input, only two layers are involved at any given time.<ref name=":0" /> In other words, no unitary operator acts on the entire network at once, so the number of qubits required at a given step depends on the number of inputs in that layer. Since quantum computers can run many iterations in a short period of time, the efficiency of a quantum neural network depends solely on the number of qubits in any given layer, not on the depth of the network.<ref name=":1" />
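The qubit-count claim can be illustrated with simple arithmetic (the layer widths below are invented for illustration and do not come from a specific paper): because each unitary acts on one layer pair at a time, the working register only ever needs to hold two adjacent layers, regardless of how deep the network is.

```python
# Hypothetical deep network: input layer, three hidden layers, output layer
layer_widths = [4, 8, 8, 8, 2]

# Qubits needed if the entire network had to be held at once
total_qubits = sum(layer_widths)

# Qubits needed when each unitary acts on one adjacent layer pair at a time:
# the widest pair of consecutive layers sets the requirement
working_qubits = max(a + b for a, b in zip(layer_widths, layer_widths[1:]))

print(total_qubits, working_qubits)   # 30 16
```

Adding more hidden layers of the same width increases `total_qubits` but leaves `working_qubits` unchanged, which is the sense in which the resource cost is independent of network depth.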