'''Quantum neural networks''' are computational [[neural network models]] which are based on the principles of [[quantum mechanics]]. The first ideas on quantum neural computation were published independently in 1995 by [[Subhash Kak]] and Ron Chrisley,<ref>{{cite journal |first=S. |last=Kak |title=On quantum neural computing |journal=Advances in Imaging and Electron Physics |volume=94 |pages=259–313 |year=1995 |doi=10.1016/S1076-5670(08)70147-2 |isbn=9780120147366 }}</ref><ref>{{cite book |first=R. |last=Chrisley |chapter=Quantum Learning |title=New directions in cognitive science: Proceedings of the international symposium, Saariselka, 4–9 August 1995, Lapland, Finland |editor-first=P. |editor-last=Pylkkänen |editor2-first=P. |editor2-last=Pylkkö |publisher=Finnish Association of Artificial Intelligence |___location=Helsinki |pages=77–89 |year=1995 |isbn=951-22-2645-6 }}</ref> engaging with the theory of [[quantum mind]], which posits that quantum effects play a role in cognitive function. However, typical research in quantum neural networks involves combining classical [[artificial neural network]] models (which are widely used in machine learning for the important task of pattern recognition) with the advantages of [[quantum information]] in order to develop more efficient algorithms.<ref>{{cite journal|last1=da Silva|first1=Adenilton J.|last2=Ludermir|first2=Teresa B.|last3=de Oliveira|first3=Wilson R.|year=2016|title=Quantum perceptron over a field and neural network architecture selection in a quantum computer|journal=Neural Networks|volume=76|pages=55–64|arxiv=1602.00709|bibcode=2016arXiv160200709D|doi=10.1016/j.neunet.2016.01.002|pmid=26878722|s2cid=15381014}}</ref><ref>{{cite journal|last1=Panella|first1=Massimo|last2=Martinelli|first2=Giuseppe|year=2011|title=Neural networks with quantum architecture and quantum learning|journal=International Journal of Circuit Theory and Applications|volume=39|pages=61–77|doi=10.1002/cta.619}}</ref><ref>{{cite journal |first1=M. |last1=Schuld |first2=I. |last2=Sinayskiy |first3=F. |last3=Petruccione |arxiv=1408.7005 |title=The quest for a Quantum Neural Network |journal=Quantum Information Processing |volume=13 |issue=11 |pages=2567–2586 |year=2014 |doi=10.1007/s11128-014-0809-8 |bibcode=2014QuIP...13.2567S |s2cid=37238534 }}</ref> One important motivation for these investigations is the difficulty of training classical neural networks, especially in [[Big data|big data applications]]. The hope is that features of [[quantum computing]] such as [[quantum parallelism]] or the effects of [[quantum interference|interference]] and [[Quantum entanglement|entanglement]] can be used as resources. Since the technological implementation of a quantum computer is still at an early stage, such quantum neural network models are mostly theoretical proposals that await their full implementation in physical experiments.
Most quantum neural networks are developed as [[Feedforward neural network|feed-forward]] networks. Similar to their classical counterparts, this structure takes input from one layer of qubits and passes it on to another layer of qubits. That layer evaluates the information and passes the output on to the next layer, and the path eventually leads to the final layer of qubits.<ref name=":0">{{Cite journal|last1=Beer|first1=Kerstin|last2=Bondarenko|first2=Dmytro|last3=Farrelly|first3=Terry|last4=Osborne|first4=Tobias J.|last5=Salzmann|first5=Robert|last6=Scheiermann|first6=Daniel|last7=Wolf|first7=Ramona|date=2020-02-10|title=Training deep quantum neural networks|url= |journal=Nature Communications|language=en|volume=11|issue=1|pages=808|doi=10.1038/s41467-020-14454-2|issn=2041-1723|pmc=7010779|pmid=32041956|arxiv=1902.10445|bibcode=2020NatCo..11..808B}}</ref><ref name="WanDKGK16" /> The layers do not have to be of the same width: a layer need not have the same number of qubits as the layer before or after it. This structure is trained on which path to take, similar to classical [[artificial neural network]]s. A minimal numerical sketch of this layer-to-layer flow is shown below.
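The following NumPy sketch illustrates the layer-to-layer flow of information under simplifying assumptions: each transition is modelled by a single random unitary acting on the current layer together with a fresh layer of qubits prepared in |0...0⟩, after which the earlier layer is traced out. The layer widths, the use of one random unitary per transition, and the helper-function names are illustrative assumptions, not the construction of any specific proposal.

<syntaxhighlight lang="python">
import numpy as np

def random_unitary(dim, rng):
    """Random unitary built from the QR decomposition of a complex Gaussian matrix."""
    a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(a)
    d = np.diag(r)
    return q * (d / np.abs(d))          # fix column phases so q stays unitary

def propagate(rho_in, n_in, n_out, unitary):
    """Feed an input-layer state forward onto a fresh output layer.

    The output qubits start in |0...0>, the unitary acts on the combined
    register, and the input-layer qubits are then traced out.
    """
    zero = np.zeros((2 ** n_out, 2 ** n_out), dtype=complex)
    zero[0, 0] = 1.0
    rho = np.kron(rho_in, zero)                     # input layer (x) output layer
    rho = unitary @ rho @ unitary.conj().T          # joint unitary evolution
    rho = rho.reshape(2 ** n_in, 2 ** n_out, 2 ** n_in, 2 ** n_out)
    return np.einsum('ijik->jk', rho)               # partial trace over the input layer

rng = np.random.default_rng(0)
widths = [2, 3, 1]                                  # layers need not share a width
state = np.zeros((4, 4), dtype=complex)
state[0, 0] = 1.0                                   # input layer prepared in |00><00|
for n_in, n_out in zip(widths, widths[1:]):
    u = random_unitary(2 ** (n_in + n_out), rng)
    state = propagate(state, n_in, n_out, u)
print(np.trace(state).real)                         # ~1.0: the final layer is a valid state
</syntaxhighlight>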
== Examples ==
=== Quantum networks ===
At a larger scale, researchers have attempted to generalize neural networks to the quantum setting. One way of constructing a quantum neuron is to first generalise classical neurons and then to generalise them further into unitary gates. Interactions between neurons can be controlled quantumly, with unitary gates, or classically, via measurement of the network states. This high-level theoretical technique can be applied broadly, by taking different types of networks and different implementations of quantum neurons, such as [[Integrated quantum photonics|photonically]] implemented neurons<ref name="WanDKGK16">{{cite journal|last1=Wan|first1=Kwok-Ho|last2=Dahlsten|first2=Oscar|last3=Kristjansson|first3=Hler|last4=Gardner|first4=Robert|last5=Kim|first5=Myungshik|year=2017|title=Quantum generalisation of feedforward neural networks|journal=NPJ Quantum Information|volume=3|pages=36|arxiv=1612.01045|bibcode=2017npjQI...3...36W|doi=10.1038/s41534-017-0032-4|s2cid=51685660}}</ref><ref>{{cite journal |first1=A. |last1=Narayanan |first2=T. |last2=Menneer |title=Quantum artificial neural network architectures and components |journal=Information Sciences |volume=128 |issue= 3–4|pages=231–255 |year=2000 |doi=10.1016/S0020-0255(00)00055-4 }}</ref> and [[quantum reservoir processor]]s.<ref>{{cite journal |last1=Ghosh |first1=S. |last2=Opala |first2=A. |last3=Matuszewski |first3=M. |last4=Paterek |first4=T. |last5=Liew |first5=T. C. H. |doi=10.1038/s41534-019-0149-8 |title=Quantum reservoir processing |journal=NPJ Quantum Information |year=2019 }}</ref>
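As a simplified illustration of promoting a classical neuron to a unitary gate, one standard route is to first make the neuron reversible, mapping |x, y⟩ to |x, y ⊕ f(x)⟩, which yields a permutation matrix and hence a unitary. The sketch below implements this for an assumed two-input threshold neuron; the weights, bias and function names are illustrative choices and are not taken from the cited constructions.

<syntaxhighlight lang="python">
import numpy as np

def classical_neuron(bits, weights=(1.0, 1.0), bias=-1.5):
    """Classical threshold neuron: fires iff the weighted input sum plus bias is positive."""
    return int(sum(w * b for w, b in zip(weights, bits)) + bias > 0)

def neuron_unitary(n_inputs):
    """Permutation matrix implementing |x, y> -> |x, y XOR f(x)> on n_inputs + 1 qubits."""
    dim = 2 ** (n_inputs + 1)
    u = np.zeros((dim, dim))
    for x in range(2 ** n_inputs):
        bits = [(x >> k) & 1 for k in reversed(range(n_inputs))]
        f = classical_neuron(bits)
        for y in (0, 1):
            u[2 * x + (y ^ f), 2 * x + y] = 1.0     # column = input state, row = output state
    return u

u = neuron_unitary(2)
assert np.allclose(u @ u.T, np.eye(8))              # a permutation matrix is unitary
# On computational basis states the gate reproduces the classical neuron (here an
# AND-like threshold), but as a unitary it also acts on superpositions of inputs.
</syntaxhighlight>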
Quantum neural networks can be applied to algorithmic design: given [[qubits]] with tunable mutual interactions, one can attempt to learn interactions following the classical [[backpropagation]] rule from a [[training set]] of desired input–output relations, taken to represent the behavior of the desired algorithm.<ref>{{cite journal |first1=J. |last1=Bang |display-authors=1 |first2=Junghee |last2=Ryu |first3=Seokwon |last3=Yoo |first4=Marcin |last4=Pawłowski |first5=Jinhyoung |last5=Lee |doi=10.1088/1367-2630/16/7/073017 |title=A strategy for quantum algorithm design assisted by machine learning |journal=New Journal of Physics |volume=16 |issue= 7|pages=073017 |year=2014 |arxiv=1301.1132 |bibcode=2014NJPh...16g3017B |s2cid=55377982 }}</ref><ref>{{cite journal |first1=E. C. |last1=Behrman |first2=J. E. |last2=Steck |first3=P. |last3=Kumar |first4=K. A. |last4=Walsh |arxiv=0808.1558 |title=Quantum Algorithm design using dynamic learning |journal=Quantum Information and Computation |volume=8 |issue=1–2 |pages=12–29 |year=2008 }}</ref> The quantum network thus ‘learns’ an algorithm.
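The idea of learning an algorithm from input–output examples can be illustrated, in a heavily simplified classical simulation, by fitting the parameters of a single-qubit gate to a desired truth table. The sketch below uses an Rz–Ry–Rz parameterisation, a finite-difference gradient, and a bit-flip target; all of these are illustrative assumptions rather than the training rule of the cited works.

<syntaxhighlight lang="python">
import numpy as np

def gate(theta):
    """General single-qubit unitary (up to a global phase): Rz(c) Ry(b) Rz(a)."""
    a, b, c = theta
    rz = lambda t: np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])
    ry = lambda t: np.array([[np.cos(t / 2), -np.sin(t / 2)],
                             [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)
    return rz(c) @ ry(b) @ rz(a)

# Training set: the desired "algorithm" is a bit flip, |0> -> |1> and |1> -> |0>.
inputs  = [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)]
targets = [np.array([0, 1], dtype=complex), np.array([1, 0], dtype=complex)]

def cost(theta):
    """Average infidelity between the gate's outputs and the desired outputs."""
    return 1.0 - np.mean([abs(t.conj() @ gate(theta) @ i) ** 2
                          for i, t in zip(inputs, targets)])

theta = np.array([0.1, 0.2, 0.3])                   # small non-zero start to avoid a saddle
lr, eps = 0.5, 1e-6
for _ in range(300):
    grad = np.array([(cost(theta + eps * np.eye(3)[k]) - cost(theta)) / eps
                     for k in range(3)])
    theta -= lr * grad                              # gradient descent on the infidelity
print(round(cost(theta), 4))                        # ~0.0: the learned gate acts as a bit flip
</syntaxhighlight>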
== Training ==
Quantum neural networks can in theory be trained similarly to classical [[artificial neural network]]s.
Using this quantum feed-forward network, deep neural networks can be executed and trained efficiently. A deep neural network is essentially a network with many hidden layers, as seen in the sample neural network above. Since the quantum neural network being discussed uses fan-out unitary operators, and each operator acts only on its respective input, only two layers are involved at any given time.<ref name=":0" /> In other words, no unitary operator acts on the entire network at once, so the number of qubits required at a given step depends on the number of inputs in that layer. Because quantum computers can run many iterations in a short period of time, the efficiency of a quantum neural network depends solely on the number of qubits in any given layer, and not on the depth of the network.<ref name=":1" />
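The resource argument can be made concrete with a toy calculation: if only the current layer and the next layer are held at any one time while a layer unitary is applied, the peak number of qubits is set by the widest pair of adjacent layers, independent of how many layers the network has. The layer widths below are made-up examples.

<syntaxhighlight lang="python">
# Illustrative layer widths for a shallow network and a much deeper one.
shallow = [4, 3, 4]
deep = [4, 3, 4, 3, 4, 3, 4, 3, 4]

def qubits_needed(widths):
    """Peak qubit count when only two adjacent layers are held at any one time."""
    return max(n_in + n_out for n_in, n_out in zip(widths, widths[1:]))

print(qubits_needed(shallow), qubits_needed(deep))   # 7 7 -- independent of depth
</syntaxhighlight>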