 
==IBM TrueNorth chip==
[[File:DARPA SyNAPSE 16 Chip Board.jpg|thumb|[[DARPA]] [[SyNAPSE]] board with 16 TrueNorth chips]]
{{main|TrueNorth}}
TrueNorth is a [[neuromorphic]] [[CMOS]] [[integrated circuit]] produced by [[IBM]] in 2014.<ref>{{Cite journal | doi = 10.1126/science.1254642| title = A million spiking-neuron integrated circuit with a scalable communication network and interface| journal = Science| volume = 345| issue = 6197| pages = 668| year = 2014| last1 = Merolla | first1 = P. A.| last2 = Arthur | first2 = J. V.| last3 = Alvarez-Icaza | first3 = R.| last4 = Cassidy | first4 = A. S.| last5 = Sawada | first5 = J.| last6 = Akopyan | first6 = F.| last7 = Jackson | first7 = B. L.| last8 = Imam | first8 = N.| last9 = Guo | first9 = C.| last10 = Nakamura | first10 = Y.| last11 = Brezzo | first11 = B.| last12 = Vo | first12 = I.| last13 = Esser | first13 = S. K.| last14 = Appuswamy | first14 = R.| last15 = Taba | first15 = B.| last16 = Amir | first16 = A.| last17 = Flickner | first17 = M. D.| last18 = Risk | first18 = W. P.| last19 = Manohar | first19 = R.| last20 = Modha | first20 = D. S. | pmid=25104385}}</ref> It is a [[manycore processor]] [[network on a chip]] design with 4096 [[multi-core processor|core]]s, each containing 256 programmable simulated [[neurons]], for a total of just over one million neurons. Each neuron in turn has 256 programmable "[[synapses]]" that convey signals between neurons, so the total number of programmable synapses is just over 268 million (2<sup>28</sup>). The chip's [[transistor count]] is 5.4 billion. Because memory, computation, and communication are handled locally in each of the 4096 neurosynaptic cores, TrueNorth sidesteps the [[Von Neumann architecture|von Neumann]] bottleneck and is very energy-efficient, with IBM claiming a power consumption of 70 [[milliwatts]] and a power density one ten-thousandth that of conventional microprocessors.<ref>{{cite web|url=http://spectrum.ieee.org/computing/hardware/how-ibm-got-brainlike-efficiency-from-the-truenorth-chip|title=How IBM Got Brainlike Efficiency From the TrueNorth Chip|website=IEEE Spectrum}}</ref> The [[SyNAPSE]] chip operates at lower temperature and power because it draws only the power necessary for computation.<ref>{{cite web|url=http://research.ibm.com/cognitive-computing/neurosynaptic-chips.shtml|title=Cognitive computing: Neurosynaptic chips|date=11 December 2015|publisher=IBM}}</ref>
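The per-chip totals quoted above follow directly from the stated core, neuron, and synapse counts; the short Python sketch below simply reproduces that arithmetic (illustrative only, using no figures beyond those cited in this section).
<syntaxhighlight lang="python">
# Arithmetic behind the per-chip figures quoted above (illustrative only;
# the input numbers are those given in the cited Science paper).
cores_per_chip = 4096
neurons_per_core = 256
synapses_per_neuron = 256

neurons_per_chip = cores_per_chip * neurons_per_core        # 1,048,576 ("just over a million")
synapses_per_chip = neurons_per_chip * synapses_per_neuron  # 268,435,456 = 2**28

print(f"neurons per chip:  {neurons_per_chip:,}")
print(f"synapses per chip: {synapses_per_chip:,} (2**28 = {2**28:,})")
</syntaxhighlight>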
The IBM cognitive computers implement learning using [[Hebbian theory]]. Instead of being programmed in the traditional sense, in [[machine language]] or a higher-level [[programming language]], such a device learns by taking in instances through an [[input device]], which are aggregated within a computational [[convolution]] or [[neural network]] architecture consisting of weights stored in a parallel memory system. An early instantiation of such a device was developed in 2012 under the [[DARPA]] [[SyNAPSE]] program at IBM, directed by [[Dharmendra Modha]].
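The Hebbian rule mentioned above strengthens a synaptic weight when the neurons on either side of it are active together. The sketch below is a generic, simplified illustration of such an update on a core-sized block of weights, with a hypothetical learning rate <code>eta</code>; it is not a description of IBM's actual on-chip learning mechanism.
<syntaxhighlight lang="python">
import numpy as np

# Generic Hebbian update ("cells that fire together wire together").
# Illustrative sketch only; not IBM's on-chip learning mechanism.
def hebbian_update(weights, pre, post, eta=0.01):
    """Strengthen weights[i, j] when presynaptic unit i and postsynaptic unit j fire together."""
    return weights + eta * np.outer(pre, post)

rng = np.random.default_rng(0)
weights = np.zeros((256, 256))                   # a core-sized block of synaptic weights
pre = (rng.random(256) < 0.1).astype(float)      # presynaptic spikes (0/1)
post = (rng.random(256) < 0.1).astype(float)     # postsynaptic spikes (0/1)
weights = hebbian_update(weights, pre, post)     # only co-active pairs are strengthened
</syntaxhighlight>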
In 2017, IBM announced a 64-chip TrueNorth [[array data structure|array]] containing the processing equivalent of 64 million neurons and 16 billion synapses, while the processor array consumes just 10 watts of electricity.
Like other neural networks, the system is intended for pattern recognition and sensory processing roles. The Air Force plans to combine TrueNorth's ability to convert multiple data feeds&nbsp;&mdash;&nbsp;whether audio, video or text&nbsp;&mdash;&nbsp;into machine-readable symbols with a conventional supercomputer's ability to crunch data.
This was not the first time IBM's neural chips had been integrated into cutting-edge technology: in August 2017, [[Samsung]] installed the chips in its Dynamic Vision Sensors, enabling cameras to capture images at up to 2,000 fps while using just 300 milliwatts of power.<ref>{{cite web|url=https://www.extremetech.com/extreme/233747-samsung-demonstrates-camera-sensors-hooked-to-ibms-brain-imitating-silicon|title=Samsung demonstrates camera sensors hooked to IBM's brain-imitating silicon|website=ExtremeTech}}</ref>
 
Google has created three generations of a broadly similar device, the [[Tensor Processing Unit]], which relies on low-precision 8-bit arithmetic rather than a [[spiking neural network]].
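As a rough illustration of the distinction drawn here, the sketch below quantizes a single real-valued activation to an 8-bit integer, as a low-precision accelerator would represent it, and rate-codes the same value as a train of binary spikes, as a spiking neural network would carry it; the values and encoding scheme are simplified assumptions, not either design's actual data path.
<syntaxhighlight lang="python">
import numpy as np

# Simplified contrast: an 8-bit integer representation of an activation
# versus a rate-coded spike train carrying the same value. Illustrative only.
value = 0.73                                   # a real-valued activation in [0, 1]

# Low-precision (TPU-style) path: scale to an 8-bit integer in [0, 255]
quantized = int(round(value * 255))            # 186

# Spiking (neuromorphic-style) path: value becomes the per-step spike probability
rng = np.random.default_rng(0)
spikes = (rng.random(1000) < value).astype(int)
rate_estimate = spikes.mean()                  # approaches 0.73 over many time steps

print(quantized, rate_estimate)
</syntaxhighlight>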
 
==Criticism==