{{Use dmy dates|date=October 2020}}
'''Models of neural computation''' are attempts to elucidate, in an abstract and mathematical fashion, the core principles that underlie information processing in biological nervous systems, or functional components thereof. This article aims to provide an overview of the most definitive models of neuro-biological computation as well as the tools commonly used to construct and analyze them.
 
==Introduction==
Due to the complexity of nervous system behavior, the associated experimental error bounds are ill-defined, but the relative merit of the different [[scientific model|models]] of a particular subsystem can be compared according to how closely they reproduce real-world behaviors or respond to specific input signals. In the closely related field of computational [[neuroethology]], the practice is to include the environment in the model in such a way that the [[Control theory#Closed-loop transfer function|loop is closed]]. In the cases where competing models are unavailable, or where only gross responses have been measured or quantified, a clearly formulated model can guide the scientist in designing experiments to probe biochemical mechanisms or network connectivity.
 
In all but the simplest cases, the mathematical equations that form the basis of a model cannot be solved exactly. Nevertheless, computer technology, sometimes in the form of specialized software or hardware architectures, allows scientists to perform iterative calculations and search for plausible solutions. A computer chip or a robot that can interact with the natural environment in ways akin to the original organism is one embodiment of a useful model. The ultimate measure of success, however, is the ability to make testable predictions.
 
==General criteria for evaluating models==
 
===Speed of information processing===
The rate of information processing in biological neural systems is constrained by the speed at which an action potential can propagate down a nerve fibre. This conduction velocity ranges from 1&nbsp;m/s to over 100&nbsp;m/s, and generally increases with the diameter of the neuronal process. Because conduction is slow on the timescales of biologically relevant events dictated by the speed of sound or the force of gravity, the nervous system overwhelmingly prefers parallel computation over serial computation in time-critical applications.
 
===Robustness===
A model is robust if it continues to produce the same computational results under small variations in inputs or operating parameters introduced by noise. For example, the direction of motion as computed by a robust [[motion perception|motion detector]] would not change under small changes of [[luminance]], [[contrast (vision)|contrast]] or velocity jitter. In simple mathematical models of the neuron, for example, the dependence of spike patterns on signal delay is much weaker than the dependence on changes in the "weights" of interneuronal connections.<ref>{{cite journal |last1=Cejnar |first1=Pavel |last2=Vyšata |first2=Oldřich |last3=Vališ |first3=Martin |last4=Procházka |first4=Aleš |title=The Complex Behaviour of a Simple Neural Oscillator Model in the Human Cortex |journal=IEEE Transactions on Neural Systems and Rehabilitation Engineering |date=2019 |volume=27 |issue=3 |pages=337–347 |doi= 10.1109/TNSRE.2018.2883618 |pmid=30507514|s2cid=54527064 }}</ref>
 
===Gain control===
This refers to the principle that the response of a nervous system should stay within certain bounds even as the inputs from the environment change drastically. For example, when adjusting between a sunny day and a moonless night, the retina changes the relationship between light level and neuronal output by a factor of more than <math>10^6</math> so that the signals sent to later stages of
the visual system always remain within a much narrower range of amplitudes.<ref name=Ferster>{{cite journal
| title = A New Mechanism for Neuronal Gain Control
|author1=Nicholas J. Priebe |author2=David Ferster
|name-list-style=amp | journal = Neuron
| year = 2002
| volume = 35
| pages = 602–604
| issue=4}}</ref><ref>Klein, S. A., Carney, T., Barghout-Stein, L., & Tyler, C. W. (1997, June). Seven models of masking. In Electronic Imaging '97 (pp. 13–24). International Society for Optics and Photonics.</ref><ref>Barghout-Stein, Lauren. On differences between peripheral and foveal pattern masking. Diss. University of California, Berkeley, 1999.</ref>
 
===Linearity versus nonlinearity===
A '''linear''' system is one whose response in a specified unit of measure, to a set of inputs considered at once, is the sum of its responses due to the inputs considered individually.
 
[[Linear algebra|Linear]] systems are easier to analyze mathematically and are a persuasive assumption in many models including the McCulloch and Pitts neuron, population coding models, and the simple neurons often used in [[Artificial neural network]]s. Linearity may occur in the basic elements of a neural circuit such as the response of a postsynaptic neuron, or as an emergent property of a combination of nonlinear subcircuits.<ref name="MolnarHsueh2009">{{cite journal|last1=Molnar|first1=Alyosha|last2=Hsueh|first2=Hain-Ann|last3=Roska|first3=Botond|last4=Werblin|first4=Frank S.|title=Crossover inhibition in the retina: circuitry that compensates for nonlinear rectifying synaptic transmission|journal=Journal of Computational Neuroscience|volume=27|issue=3|year=2009|pages=569–590|issn=0929-5313|doi=10.1007/s10827-009-0170-6 | pmid = 19636690|pmc=2766457}}</ref> Though linearity is often seen as incorrect, there has been recent work suggesting it may, in fact, be biophysically plausible in some cases.<ref>{{Cite journal|last1=Singh|first1=Chandan|last2=Levy|first2=William B.|date=2017-07-13|title=A consensus layer V pyramidal neuron can sustain interpulse-interval coding|journal=PLOS ONE|volume=12|issue=7|pages=e0180839|doi=10.1371/journal.pone.0180839|pmid=28704450|pmc=5509228|arxiv=1609.08213|bibcode=2017PLoSO..1280839S|issn=1932-6203|doi-access=free}}</ref><ref>{{Cite journal|last1=Cash|first1=Sydney|last2=Yuste|first2=Rafael|date=1998-01-01|title=Input Summation by Cultured Pyramidal Neurons Is Linear and Position-Independent|journal=Journal of Neuroscience|language=en|volume=18|issue=1|pages=10–15|issn=0270-6474|pmid=9412481|doi=10.1523/JNEUROSCI.18-01-00010.1998|pmc=6793421|doi-access=free}}</ref>
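The defining superposition property can be checked directly in a few lines. The following sketch (the weights and input values are arbitrary illustrative choices, not drawn from any of the cited models) contrasts a linear unit, whose response to two inputs presented together equals the sum of its separate responses, with a sigmoidal unit, for which superposition fails:

```python
import math

def linear_neuron(w, x):
    # Response of a linear unit: a weighted sum of its inputs.
    return sum(wi * xi for wi, xi in zip(w, x))

def sigmoidal_neuron(w, x):
    # A nonlinear unit: the same weighted sum passed through a sigmoid.
    return 1.0 / (1.0 + math.exp(-linear_neuron(w, x)))

w = [0.5, -0.2]
a, b = [1.0, 2.0], [3.0, -1.0]
combined = [ai + bi for ai, bi in zip(a, b)]

# Superposition holds for the linear unit...
lin_gap = linear_neuron(w, combined) - (linear_neuron(w, a) + linear_neuron(w, b))
# ...but not for the sigmoidal one.
sig_gap = sigmoidal_neuron(w, combined) - (sigmoidal_neuron(w, a) + sigmoidal_neuron(w, b))
```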
 
==Examples==
A computational neural model may be constrained to the level of biochemical signalling in individual [[neurons]] or it may describe an entire organism in its environment. The examples here are grouped according to their scope.
 
===Models of information transfer in neurons===
{{main|biological neuron model}}
The most widely used models of information transfer in biological neurons are based on analogies with electrical circuits. The equations to be solved are time-dependent differential equations with electro-dynamical variables such as current, conductance or resistance, capacitance and voltage.
 
====Hodgkin–Huxley model and its derivatives====
{{main|Hodgkin–Huxley model}}
The Hodgkin–Huxley model, widely regarded as one of the great achievements of 20th-century biophysics, describes how [[action potential]]s in neurons are initiated and propagated in axons via [[voltage-gated ion channel]]s. It is a set of [[nonlinearity|nonlinear]] [[ordinary differential equation]]s that were introduced by [[Alan Lloyd Hodgkin]] and [[Andrew Huxley]] in 1952 to explain the results of [[voltage clamp]] experiments on the [[squid giant axon]]. Analytic solutions do not exist, but the [[Levenberg–Marquardt algorithm]], a modified [[Gauss–Newton algorithm]], is often used to [[curve fitting|fit]] these equations to voltage-clamp data.
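Although the equations have no analytic solution, they are straightforward to integrate numerically. The following minimal sketch uses forward-Euler integration with the conventional squid-axon parameter values; both the integration scheme and the stimulus amplitude are illustrative choices rather than anything prescribed by the original papers:

```python
import math

def hodgkin_huxley(i_ext=10.0, t_max=50.0, dt=0.01):
    # Forward-Euler integration of the Hodgkin-Huxley equations with the
    # conventional squid-axon parameters (mV, ms, uA/cm^2). Illustrative
    # only; production work would use a smaller step or a stiff solver.
    g_na, g_k, g_l = 120.0, 36.0, 0.3      # maximal conductances, mS/cm^2
    e_na, e_k, e_l = 50.0, -77.0, -54.387  # reversal potentials, mV
    c_m = 1.0                              # membrane capacitance, uF/cm^2

    # Voltage-dependent opening/closing rates (1/ms); the removable
    # singularities at v = -40 and v = -55 are replaced by their limits.
    def a_m(v):
        return 1.0 if abs(v + 40.0) < 1e-7 else \
            0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))

    def b_m(v):
        return 4.0 * math.exp(-(v + 65.0) / 18.0)

    def a_h(v):
        return 0.07 * math.exp(-(v + 65.0) / 20.0)

    def b_h(v):
        return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))

    def a_n(v):
        return 0.1 if abs(v + 55.0) < 1e-7 else \
            0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))

    def b_n(v):
        return 0.125 * math.exp(-(v + 65.0) / 80.0)

    # Start at rest, with the gating variables at their steady-state values.
    v = -65.0
    m = a_m(v) / (a_m(v) + b_m(v))
    h = a_h(v) / (a_h(v) + b_h(v))
    n = a_n(v) / (a_n(v) + b_n(v))

    trace = []
    for _ in range(int(t_max / dt)):
        i_na = g_na * m ** 3 * h * (v - e_na)  # sodium current
        i_k = g_k * n ** 4 * (v - e_k)         # potassium current
        i_l = g_l * (v - e_l)                  # leak current
        m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
        v += dt * (i_ext - i_na - i_k - i_l) / c_m
        trace.append(v)
    return trace
```

With a sustained stimulus of this size the model fires repetitively, the membrane potential swinging from its resting level near &minus;65&nbsp;mV to positive spike peaks.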
 
The [[FitzHugh–Nagumo model]] is a simplification of the Hodgkin–Huxley model. The [[Hindmarsh–Rose model]] is an extension which describes neuronal spike bursts. The Morris–Lecar model is a modification which does not generate spikes, but describes slow-wave propagation, which is implicated in the inhibitory synaptic mechanisms of [[central pattern generator]]s.
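The reduction to two variables makes the FitzHugh–Nagumo model especially easy to simulate and analyze. A minimal forward-Euler sketch (the parameter values are the conventional illustrative ones, and the initial conditions are arbitrary):

```python
def fitzhugh_nagumo(i_ext=0.5, t_max=200.0, dt=0.01, a=0.7, b=0.8, eps=0.08):
    # Forward-Euler integration of
    #   dv/dt = v - v^3/3 - w + I,   dw/dt = eps * (v + a - b * w),
    # where v is the fast membrane-like variable and w the slow recovery
    # variable. With this stimulus the model settles onto a limit cycle.
    v, w = -1.0, 1.0
    trace = []
    for _ in range(int(t_max / dt)):
        dv = v - v ** 3 / 3.0 - w + i_ext
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        trace.append(v)
    return trace
```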
 
====Solitons====
{{main|Soliton model in neuroscience}}
 
The [[Soliton model in neuroscience|soliton model]] is an alternative to the [[Hodgkin–Huxley model]] that claims to explain how [[action potentials]] are initiated and conducted in the form of certain kinds of [[Solitary wave (water waves)|solitary]] [[sound]] (or [[density]]) pulses that can be modeled as [[soliton]]s along [[axon]]s, based on a thermodynamic theory of nerve pulse propagation.
 
====Transfer functions and linear filters====
This approach, influenced by [[control theory]] and [[signal processing]], treats neurons and synapses as time-invariant entities that produce outputs that are [[LTI System Theory|linear combinations]] of input signals, often depicted as sine waves with well-defined temporal or spatial frequencies.
 
The entire behavior of a neuron or synapse is encoded in a [[transfer function]], lack of knowledge concerning the exact underlying mechanism notwithstanding. This brings a highly developed mathematics to bear on the problem of information transfer.
 
The accompanying taxonomy of [[linear filter]]s turns out to be useful in characterizing neural circuitry. Both [[low-pass filter]]s and [[high-pass filter]]s are postulated to exist in some form in sensory systems, as they act to prevent information loss in high and low contrast environments, respectively.
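The action of these filters is easy to illustrate. A first-order low-pass filter (the discrete analogue of an RC circuit) smooths away short-term fluctuations while passing long-term trends; subtracting its output from the input gives the complementary high-pass behavior. The filter coefficient below is an arbitrary illustrative choice:

```python
def low_pass(signal, alpha):
    # First-order low-pass: y[t] = y[t-1] + alpha * (x[t] - y[t-1]).
    # Small alpha -> heavy smoothing; alpha = 1 -> the signal passes unchanged.
    y, out = signal[0], []
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

def high_pass(signal, alpha):
    # Complementary high-pass: exactly what the low-pass filter removes.
    return [x - y for x, y in zip(signal, low_pass(signal, alpha))]
```

A constant trend passes through `low_pass` unchanged, while rapid sample-to-sample alternation is strongly attenuated, which is the behavior ascribed below to neurons of the ''Limulus'' retina.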
 
Indeed, measurements of the transfer functions of neurons in the [[horseshoe crab]] retina according to linear systems analysis show that they remove short-term fluctuations in input signals, leaving only the long-term trends, in the manner of low-pass filters. These animals are unable to see low-contrast objects without the help of optical distortions caused by underwater currents.<ref name=barlow1993>{{cite journal
| title = The Neural Network of the Limulus Retina: From Computer to Behavior
|author1=Robert B. Barlow, Jr. |author2=Ramkrishna Prakash |author3=Eduardo Solessio
| journal = American Zoologist
| year = 1993
| volume = 33
| pages = 66–78
| doi=10.1093/icb/33.1.66
| doi-access = free
}}</ref><ref>{{cite journal
| doi = 10.2307/1543311
| pmid = 11341579
| jstor = 1543311
| citeseerx = 10.1.1.116.5190
| s2cid = 18371282
}}</ref>
 
===Models of computations in sensory systems===
 
====Lateral inhibition in the retina: Hartline–Ratliff equations====
In the retina, an excited neural receptor can suppress the activity of surrounding neurons within an area called the inhibitory field. This effect, known as [[lateral inhibition]], increases the contrast and sharpness in visual response, but leads to the epiphenomenon of [[Mach bands]]. This is often illustrated by the [[illusion|optical illusion]] of light or dark stripes next to a sharp boundary between two regions in an image of different luminance.
 
The Hartline–Ratliff model describes interactions within a group of ''p'' [[photoreceptor cell]]s.<ref name=kuhnhadeler>{{cite journal
| doi = 10.1007/BF00319520
| title = Stationary States of the Hartline–Ratliff Model
|author1=K. P. Hadeler |author2=D. Kuhn
|name-list-style=amp | journal = Biological Cybernetics
| year = 1987
| volume = 56
| pages = 411–417
| issue = 5–6
| s2cid = 8710876
}}</ref> Assuming these interactions to be '''linear''', they proposed the following relationship for the '''steady-state response rate''' <math>r_p</math> of the given ''p''-th photoreceptor in terms of the steady-state response rates <math>r_j</math> of the ''j'' surrounding receptors:
 
<math>r_{p}=\left|\left[e_{p}-\sum_{j=1,j\ne p}^{n}k_{pj}\left|r_{j}-r_{pj}^{o}\right|\right]\right|</math>.
<math>k_{pj}</math> is the coefficient of inhibitory interaction between the ''p''-th and the ''j''th receptor. The inhibitory interaction decreases with distance from the target ''p''-th receptor.
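Because the inhibition enters through rectifying thresholds, the steady state is conveniently found by fixed-point iteration. The sketch below implements the update directly; the two-receptor network and all parameter values in the usage example are invented for illustration:

```python
def hartline_ratliff(e, k, r0, n_iter=200):
    # Iterate the steady-state Hartline-Ratliff equations:
    #   r_p = max(0, e_p - sum_{j != p} k[p][j] * max(0, r_j - r0[p][j]))
    # where e[p] is the excitation of receptor p, k[p][j] the inhibitory
    # coupling, and r0[p][j] the threshold for inhibition.
    n = len(e)
    r = list(e)  # start from the uninhibited excitations
    for _ in range(n_iter):
        r = [max(0.0, e[p] - sum(k[p][j] * max(0.0, r[j] - r0[p][j])
                                 for j in range(n) if j != p))
             for p in range(n)]
    return r
```

For two identical receptors with excitation 10 and mutual coupling 0.1 (thresholds zero), the iteration converges to the fixed point <math>r = 10/1.1 \approx 9.09</math>, each receptor mildly suppressing the other.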
 
====Cross-correlation in sound localization: Jeffress model====
According to [[Lloyd A. Jeffress|Jeffress]],<ref>{{cite journal | last1 = Jeffress | first1 = L.A. | year = 1948 | title = A place theory of sound localization | journal = Journal of Comparative and Physiological Psychology | volume = 41 | issue = 1 | pages = 35–39 | doi=10.1037/h0061495 | pmid=18904764}}</ref> in order to compute the ___location of a sound source in space from [[interaural time difference]]s, an auditory system relies on [[Analog delay line|delay line]]s: the induced signal from an [[ipsilateral]] auditory receptor to a particular neuron is delayed for the same time as it takes for the original sound to travel in space from that ear to the other. Each postsynaptic cell is differently delayed and thus specific for a particular inter-aural time difference. This theory is equivalent to the mathematical procedure of [[cross-correlation]].
 
Following Fischer and Anderson,<ref>{{cite journal | last1 = Fischer | first1 = Brian J. | last2 = Anderson | first2 = Charles H. | year = 2004 | title = A computational model of sound localization in the barn owl | journal = Neurocomputing | volume = 58–60 | pages = 1007–1012 | doi=10.1016/j.neucom.2004.01.159 | s2cid = 31927198 }}</ref> the response of the postsynaptic neuron to the signals from the left and right ears is given by
 
<math>y_{R}\left(t\right) - y_{L}\left(t\right)</math>
where <math>w\left(t-\sigma\right)</math> represents the delay function.
 
Structures have been located in the barn owl which are consistent with Jeffress-type mechanisms.<ref>Catherine E. Carr, 1993. Delay Line Models of Sound Localization in the Barn Owl "American Zoologist" Vol. 33, No. 1 79–85</ref>
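The equivalence with cross-correlation can be made concrete: each coincidence-detector neuron corresponds to one candidate lag, and the ___location of the most active detector reports the interaural time difference. In the sketch below the sinusoidal signals and the three-sample delay are invented for illustration:

```python
import math

def cross_corr(left, right, lag):
    # Correlation of left[t] with right[t + lag] over the overlapping samples.
    if lag >= 0:
        pairs = zip(left, right[lag:])
    else:
        pairs = zip(left[-lag:], right)
    return sum(a * b for a, b in pairs)

def best_itd(left, right, max_lag):
    # Each "coincidence detector" is tuned to one lag; the most active wins.
    return max(range(-max_lag, max_lag + 1),
               key=lambda lag: cross_corr(left, right, lag))

left = [math.sin(0.3 * t) for t in range(200)]
right = [0.0] * 3 + left[:-3]   # the same sound, reaching the right ear later
```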
 
====Cross-correlation for motion detection: Hassenstein–Reichardt model====
A motion detector needs to satisfy three general requirements: pair-inputs, asymmetry and nonlinearity.<ref>Borst A, Egelhaaf M., 1989. Principles of visual motion detection. "Trends in Neurosciences" 12(8):297–306</ref> The cross-correlation operation implemented asymmetrically on the responses from a pair of photoreceptors satisfies these minimal criteria, and furthermore, predicts features which have been observed in the response of neurons of the lobula plate in bi-wing insects.<ref>{{cite journal | last1 = Joesch | first1 = M. |display-authors=etal | year = 2008 | title = Response properties of motion-sensitive visual interneurons in the lobula plate of Drosophila melanogaster | journal = Curr. Biol. | volume = 18 | issue = 5 | pages = 368–374 | doi=10.1016/j.cub.2008.02.022 | pmid = 18328703 | s2cid = 18873331 | doi-access = free | bibcode = 2008CBio...18..368J }}</ref>
 
The master equation for response is
<math>R = A_1(t-\tau)B_2(t) - A_2(t - \tau)B_1(t)</math>
 
The HR model predicts a peaking of the response at a particular input temporal frequency. The conceptually similar Barlow–Levick model is deficient in the sense that a stimulus presented to only one receptor of the pair is sufficient to generate a response. This is unlike the HR model, which requires two correlated signals delivered in a time-ordered fashion. However, the HR model does not show a saturation of response at high contrasts, which is observed in experiment. Extensions of the Barlow–Levick model can provide for this discrepancy.<ref>Gonzalo G. de Polavieja, 2006. Neuronal Algorithms That Detect the Temporal Order of Events "Neural Computation" 18 (2006) 2102–2121</ref>
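A minimal discrete-time sketch of the correlator makes the direction selectivity visible: the summed response is positive for motion in the preferred direction and negative for the opposite direction. The drifting sinusoidal stimulus and the delay value are invented for illustration:

```python
import math

def reichardt(a, b, delay):
    # R(t) = A(t - d) * B(t) - B(t - d) * A(t): each arm delays one
    # receptor's signal and multiplies it with the undelayed signal from
    # the other; subtracting the mirror-image arm gives the asymmetry.
    return [a[t - delay] * b[t] - b[t - delay] * a[t]
            for t in range(delay, len(a))]

# A grating drifting from receptor A to receptor B, transit time 5 samples.
a = [math.sin(0.2 * t) for t in range(300)]
b = [0.0] * 5 + a[:-5]
```

Swapping the two inputs reverses the apparent direction of motion and hence the sign of the summed response.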
 
====Watson–Ahumada model for motion estimation in humans====
This uses a cross-correlation in both the spatial and temporal directions, and is related to the concept of [[optical flow]].<ref>Andrew B. Watson and Albert J. Ahumada, Jr., 1985. Model of human visual-motion sensing "J. Opt. Soc. Am. A" 2(2) 322–341</ref>
 
===Anti-Hebbian adaptation: spike-timing dependent plasticity===

* {{cite journal | last1 = Tzounopoulos | first1 = T | last2 = Kim | first2 = Y | last3 = Oertel | first3 = D | last4 = Trussell | first4 = LO | year = 2004 | title = Cell-specific, spike timing-dependent plasticities in the dorsal cochlear nucleus | journal = Nat Neurosci | volume = 7 | issue = 7| pages = 719–725 | doi=10.1038/nn1272| pmid = 15208632 | s2cid = 17774457 }}
* {{cite journal | last1 = Roberts | first1 = Patrick D. | last2 = Portfors | first2 = Christine V. | year = 2008 | title = Design principles of sensory processing in cerebellum-like structures| doi = 10.1007/s00422-008-0217-1 | pmid = 18491162 | journal = Biological Cybernetics | volume = 98 | issue = 6| pages = 491–507 | s2cid = 14393814 }}

===Models of [[sensory-motor coupling]]===

====Neurophysiological metronomes: neural circuits for pattern generation====
Mutually [[inhibitory]] processes are a unifying motif of all [[central pattern generator]]s. This has been demonstrated in the stomatogastric (STG) nervous system of crayfish and lobsters.<ref>Michael P. Nusbaum and Mark P. Beenhakker, A small-systems approach to motor pattern generation, Nature 417, 343–350 (16 May 2002)</ref> Two- and three-cell oscillating networks based on the STG have been constructed which are amenable to mathematical analysis, and which depend in a simple way on synaptic strengths and overall activity.<ref>Cristina Soto-Treviño, Kurt A. Thoroughman, Eve Marder and L. F. Abbott, 2006. Activity-dependent modification of inhibitory synapses in models of rhythmic neural networks Nature Vol 4 No 3 2102–2121</ref> The mathematics involved is the theory of [[dynamical systems]].
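The qualitative behavior of such half-center circuits can be reproduced with a far simpler rate-based sketch. The following implements a Matsuoka-style oscillator, two threshold-linear units with mutual inhibition and slow self-adaptation, rather than the conductance-based STG models themselves; all parameter values are illustrative:

```python
def matsuoka(t_max=100.0, dt=0.01, tau=1.0, tau_a=12.0, b=2.5, w=2.5, s=1.0):
    # Two rate units with mutual inhibition (weight w) and slow
    # self-adaptation (gain b, time constant tau_a), driven by tonic input s.
    # The active unit suppresses its partner until adaptation releases it,
    # producing antiphase oscillation.
    x = [0.1, 0.0]   # membrane states (slightly asymmetric start)
    v = [0.0, 0.0]   # adaptation states
    y1_trace, y2_trace = [], []
    for _ in range(int(t_max / dt)):
        y = [max(0.0, xi) for xi in x]  # threshold-linear firing rates
        dx0 = (-x[0] - w * y[1] - b * v[0] + s) / tau
        dx1 = (-x[1] - w * y[0] - b * v[1] + s) / tau
        dv0 = (-v[0] + y[0]) / tau_a
        dv1 = (-v[1] + y[1]) / tau_a
        x[0] += dt * dx0
        x[1] += dt * dx1
        v[0] += dt * dv0
        v[1] += dt * dv1
        y1_trace.append(max(0.0, x[0]))
        y2_trace.append(max(0.0, x[1]))
    return y1_trace, y2_trace
```

In the steady-state rhythm each unit is alternately active and fully silenced, the hallmark of a half-center oscillator.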
 
====Feedback and control: models of flight control in the fly====
Flight control in the fly is believed to be mediated by inputs from the visual system and also the [[halteres]], a pair of knob-like organs which measure angular velocity. Integrated computer models of ''[[Drosophila]]'', short on neuronal circuitry but based on the general guidelines given by [[control theory]] and data from the tethered flights of flies, have been constructed to investigate the details of flight control.<ref>{{cite web|url=http://strawlab.org/2011/03/23/grand-unified-fly/|title=the Grand Unified Fly (GUF) model}}</ref><ref>http://www.mendeley.com/download/public/2464051/3652638122/d3bd7957efd2c8a011afb0687dfb6943731cb6d0/dl.pdf{{Dead link|date=April 2020 |bot=InternetArchiveBot |fix-attempted=yes }}</ref>
 
====Cerebellum sensory motor control====
[[Tensor network theory]] is a theory of [[cerebellum|cerebellar]] function that provides a mathematical model of the [[transformation geometry|transformation]] of sensory [[space-time]] coordinates into motor coordinates and vice versa by cerebellar [[neuronal networks]]. The theory was developed by Andras Pellionisz and [[Rodolfo Llinas]] in the 1980s as a [[geometrization]] of brain function (especially of the [[central nervous system]]) using [[tensor]]s.<ref name="Neuroscience1980-Pellionisz">{{Cite journal| author =Pellionisz, A., Llinás, R. | year =1980 | title =Tensorial Approach to the Geometry of Brain Function: Cerebellar Coordination Via A Metric Tensor | journal = Neuroscience | volume =5 | issue = 7| pages = 1125–1136 | url= https://www.academia.edu/download/31409354/pellionisz_1980_cerebellar_coordination_via_a_metric_tensor_fullpaper.pdf | doi = 10.1016/0306-4522(80)90191-8 | pmid=6967569| s2cid =17303132 }}{{dead link|date=July 2022|bot=medic}}{{cbignore|bot=medic}}</ref><ref name="Neuroscience1985-Pellionisz">{{Cite journal| author = Pellionisz, A., Llinás, R. | year =1985 | title= Tensor Network Theory of the Metaorganization of Functional Geometries in the Central Nervous System | journal = Neuroscience | volume =16 | issue =2 | pages = 245–273| doi = 10.1016/0306-4522(85)90001-6 | pmid = 4080158| s2cid =10747593 }}</ref>
 
==Software modelling approaches and tools==
 
===Neural networks===
{{main|neural network}}
In this approach the strength and type, excitatory or inhibitory, of synaptic connections are represented by the magnitude and sign of weights, that is, numerical [[coefficients]] <math>w'</math> in front of the inputs <math>x</math> to a particular neuron. The response of the <math>j</math>-th neuron is given by a sum of nonlinear, usually "[[sigmoid function|sigmoidal]]" functions <math>g</math> of the inputs as:
 
<math>f_{j}=\sum_{i}g\left(w_{ji}'x_{i}+b_{j}\right)</math>.
 
This response is then fed as input into other neurons and so on. The goal is to optimize the weights of the neurons so that the network produces a desired response at the output layer for a given set of inputs at the input layer. This optimization of the neuron weights is often performed using the [[backpropagation|backpropagation algorithm]] together with an optimization method such as [[gradient descent]] or [[Newton's method|Newton's method of optimization]]. Backpropagation compares the output of the network with the expected output from the training data, then updates the weights of each neuron to minimize that neuron's contribution to the total error of the network.
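A minimal sketch of this procedure: a two-input, two-hidden-unit, one-output sigmoidal network trained by backpropagation and gradient descent on the XOR problem. The architecture, learning rate, and training data are illustrative choices, and the starting weights are hand-picked near a known XOR solution so that plain gradient descent reliably converges:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# XOR training data for a tiny 2-2-1 network.
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

# Hand-picked initial weights near a known XOR solution (illustrative):
# hidden unit 0 approximates OR, hidden unit 1 approximates AND.
w1 = [[3.6, 3.6], [3.6, 3.6]]
b1 = [-1.8, -5.4]
w2 = [3.6, -7.2]
b2 = -1.8

def forward(x):
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
    return h, sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)

lr = 0.5
losses = []
for _ in range(2000):
    total = 0.0
    for x, t in data:
        h, y = forward(x)
        total += 0.5 * (y - t) ** 2
        # Backpropagation: error terms for the output and hidden units.
        dy = (y - t) * y * (1.0 - y)
        dh = [dy * w2[j] * h[j] * (1.0 - h[j]) for j in range(2)]
        # Gradient-descent updates, one training pattern at a time.
        for j in range(2):
            w2[j] -= lr * dy * h[j]
            for i in range(2):
                w1[j][i] -= lr * dh[j] * x[i]
            b1[j] -= lr * dh[j]
        b2 -= lr * dy
    losses.append(total)
```

Each epoch performs one gradient-descent update per training pattern; the recorded loss falls as the weights absorb the input-output mapping.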
 
===Genetic algorithms===
[[Genetic algorithms]] are used to evolve neural (and sometimes body) properties in a model brain-body-environment system so as to exhibit some desired behavioral performance. The evolved agents can then be subjected to a detailed analysis to uncover their principles of operation. Evolutionary approaches are particularly useful for exploring spaces of possible solutions to a given behavioral task because these approaches minimize a priori assumptions about how a given behavior ought to be instantiated. They can also be useful for exploring different ways to complete a computational neuroethology model when only partial neural circuitry is available for a biological system of interest.<ref>{{cite journal|title=Computational neuroethology|first1=Randall|last1=Beer|first2=Hillel|last2=Chiel|date=4 March 2008|volume=3|issue=3|doi=10.4249/scholarpedia.5307|journal=Scholarpedia|pages=5307|bibcode=2008SchpJ...3.5307B|doi-access=free}}</ref>
 
===NEURON===
The [[Neuron (software)|NEURON]] software, developed at Duke University, is a simulation environment for modeling individual neurons and networks of neurons.<ref>{{cite web|url=http://www.neuron.yale.edu/neuron/|title=NEURON - for empirically-based simulations of neurons and networks of neurons}}</ref> The NEURON environment is a self-contained environment allowing interface through its [[Graphical user interface|GUI]] or via scripting with [[hoc (programming language)|hoc]] or [[Python (programming language)|python]]. The NEURON simulation engine is based on a Hodgkin–Huxley type model using a Borg–Graham formulation. Several examples of models written in NEURON are available from the online database ModelDB.<ref>McDougal RA, Morse TM, Carnevale T, Marenco L, Wang R, Migliore M, Miller PL, Shepherd GM, Hines ML. Twenty years of ModelDB and beyond: building essential modeling tools for the future of neuroscience. J Comput Neurosci. 2017; 42(1):1–10.</ref>
 
==Embodiment in electronic hardware==
Nervous systems differ from the majority of silicon-based computing devices in that they resemble [[analog computer]]s (not [[digital data]] processors) and massively [[parallel computing|parallel]] processors, not [[von Neumann architecture|sequential]] processors. To model nervous systems accurately, in real time, alternative hardware is required.

===Conductance-based silicon neurons===
The most realistic circuits to date make use of [[analogue electronics|analog]] properties of existing [[digital electronics]] (operated under non-standard conditions) to realize Hodgkin–Huxley-type models ''in silico''.<ref>L. Alvadoa, J. Tomasa, S. Saghia, S. Renauda, T. Balb, A. Destexheb, G. Le Masson, 2004. Hardware computation of conductance-based neuron models. Neurocomputing 58–60 (2004) 109–115</ref><ref>{{cite journal|title=Silicon neurons|first1=Giacomo|last1=Indiveri|first2=Rodney|last2=Douglas|first3=Leslie|last3=Smith|date=29 March 2008|volume=3|issue=3|doi=10.4249/scholarpedia.1887|journal=Scholarpedia|pages=1887|bibcode=2008SchpJ...3.1887I|doi-access=free}}</ref>
===Retinomorphic chips===
<ref>Kwabena Boahen, "A Retinomorphic Chip with Parallel Pathways: Encoding INCREASING, ON, DECREASING, and OFF Visual Signals", Analog Integrated Circuits and Signal Processing, 30, 121–135, 2002</ref>
 
==See also==
{{div col|colwidth=22em}}
* [[Cognitive architecture]]
* [[Cognitive map]]
* [[Computational neuroscience]]
* [[Motion perception]]
* [[Neural coding]]
* [[Neural correlate]]
* [[Neural decoding]]
* [[Neuroethology]]
* [[Neuroinformatics]]
* [[Quantitative models of the action potential]]
* [[Spiking neural network]]
* [[Systems neuroscience]]
{{div col end}}
 
==References==
{{Reflist}}
 
==External links==
* [http://www.proberts.net/research/ Neural Dynamics at NSI] – Web page of Patrick D Roberts at the Neurological Sciences Institute
* [http://www.dickinson.caltech.edu/ Dickinson Lab] – Web page of the Dickinson group at Caltech which studies flight control in ''Drosophila''
 
{{Neuroethology}}
{{animal cognition}}
 
{{DEFAULTSORT:Models Of Neural Computation}}
[[Category:Computational neuroscience]]
[[Category:Neurophysiology]]
[[Category:Neuroethology]]
 