=== Phase-of-firing code ===
Phase-of-firing code is a neural coding scheme that combines the [[action potential|spike]] count code with a time reference based on [[Neural oscillations|oscillations]]. This type of code assigns a time label to each spike according to the phase of local ongoing oscillations at low<ref name="Montemurro" /> or high frequencies.<ref name="Gamma cycle">{{cite journal |vauthors=Fries P, Nikolić D, Singer W |title=The gamma cycle |journal=Trends Neurosci. |volume=30 |issue=7 |pages=309–16 |date=July 2007 |pmid=17555828 |doi=10.1016/j.tins.2007.05.005 |s2cid=3070167 }}</ref>
It has been shown that neurons in some cortical sensory areas encode rich naturalistic stimuli in terms of their spike times relative to the phase of ongoing network oscillatory fluctuations, rather than only in terms of their spike count.<ref name="Montemurro">{{cite journal|doi=10.1016/j.cub.2008.02.023|pmid=18328702|title=Phase-of-Firing Coding of Natural Visual Stimuli in Primary Visual Cortex|journal=Current Biology|volume=18|issue=5|pages=375–380|year=2008|last1=Montemurro|first1=Marcelo A.|last2=Rasch|first2=Malte J.|last3=Murayama|first3=Yusuke|last4=Logothetis|first4=Nikos K.|last5=Panzeri|first5=Stefano|doi-access=free|bibcode=2008CBio...18..375M }}</ref><ref>[http://pop.cerco.ups-tlse.fr/fr_vers/documents/thorpe_sj_90_91.pdf Spike arrival times: A highly efficient coding scheme for neural networks] {{webarchive|url=https://web.archive.org/web/20120215151304/http://pop.cerco.ups-tlse.fr/fr_vers/documents/thorpe_sj_90_91.pdf |date=2012-02-15 }}, SJ Thorpe - Parallel processing in neural systems, 1990</ref> The [[local field potential]] signals reflect these population (network) oscillations. The phase-of-firing code is often categorized as a temporal code, although the time label used for spikes (i.e. the network oscillation phase) is a low-resolution (coarse-grained) reference for time. As a result, for low-frequency oscillations, often only four discrete phase values are enough to represent all the information content of this kind of code. The phase-of-firing code is loosely based on the [[Place cell#Phase precession|phase precession]] phenomenon observed in place cells of the [[hippocampus]]. Another feature of this code is that neurons within a group of sensory neurons adhere to a preferred order of spiking, resulting in a firing sequence.<ref name="Firing sequences">{{cite journal |vauthors=Havenith MN, Yu S, Biederlack J, Chen NH, Singer W, Nikolić D |title=Synchrony makes neurons fire in sequence, and stimulus properties determine who is ahead |journal=J. Neurosci. |volume=31 |issue=23 |pages=8570–84 |date=June 2011 |pmid=21653861 |pmc=6623348 |doi=10.1523/JNEUROSCI.2817-10.2011 |doi-access=free }}</ref>
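A minimal sketch of how such a phase label might be computed from data is shown below. The simulated local field potential, spike times, the 4–12 Hz band, and the choice of four phase quadrants are illustrative assumptions for the example, not parameters taken from the cited studies.

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Illustrative assumption: 10 s of a simulated local field potential (LFP)
# sampled at 1 kHz, plus a hypothetical spike train (times in seconds).
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
lfp = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.randn(t.size)  # noisy 6 Hz oscillation
spike_times = np.sort(np.random.uniform(0, 10, size=200))

# Band-pass the LFP around the low-frequency oscillation (here 4-12 Hz)
# and extract its instantaneous phase with the Hilbert transform.
b, a = butter(3, [4 / (fs / 2), 12 / (fs / 2)], btype="band")
phase = np.angle(hilbert(filtfilt(b, a, lfp)))   # phase in radians, -pi..pi

# Label every spike with the oscillation phase at the time it occurred,
# then coarse-grain the label into four discrete phase quadrants.
spike_idx = np.clip(np.searchsorted(t, spike_times), 0, t.size - 1)
spike_phase = phase[spike_idx]
phase_bin = np.digitize(spike_phase, bins=[-np.pi / 2, 0, np.pi / 2])  # values 0..3

# The phase-of-firing code for this neuron is then the spike count per phase bin.
print(np.bincount(phase_bin, minlength=4))
</syntaxhighlight>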
The phase code in visual cortex has also been shown to involve [[High frequency oscillations|high-frequency oscillations]].<ref name="Firing sequences" /> Within a cycle of gamma oscillation, each neuron has its own preferred relative firing time. As a result, an entire population of neurons generates a firing sequence with a duration of up to about 15 ms.<ref name="Firing sequences"/>
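A rough back-of-the-envelope illustration of such a sequence: at an assumed gamma frequency of 40 Hz one cycle lasts 25 ms, so neurons whose preferred phases span a bit more than half a cycle fire in a fixed order over roughly 10–15 ms. The frequency and the preferred phases below are made-up values for the example, not data from the cited study.

<syntaxhighlight lang="python">
import numpy as np

gamma_freq = 40.0               # Hz; assumed gamma frequency
cycle_ms = 1000.0 / gamma_freq  # one gamma cycle lasts 25 ms

# Hypothetical preferred firing phases (radians) of five neurons within the cycle.
preferred_phase = np.array([0.3, 1.1, 1.9, 2.7, 3.5])

# Convert each preferred phase to a time offset within the cycle; sorting the
# offsets gives a fixed firing order, i.e. a firing sequence.
offset_ms = preferred_phase / (2 * np.pi) * cycle_ms
print("firing order:", np.argsort(offset_ms))
print("sequence duration (ms):", offset_ms.max() - offset_ms.min())  # ~12.7 ms here
</syntaxhighlight>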
=== Sparse coding ===
In a sparse code, each item is encoded by the strong activation of a relatively small set of neurons, with a different subset of the available neurons used for each item to be encoded. In contrast to sensor-sparse coding, sensor-dense coding implies that all information from possible sensor locations is known.
Sparseness may refer to temporal sparseness (a relatively small number of time periods are active) or to sparseness in an activated population of neurons. In the latter case, sparseness may be defined, for one time period, as the number of activated neurons relative to the total number of neurons in the population. This seems to be a hallmark of neural computation: compared to traditional computers, information is massively distributed across neurons. Sparse coding of natural images produces [[wavelet]]-like oriented filters that resemble the [[receptive field]]s of simple cells in the visual cortex.<ref>{{cite journal | last1 = Olshausen | first1 = Bruno A | last2 = Field | first2 = David J | year = 1996 | title = Emergence of simple-cell receptive field properties by learning a sparse code for natural images | url = http://www.cs.ubc.ca/~little/cpsc425/olshausen_field_nature_1996.pdf | journal = Nature | volume = 381 | issue = 6583 | pages = 607–609 | doi = 10.1038/381607a0 | pmid = 8637596 | bibcode = 1996Natur.381..607O | s2cid = 4358477 | access-date = 2016-03-29 | archive-url = https://web.archive.org/web/20151123113216/http://www.cs.ubc.ca/~little/cpsc425/olshausen_field_nature_1996.pdf | archive-date = 2015-11-23 | url-status = dead }}</ref> The capacity of sparse codes may be increased by simultaneous use of temporal coding, as found in the locust olfactory system.<ref>{{cite journal|last1=Gupta|first1=N|last2=Stopfer|first2=M|title=A temporal channel for information in sparse sensory coding.|journal=Current Biology|date=6 October 2014|volume=24|issue=19|pages=2247–56|pmid=25264257|doi=10.1016/j.cub.2014.08.021|pmc=4189991|bibcode=2014CBio...24.2247G}}</ref>
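The two notions of sparseness can be made concrete with a small sketch on a made-up binary activity matrix; the population size, number of time periods, and activation probability below are illustrative assumptions.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative assumption: binary activity of 100 neurons over 50 time periods,
# where a 1 means the neuron was activated in that period.
rng = np.random.default_rng(0)
activity = (rng.random((100, 50)) < 0.05).astype(int)  # ~5% of neurons active per period

# Population sparseness in each time period: activated neurons / total neurons.
population_sparseness = activity.sum(axis=0) / activity.shape[0]

# Temporal sparseness of each neuron: active time periods / total time periods.
temporal_sparseness = activity.sum(axis=1) / activity.shape[1]

print(population_sparseness.mean(), temporal_sparseness.mean())
</syntaxhighlight>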
Given a potentially large set of input patterns, sparse coding algorithms (e.g. [[Autoencoder#Sparse autoencoder (SAE)|sparse autoencoder]]) attempt to automatically find a small number of representative patterns which, when combined in the right proportions, reproduce the original input patterns. The sparse coding for the input then consists of those representative patterns. For example, the very large set of English sentences can be encoded by a small number of symbols (i.e. letters, numbers, punctuation, and spaces) combined in a particular order for a particular sentence, and so a sparse coding for English would be those symbols.
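As an illustration, a minimal sparse coding sketch using dictionary learning (one standard sparse coding algorithm, here via scikit-learn rather than a sparse autoencoder) on synthetic data might look as follows; the patch size, number of dictionary atoms, and regularization strength are arbitrary choices for the example.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.decomposition import DictionaryLearning

# Illustrative assumption: 500 input patterns, each an 8x8 patch flattened to
# 64 values (standing in for natural-image patches).
rng = np.random.default_rng(0)
patches = rng.standard_normal((500, 64))

# Learn a small set of representative patterns (dictionary atoms); each input
# is then reconstructed as a combination of only a few of them.
learner = DictionaryLearning(n_components=32, transform_algorithm="lasso_lars",
                             transform_alpha=1.0, max_iter=100, random_state=0)
codes = learner.fit_transform(patches)   # sparse code: mostly zero coefficients
dictionary = learner.components_         # the representative patterns

print(codes.shape, np.mean(codes != 0))  # fraction of non-zero coefficients
</syntaxhighlight>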