=== EEG / MEG ===
DCM for EEG and MEG data uses more biologically detailed neural models than DCM for fMRI, owing to the higher temporal resolution of these measurement techniques. These models can be classed into physiological models, which recapitulate neural circuitry, and phenomenological models, which focus on reproducing particular data features. The physiological models can be further subdivided into two classes. [http://www.scholarpedia.org/article/Conductance-based_models Conductance-based models] derive from the equivalent circuit representation of the cell membrane developed by Hodgkin and Huxley in the 1950s.<ref name="Hodgkin 1952">{{Cite journal|last1=Hodgkin|first1=A. L.|last2=Huxley|first2=A. F.|date=1952-04-28|title=The components of membrane conductance in the giant axon of Loligo|journal=The Journal of Physiology|volume=116|issue=4|pages=473–496|doi=10.1113/jphysiol.1952.sp004718|pmid=14946714|issn=0022-3751|pmc=1392209}}</ref> Convolution models were introduced by [[Wilson–Cowan model|Wilson & Cowan]]<ref>{{Cite journal|author2-link=Jack D. Cowan|last1=Wilson|first1=H. R.|last2=Cowan|first2=J. D.|date=September 1973|title=A mathematical theory of the functional dynamics of cortical and thalamic nervous tissue|journal=Kybernetik|volume=13|issue=2|pages=55–80|doi=10.1007/bf00288786|pmid=4767470|s2cid=292546|issn=0340-1200}}</ref> and Freeman<ref>{{Cite book|date=1975|title=Mass Action in the Nervous System|doi=10.1016/c2009-0-03145-6|isbn=9780122671500|last1=Freeman|first1=Walter J}}</ref> in the 1970s and involve a convolution of pre-synaptic input by a synaptic kernel function. Some of the specific models used in DCM are as follows:
* Physiological models:
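The convolution of pre-synaptic input with a synaptic kernel can be sketched numerically. The following is a minimal illustration, assuming an alpha-function kernel; the amplitude <code>H</code> and time constant <code>tau</code> are hypothetical values for illustration only, not taken from any particular DCM:

```python
import numpy as np

# Sketch of a convolution-model synapse: pre-synaptic spikes are
# convolved with an alpha-function kernel to give the post-synaptic
# potential. H and tau are hypothetical, illustrative values.
dt = 1e-3                                   # 1 ms time step (s)
t = np.arange(0.0, 0.5, dt)                 # 0.5 s simulation window
H, tau = 3.25e-3, 10e-3                     # kernel amplitude (V) and time constant (s)
h = (H * t / tau) * np.exp(1.0 - t / tau)   # alpha kernel; peaks at t = tau with value H
u = np.zeros_like(t)
u[[50, 120, 300]] = 1.0                     # pre-synaptic spikes at 50, 120, 300 ms
v = np.convolve(u, h)[: t.size]             # post-synaptic potential
```

Each spike thus produces a stereotyped post-synaptic response of amplitude <code>H</code>, and responses to successive spikes superpose linearly.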
== Model estimation ==
Model inversion or estimation is implemented in DCM using [[Variational Bayesian methods|variational Bayes]] under the [[Laplace's method|Laplace assumption]].<ref>{{Citation|last1=Friston|first1=K.|date=2007|pages=606–618|publisher=Elsevier|isbn=9780123725608|last2=Mattout|first2=J.|last3=Trujillo-Barreto|first3=N.|last4=Ashburner|first4=J.|last5=Penny|first5=W.|doi=10.1016/b978-012372560-8/50047-4|chapter=Variational Bayes under the Laplace approximation|title=Statistical Parametric Mapping}}</ref> This provides two useful quantities. The first is the log marginal likelihood or model evidence, <math>\ln{p(y|m)}</math>, the probability of observing the data under a given model. Generally, this cannot be calculated explicitly, so it is approximated by a quantity called the negative variational free energy <math>F</math>, referred to in machine learning as the evidence lower bound (ELBO). Hypotheses are tested by comparing the evidence for different models based on their free energy, a procedure called Bayesian model comparison.
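With uniform prior probabilities over models, comparing models by their free energies amounts to a softmax over the approximate log evidences. A minimal sketch, using hypothetical free-energy values rather than results from any real analysis:

```python
import numpy as np

# Fixed-effects Bayesian model comparison: with a uniform prior over
# models, posterior model probabilities are a softmax of the free
# energies (approximate log evidences). F values are hypothetical.
F = np.array([-120.3, -118.1, -125.0])  # free energy of each candidate model
p = np.exp(F - F.max())                 # subtract max for numerical stability
p /= p.sum()                            # posterior model probabilities
best = int(np.argmax(p))                # model with the most evidence
```

A difference of about three in free energy corresponds to roughly 20:1 odds in favour of the better model, which is why model 1 dominates here despite the modest-looking gap.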
The second quantity is the approximate posterior over the parameters, <math>p(\theta|y)</math>, for example connection strengths, obtained by maximising the free energy. Where models differ only in their priors, [[Bayesian model reduction|Bayesian Model Reduction]] can be used to derive the evidence and parameters of nested or reduced models analytically and efficiently.
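For Gaussian priors and posteriors, the evidence and posterior of a reduced model follow in closed form from the full model's prior and posterior. A minimal NumPy sketch of this analytic result; the argument names (prior <code>pE, pC</code>; posterior <code>qE, qC</code>; reduced prior <code>rE, rC</code>) are illustrative conventions, not a specific software interface:

```python
import numpy as np

def bayesian_model_reduction(qE, qC, pE, pC, rE, rC):
    """Analytic Bayesian model reduction for Gaussian densities.

    Given the full model's prior N(pE, pC) and posterior N(qE, qC),
    and a reduced prior N(rE, rC), returns (dF, sE, sC): the change
    in log evidence, ln p(y|reduced) - ln p(y|full), and the reduced
    posterior mean and covariance.
    """
    qP, pP, rP = np.linalg.inv(qC), np.linalg.inv(pC), np.linalg.inv(rC)
    sP = qP + rP - pP                        # reduced posterior precision
    sC = np.linalg.inv(sP)
    sE = sC @ (qP @ qE + rP @ rE - pP @ pE)  # reduced posterior mean
    # Log-evidence change from the Gaussian integral of q(theta) r(theta) / p(theta)
    dF = 0.5 * (np.linalg.slogdet(qP)[1] + np.linalg.slogdet(rP)[1]
                - np.linalg.slogdet(sP)[1] - np.linalg.slogdet(pP)[1]) \
       + 0.5 * (sE @ sP @ sE - qE @ qP @ qE
                - rE @ rP @ rE + pE @ pP @ pE)
    return dF, sE, sC
```

For example, setting the reduced prior variance of one connection close to zero scores the hypothesis that the connection is absent, without re-estimating the model.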
== Further reading ==
{{Scholia}}
* [http://www.scholarpedia.org/article/Dynamic_causal_modeling Dynamic Causal Modelling on Scholarpedia]
* Understanding DCM: ten simple rules for the clinician<ref>{{Cite journal|last1=Kahan|first1=Joshua|last2=Foltynie|first2=Tom|date=December 2013|title=Understanding DCM: Ten simple rules for the clinician|journal=NeuroImage|volume=83|pages=542–549|doi=10.1016/j.neuroimage.2013.07.008|pmid=23850463|issn=1053-8119|doi-access=free}}</ref>