{{Short description|Correlation as a function of distance}}
{{Other uses}}
{{More citations needed|date=December 2023}}
[[File:Comparison_convolution_correlation.svg|thumb|300px|Visual comparison of [[convolution]], [[cross-correlation]] and [[autocorrelation]].]]
A '''correlation function''' is a [[function (mathematics)|function]] that gives the statistical [[correlation]] between [[random variable]]s, contingent on the spatial or temporal distance between those variables.<ref>{{cite book
| last1=Pal | first1=Manoranjan
| last2=Bharati | first2=Premananda
| date=2019
| chapter=Introduction to Correlation and Linear Regression Analysis
| title=Applications of Regression Techniques
| url=https://link.springer.com/chapter/10.1007/978-981-13-9314-3_1
| publisher=Springer, Singapore
| pages=1–18
| doi=10.1007/978-981-13-9314-3_1
| access-date=December 14, 2023}}</ref> If one considers the correlation function between random variables representing the same quantity measured at two different points, then this is often referred to as an [[autocorrelation function]], which is made up of [[autocorrelation]]s. Correlation functions of different random variables are sometimes called '''cross-correlation functions''' to emphasize that different variables are being considered and because they are made up of [[cross-correlation]]s.
 
Correlation functions are a useful indicator of dependencies as a function of distance in time or space, and they can be used to assess the distance required between sample points for the values to be effectively uncorrelated. In addition, they can form the basis of rules for interpolating values at points for which there are no observations.
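As an illustration of this use, the sample autocorrelation of a simulated first-order [[autoregressive model|autoregressive (AR(1))]] process can be estimated as a function of lag; the AR(1) model and all parameter values below are illustrative choices, not taken from the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) process x_t = phi * x_{t-1} + noise.
# Its theoretical autocorrelation at lag k is phi**k, so dependence
# decays with temporal distance and is near zero at large lags.
phi, n = 0.8, 100_000
noise = rng.standard_normal(n)
x = np.empty(n)
x[0] = noise[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + noise[t]

def autocorr(series, lag):
    """Sample autocorrelation C(lag) / C(0) of a 1-D array."""
    s = series - series.mean()
    if lag == 0:
        return 1.0
    return np.dot(s[:-lag], s[lag:]) / np.dot(s, s)

for lag in (0, 1, 5, 20, 50):
    print(lag, autocorr(x, lag))  # decays roughly like phi**lag
```

At large enough lags the estimated autocorrelation is statistically indistinguishable from zero, which is the sense in which sample points far enough apart are "effectively uncorrelated".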
*'''rotational symmetry''' in addition to the above gives ''C''(''s'', ''s''<nowiki>'</nowiki>) = ''C''(|''s''&nbsp;&minus;&nbsp;''s''<nowiki>'</nowiki>|) where |''x''| denotes the norm of the vector ''x'' (for actual rotations this is the Euclidean or 2-norm).
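A standard concrete example satisfying both symmetries (the squared-exponential form is an illustrative choice, not taken from the text above) is the Gaussian covariance function

```latex
C(s, s') = \sigma^2 \exp\!\left( -\frac{\lVert s - s' \rVert^2}{2\ell^2} \right)
```

which depends on ''s'' and ''s''<nowiki>'</nowiki> only through the Euclidean norm of their difference, and is therefore invariant under both translations and rotations; <math>\sigma^2</math> sets the overall variance and <math>\ell</math> the distance over which values remain appreciably correlated.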
 
Higher-order correlation functions are often defined. A typical correlation function of order ''n'' is (the angle brackets <math>\langle\cdot\rangle</math> denote the [[expectation value]], i.e. the average over the ensemble of realizations)
 
:<math>C_{i_1i_2\cdots i_n}(s_1,s_2,\cdots,s_n) = \langle X_{i_1}(s_1) X_{i_2}(s_2) \cdots X_{i_n}(s_n)\rangle.</math>
 
If the random vector has only one component variable, then the indices <math>i,j</math> are redundant. If there are symmetries, then the correlation function can be broken up into [[irreducible representation]]s of the symmetries &mdash; both internal and spacetime.
 
==Properties of probability distributions==
With these definitions, the study of correlation functions is similar to the study of [[probability distributions]]. Many stochastic processes can be completely characterized by their correlation functions; the most notable example is the class of [[Gaussian processes]].
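For a zero-mean Gaussian process this characterization is explicit: by [[Isserlis' theorem]] (Wick's theorem), every higher-order correlation function reduces to a sum of products of two-point functions. For example, at fourth order,

```latex
\langle X_1 X_2 X_3 X_4 \rangle
  = \langle X_1 X_2 \rangle \langle X_3 X_4 \rangle
  + \langle X_1 X_3 \rangle \langle X_2 X_4 \rangle
  + \langle X_1 X_4 \rangle \langle X_2 X_3 \rangle,
  \qquad X_i \equiv X(s_i),
```

while all odd-order correlation functions vanish, so the two-point function determines the entire process.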
 
Probability distributions defined on a finite number of points can always be normalized, but when these are defined over continuous spaces, then extra care is called for. The study of such distributions started with the study of [[random walk]]s and led to the notion of the [[Itō calculus]].
 
The Feynman [[path integral formulation|path integral]] in Euclidean space generalizes this to other problems of interest to [[statistical mechanics]]. Any probability distribution which obeys a condition on correlation functions called [[reflection positivity]] leads to a local [[quantum field theory]] after [[Wick rotation]] to [[Minkowski spacetime]] (see [[Osterwalder–Schrader axioms]]). The operation of [[renormalization]] is a specified set of mappings from the space of probability distributions to itself. A [[quantum field theory]] is called renormalizable if this mapping has a fixed point which gives a quantum field theory.
 
==See also==
*{{annotated link|Autocorrelation}}
*{{annotated link|Correlation does not imply causation}}
*{{annotated link|Correlogram}}
*{{annotated link|Covariance function}}
*{{annotated link|Pearson product-moment correlation coefficient}}
*{{annotated link|Correlation function (astronomy)}}
*{{annotated link|Correlation function (statistical mechanics)}}
*{{annotated link|Correlation function (quantum field theory)}}
*{{annotated link|Mutual information}}
*{{annotated link|Rate distortion theory#Rate–distortion functions|Rate distortion theory}}
*{{annotated link|Radial distribution function}}
 
==References==
{{reflist}}
 
{{Statistical mechanics topics}}
{{Authority control}}
 
{{DEFAULTSORT:Correlation Function}}
[[Category:Covariance and correlation]]
[[Category:Time series]]
[[Category:Spatial data analysis]]