{{Correlation and covariance}}
 
The '''cross-correlation matrix''' of two [[random vector]]s is a matrix containing as elements the cross-correlations of all pairs of elements of the random vectors. The cross-correlation matrix is used in various digital signal processing algorithms.
==Definition==
For two [[random vector]]s <math>\mathbf{X} = (X_1,\ldots,X_m)^{\rm T}</math> and <math>\mathbf{Y} = (Y_1,\ldots,Y_n)^{\rm T}</math>, each containing [[random element]]s whose [[expected value]] and [[variance]] exist, the '''cross-correlation matrix''' of <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> is defined by<ref name=Gubner>{{cite book |first=John A. |last=Gubner |year=2006 |title=Probability and Random Processes for Electrical and Computer Engineers |publisher=Cambridge University Press |isbn=978-0-521-86470-1}}</ref>{{rp|p.337}}

{{Equation box 1
|indent =
|title=
|equation = <math>\operatorname{R}_{\mathbf{X}\mathbf{Y}} \triangleq\ \operatorname{E}[\mathbf{X} \mathbf{Y}^{\rm T}]</math>
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

and has dimensions <math>m \times n</math>. Written component-wise:

:<math>\operatorname{R}_{\mathbf{X}\mathbf{Y}} =
\begin{bmatrix}
\operatorname{E}[X_1 Y_1] & \operatorname{E}[X_1 Y_2] & \cdots & \operatorname{E}[X_1 Y_n] \\
\operatorname{E}[X_2 Y_1] & \operatorname{E}[X_2 Y_2] & \cdots & \operatorname{E}[X_2 Y_n] \\
\vdots & \vdots & \ddots & \vdots \\
\operatorname{E}[X_m Y_1] & \operatorname{E}[X_m Y_2] & \cdots & \operatorname{E}[X_m Y_n]
\end{bmatrix}
</math>
 
The random vectors <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> need not have the same dimension, and either might be a scalar value.
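In practice the expectation is estimated by a sample average over jointly observed draws. A minimal sketch (assuming NumPy; the sample count and variable names are illustrative, not taken from the cited reference):

```python
import numpy as np

rng = np.random.default_rng(0)

# N joint samples of an m-dimensional X and an n-dimensional Y,
# stored row-wise: one draw per row.
N, m, n = 10_000, 3, 2
X = rng.normal(size=(N, m))
Y = rng.normal(size=(N, n))

# Sample estimate of R_XY = E[X Y^T]: average the outer products
# x_k y_k^T over all draws, which is X^T Y / N in matrix form.
R_xy = X.T @ Y / N   # shape (m, n)
```

Because <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> are drawn independently with zero mean here, every entry of the estimate is close to zero.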
 
==Example==
For example, if <math>\mathbf{X} = \left( X_1,X_2,X_3 \right)^{\rm T}</math> and <math>\mathbf{Y} = \left( Y_1,Y_2 \right)^{\rm T}</math> are random vectors, then
<math>\operatorname{R}_{\mathbf{X}\mathbf{Y}}</math> is a <math>3 \times 2</math> matrix whose <math>(i,j)</math>-th entry is <math>\operatorname{E}[X_i Y_j]</math>.
 
==Complex random vectors==
If <math>\mathbf{Z} = (Z_1,\ldots,Z_m)^{\rm T}</math> and <math>\mathbf{W} = (W_1,\ldots,W_n)^{\rm T}</math> are [[complex random vector]]s, each containing random variables whose expected value and variance exist, the cross-correlation matrix of <math>\mathbf{Z}</math> and <math>\mathbf{W}</math> is defined by
 
:<math>\operatorname{R}_{\mathbf{Z}\mathbf{W}} \triangleq\ \operatorname{E}[\mathbf{Z} \mathbf{W}^{\rm H}]</math>
 
where <math>{}^{\rm H}</math> denotes [[Hermitian transpose|Hermitian transposition]].
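The same sample average applies in the complex case, provided the second factor is conjugated. A sketch under the same assumptions as above (NumPy, illustrative names):

```python
import numpy as np

rng = np.random.default_rng(1)
N, m, n = 5_000, 2, 2

# Complex random vectors with independent real and imaginary parts.
Z = rng.normal(size=(N, m)) + 1j * rng.normal(size=(N, m))
W = rng.normal(size=(N, n)) + 1j * rng.normal(size=(N, n))

# Sample estimate of R_ZW = E[Z W^H]: the Hermitian transpose
# conjugates W before forming the outer products.
R_zw = Z.T @ W.conj() / N   # complex (m, n) matrix
```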
 
==Uncorrelatedness==
Two random vectors <math>\mathbf{X}=(X_1,\ldots,X_m)^{\rm T} </math> and <math>\mathbf{Y}=(Y_1,\ldots,Y_n)^{\rm T} </math> are called '''uncorrelated''' if
:<math>\operatorname{E}[\mathbf{X} \mathbf{Y}^{\rm T}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T}.</math>
 
They are uncorrelated if and only if their cross-covariance matrix <math>\operatorname{K}_{\mathbf{X}\mathbf{Y}}</math> is a zero matrix.
 
Two [[complex random vector]]s <math>\mathbf{Z}</math> and <math>\mathbf{W}</math> are called uncorrelated if
:<math>\operatorname{E}[\mathbf{Z} \mathbf{W}^{\rm H}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm H}</math>
and
:<math>\operatorname{E}[\mathbf{Z} \mathbf{W}^{\rm T}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm T}.</math>
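The definition suggests a direct numerical check: compare a sample estimate of <math>\operatorname{E}[\mathbf{X} \mathbf{Y}^{\rm T}]</math> with the outer product of the sample means. A sketch (NumPy; the helper name and tolerance are ad-hoc choices, not standard API):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
X = rng.normal(loc=1.0, size=(N, 2))        # nonzero mean on purpose
Y_ind = rng.normal(loc=2.0, size=(N, 2))    # independent of X
Y_dep = X + rng.normal(size=(N, 2))         # built from X, so correlated

def uncorrelated(X, Y, tol=0.05):
    """Compare sample E[X Y^T] with E[X] E[Y]^T up to a tolerance."""
    R = X.T @ Y / len(X)
    outer_of_means = np.outer(X.mean(axis=0), Y.mean(axis=0))
    return np.allclose(R, outer_of_means, atol=tol)

print(uncorrelated(X, Y_ind))  # True: independence implies uncorrelatedness
print(uncorrelated(X, Y_dep))  # False: the two matrices differ
```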
 
==Properties==
===Relation to the cross-covariance matrix===
The cross-correlation is related to the ''cross-covariance matrix'' as follows:
:<math>\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{Y} - \operatorname{E}[\mathbf{Y}])^{\rm T}] = \operatorname{R}_{\mathbf{X}\mathbf{Y}} - \operatorname{E}[\mathbf{X}] \operatorname{E}[\mathbf{Y}]^{\rm T}</math>
Similarly, for complex random vectors:
:<math>\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\rm H}] = \operatorname{R}_{\mathbf{Z}\mathbf{W}} - \operatorname{E}[\mathbf{Z}] \operatorname{E}[\mathbf{W}]^{\rm H}</math>
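This identity holds exactly even for sample averages, since centering distributes over the sum. A quick numerical confirmation (NumPy, with illustrative data):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 50_000
X = rng.normal(loc=1.0, size=(N, 3))
Y = X[:, :2] + rng.normal(size=(N, 2))   # correlated with X by construction

# Cross-covariance computed directly from the centered samples ...
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
K_direct = Xc.T @ Yc / N

# ... and via the identity K_XY = R_XY - E[X] E[Y]^T.
R_xy = X.T @ Y / N
K_identity = R_xy - np.outer(X.mean(axis=0), Y.mean(axis=0))

print(np.allclose(K_direct, K_identity))  # True, up to floating-point rounding
```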
 
==See also==
*[[Autocorrelation]]
*[[Correlation does not imply causation]]
*[[Spearman's rank correlation coefficient]]
*[[Covariance function]]
*[[Pearson product-moment correlation coefficient]]
*[[Correlation function (astronomy)]]
*[[Correlation function (statistical mechanics)]]
*[[Correlation function (quantum field theory)]]
*[[Mutual information]]
*[[Rate distortion theory#Rate–distortion functions|Rate distortion theory]]
*[[Radial distribution function]]
 
==References==
{{reflist}}
 
==Further reading==
* Hayes, Monson H., ''Statistical Digital Signal Processing and Modeling'', John Wiley & Sons, Inc., 1996. {{ISBN|0-471-59431-8}}.
* Solomon W. Golomb and [[Guang Gong]]. [http://www.cambridge.org/us/academic/subjects/computer-science/cryptography-cryptology-and-coding/signal-design-good-correlation-wireless-communication-cryptography-and-radar Signal design for good correlation: for wireless communication, cryptography, and radar]. Cambridge University Press, 2005.
* M. Soltanalian. [http://theses.eurasip.org/theses/573/signal-design-for-active-sensing-and/download/ Signal Design for Active Sensing and Communications]. Uppsala Dissertations from the Faculty of Science and Technology (printed by Elanders Sverige AB), 2014.
 
[[Category:Covariance and correlation]]
[[Category:Time series]]
[[Category:Spatial analysis]]
[[Category:Matrices (mathematics)]]
[[Category:Signal processing]]