Cross-correlation matrix

{{Other uses2|Correlation function}}
[[File:Comparison_convolution_correlation.svg|thumb|300px|Visual comparison of [[convolution]], [[cross-correlation]] and [[autocorrelation]].]]
The '''cross-correlation matrix''' of two [[random vector|random vectors]] is a matrix containing as elements the cross-correlations of all pairs of elements of the random vectors. The cross-correlation matrix is used in various digital signal processing algorithms.
 
More generally, a '''correlation function''' is a [[function (mathematics)|function]] that gives the statistical [[correlation]] between [[random variable]]s as a function of the spatial or temporal distance between those variables. If the correlation function relates random variables representing the same quantity measured at two different points, it is often referred to as an [[autocorrelation function]], which is made up of [[autocorrelation]]s. Correlation functions of different random variables are sometimes called '''cross-correlation functions''', both to emphasize that different variables are being considered and because they are made up of [[cross-correlation]]s.
 
Correlation functions are a useful indicator of dependencies as a function of distance in time or space, and they can be used to assess the distance required between sample points for the values to be effectively uncorrelated. In addition, they can form the basis of rules for interpolating values at points for which there are no observations.
 
Correlation functions used in [[correlation function (astronomy)|astronomy]], [[financial analysis]], [[econometrics]], and [[statistical mechanics]] differ only in the particular stochastic processes they are applied to. In [[quantum field theory]] there are [[Correlation function (quantum field theory)|correlation functions over quantum distributions]].
 
==Definition==
For two [[random vector]]s <math>\mathbf{X} = (X_1,\ldots,X_m)^{\rm T}</math> and <math>\mathbf{Y} = (Y_1,\ldots,Y_n)^{\rm T}</math>, each containing [[random element]]s whose [[expected value]] and [[variance]] exist, the '''cross-correlation matrix''' of <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> is defined by
 
{{Equation box 1
|indent =
|title=
|equation = <math>\operatorname{R}_{\mathbf{X}\mathbf{Y}} \stackrel{\mathrm{def}}{=}\ \operatorname{E}[\mathbf{X} \mathbf{Y}^{\rm T}]</math>
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}
 
and has dimensions <math>m \times n</math>. Written component-wise:
 
:<math>\operatorname{R}_{\mathbf{X}\mathbf{Y}} =
\begin{bmatrix}
\operatorname{E}[X_1 Y_1] & \operatorname{E}[X_1 Y_2] & \cdots & \operatorname{E}[X_1 Y_n] \\ \\
\operatorname{E}[X_2 Y_1] & \operatorname{E}[X_2 Y_2] & \cdots & \operatorname{E}[X_2 Y_n] \\ \\
\vdots & \vdots & \ddots & \vdots \\ \\
\operatorname{E}[X_m Y_1] & \operatorname{E}[X_m Y_2] & \cdots & \operatorname{E}[X_m Y_n] \\ \\
\end{bmatrix}
</math>
 
The random vectors <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> need not have the same dimension, and either might be a scalar value.
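In practice, <math>\operatorname{R}_{\mathbf{X}\mathbf{Y}}</math> is estimated by averaging outer products of samples. The following NumPy sketch (variable names are illustrative, not part of any standard API) estimates the matrix from joint draws stored one sample per column, so the average of outer products reduces to a single matrix product:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000                      # number of joint samples
X = rng.normal(size=(3, N))      # m = 3 components, one sample per column
Y = rng.normal(size=(2, N))      # n = 2 components, independent of X

# Sample estimate of R_XY = E[X Y^T]: the average of the outer
# products x_k y_k^T, which for column-sample matrices is (1/N) X Y^T.
R_xy = (X @ Y.T) / N

print(R_xy.shape)   # (3, 2): an m-by-n matrix, as in the definition
```

Since <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> are drawn independently with zero mean here, the entries of the estimate are close to zero.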
 
==Example==
For example, if <math>\mathbf{X} = \left( X_1,X_2,X_3 \right)^{\rm T}</math> and <math>\mathbf{Y} = \left( Y_1,Y_2 \right)^{\rm T}</math> are random vectors, then
<math>\operatorname{R}_{\mathbf{X}\mathbf{Y}}</math> is a <math>3 \times 2</math> matrix whose <math>(i,j)</math>-th entry is <math>\operatorname{E}[X_i Y_j]</math>.
 
==Cross-correlation matrix of complex random vectors==
If <math>\mathbf{Z} = (Z_1,\ldots,Z_m)^{\rm T}</math> and <math>\mathbf{W} = (W_1,\ldots,W_n)^{\rm T}</math> are [[complex random vector|complex random vectors]], each containing random variables whose expected value and variance exist, the cross-correlation matrix of <math>\mathbf{Z}</math> and <math>\mathbf{W}</math> is defined by
 
:<math>\operatorname{R}_{\mathbf{Z}\mathbf{W}} \stackrel{\mathrm{def}}{=}\ \operatorname{E}[\mathbf{Z} \mathbf{W}^{\rm H}]</math>
 
where <math>{}^{\rm H}</math> denotes [[Hermitian transpose|Hermitian transposition]].
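The only change from the real case is that the Hermitian transpose replaces the ordinary transpose. A minimal NumPy sketch (illustrative names) of the corresponding sample estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50_000
Z = rng.normal(size=(2, N)) + 1j * rng.normal(size=(2, N))
W = rng.normal(size=(3, N)) + 1j * rng.normal(size=(3, N))

# Sample estimate of R_ZW = E[Z W^H]; .conj().T is the Hermitian transpose.
R_zw = (Z @ W.conj().T) / N

print(R_zw.shape)   # (2, 3)
```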
 
==Uncorrelatedness==
Two random vectors <math>\mathbf{X}=(X_1,\ldots,X_m)^{\rm T}</math> and <math>\mathbf{Y}=(Y_1,\ldots,Y_n)^{\rm T}</math> are called '''uncorrelated''' if
:<math>\operatorname{E}[\mathbf{X} \mathbf{Y}^{\rm T}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T}</math>.
 
They are uncorrelated if and only if their cross-covariance matrix <math>\operatorname{K}_{\mathbf{X}\mathbf{Y}}</math> is a zero matrix.
 
Two [[complex random vector|complex random vectors]] <math>\mathbf{Z}</math> and <math>\mathbf{W}</math> are called '''uncorrelated''' if
:<math>\operatorname{E}[\mathbf{Z} \mathbf{W}^{\rm H}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm H}</math>
and
:<math>\operatorname{E}[\mathbf{Z} \mathbf{W}^{\rm T}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm T}</math>.
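The defining identity can be checked numerically. In this NumPy sketch (illustrative names, not from any standard library), two random vectors generated independently satisfy the identity up to Monte Carlo error:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
X = rng.normal(loc=1.0, size=(3, N))    # independent of Y by construction
Y = rng.normal(loc=-2.0, size=(2, N))

lhs = (X @ Y.T) / N                              # estimate of E[X Y^T]
rhs = np.outer(X.mean(axis=1), Y.mean(axis=1))   # E[X] E[Y]^T

# For uncorrelated vectors the two sides agree (up to sampling error).
print(np.max(np.abs(lhs - rhs)))
```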
 
==Properties==
* The ''cross-covariance matrix'' is related to the cross-correlation matrix as follows:
:<math>\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{Y} - \operatorname{E}[\mathbf{Y}])^{\rm T}] = \operatorname{R}_{\mathbf{X}\mathbf{Y}} - \operatorname{E}[\mathbf{X}] \operatorname{E}[\mathbf{Y}]^{\rm T}</math>
: Analogously, for complex random vectors:
:<math>\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\rm H}] = \operatorname{R}_{\mathbf{Z}\mathbf{W}} - \operatorname{E}[\mathbf{Z}] \operatorname{E}[\mathbf{W}]^{\rm H}</math>
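For sample estimates (with all means taken over the same samples) this identity holds exactly, not just in expectation, which the following NumPy sketch (illustrative names) verifies:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000
X = rng.normal(size=(3, N))
Y = X[:2] + 0.5 * rng.normal(size=(2, N))   # correlated with X by construction

mX, mY = X.mean(axis=1), Y.mean(axis=1)

# Left side: sample cross-covariance E[(X - E[X])(Y - E[Y])^T]
K = ((X - mX[:, None]) @ (Y - mY[:, None]).T) / N
# Right side: R_XY - E[X] E[Y]^T
R = (X @ Y.T) / N
K_from_R = R - np.outer(mX, mY)

print(np.allclose(K, K_from_R))   # the identity holds exactly for samples
```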
 
==References==
* Hayes, Monson H., ''Statistical Digital Signal Processing and Modeling'', John Wiley & Sons, Inc., 1996. {{ISBN|0-471-59431-8}}.
* Solomon W. Golomb, and Guang Gong. [http://www.cambridge.org/us/academic/subjects/computer-science/cryptography-cryptology-and-coding/signal-design-good-correlation-wireless-communication-cryptography-and-radar Signal design for good correlation: for wireless communication, cryptography, and radar]. Cambridge University Press, 2005.
* M. Soltanalian. [http://theses.eurasip.org/theses/573/signal-design-for-active-sensing-and/download/ Signal Design for Active Sensing and Communications]. Uppsala Dissertations from the Faculty of Science and Technology (printed by Elanders Sverige AB), 2014.
 
==See also==
* [[Autocorrelation matrix]]
* [[Cross-correlation]]
[[Category:Time series]]
[[Category:Spatial data analysis]]
[[Category:Matrices]]
[[Category:Signal processing]]