Distance correlation: Difference between revisions

 
===Distance correlation===
{{Ordered list |list_style_type=lower-roman
 
| <math>0\leq\operatorname{dCor}_n(X,Y)\leq1</math> and <math>0\leq\operatorname{dCor}(X,Y)\leq1</math>; this is in contrast to Pearson's correlation, which can be negative.
| <math>\operatorname{dCor}(X,Y) = 0</math> if and only if {{mvar|X}} and {{mvar|Y}} are independent.
| <math>\operatorname{dCor}_n(X,Y) = 1</math> implies that the dimensions of the linear subspaces spanned by the {{mvar|X}} and {{mvar|Y}} samples, respectively, are almost surely equal, and if we assume that these subspaces are equal, then in this subspace <math>Y = A + b\,\mathbf{C}X</math> for some vector {{mvar|A}}, scalar {{mvar|b}}, and [[orthonormal matrix]] <math>\mathbf{C}</math> (a numerical illustration follows the list).
}}
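The following is a minimal NumPy sketch of these properties, assuming the double-centering (V-statistic) form of the sample statistics; the helper names <code>dcov2_n</code> and <code>dcor_n</code> are illustrative only, not a standard library API.

<syntaxhighlight lang="python">
import numpy as np

def dist_matrix(Z):
    """Pairwise Euclidean distance matrix of the rows of Z (n x d)."""
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def double_center(D):
    """Subtract row and column means and add back the grand mean."""
    return D - D.mean(axis=0) - D.mean(axis=1, keepdims=True) + D.mean()

def dcov2_n(X, Y):
    """Squared sample distance covariance (V-statistic form)."""
    A, B = double_center(dist_matrix(X)), double_center(dist_matrix(Y))
    return (A * B).mean()

def dcor_n(X, Y):
    """Sample distance correlation."""
    denom = np.sqrt(dcov2_n(X, X) * dcov2_n(Y, Y))
    return np.sqrt(dcov2_n(X, Y) / denom) if denom > 0 else 0.0

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

# Property (iii): Y = A + b C X with C orthonormal gives dCor_n(X, Y) = 1.
C, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthonormal matrix
Y = 2.0 + 0.5 * X @ C.T
print(dcor_n(X, Y))                            # ~1.0 (up to rounding)

# Property (i): the statistic always lies in [0, 1].
print(dcor_n(X, rng.normal(size=(200, 2))))    # small, but in [0, 1]
</syntaxhighlight>

Because the pairwise distances of <math>Y = A + b\,\mathbf{C}X</math> are exact multiples of those of {{mvar|X}}, the two double-centered matrices are proportional and the sample statistic attains its maximum value of 1 up to floating-point error.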
===Distance covariance===
{{Ordered list |list_style_type=lower-roman
| <math>\operatorname{dCov}^2(X,Y)\geq0</math> and <math>\operatorname{dCov}^2_n(X,Y)\geq0</math>;
| <math>\operatorname{dCov}^2(a_1 + b_1\,\mathbf{C}_1\,X, a_2 + b_2\,\mathbf{C}_2\,Y) = |b_1\,b_2|\operatorname{dCov}^2(X,Y)</math> for all constant vectors <math>a_1, a_2</math>, scalars <math>b_1, b_2</math>, and orthonormal matrices <math>\mathbf{C}_1, \mathbf{C}_2</math>.
 
| If the random vectors <math>(X_1, Y_1)</math> and <math>(X_2, Y_2)</math> are independent then
:<math>
\operatorname{dCov}(X_1 + X_2, Y_1 + Y_2) \leq \operatorname{dCov}(X_1, Y_1) + \operatorname{dCov}(X_2, Y_2).
</math>
Equality holds if and only if <math>X_1</math> and <math>Y_1</math> are both constants, or <math>X_2</math> and <math>Y_2</math> are both constants, or <math>X_1, X_2, Y_1, Y_2</math> are mutually independent.
 
| <math>\operatorname{dCov}(X,Y) = 0</math> if and only if {{mvar|X}} and {{mvar|Y}} are independent.
}}
 
This last property is the most important effect of working with centered distances.
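A similar sketch, again assuming the double-centering (V-statistic) form of <math>\operatorname{dCov}^2_n</math> and using illustrative helper names, demonstrates the non-negativity, the behaviour under shifts, scalings and orthonormal rotations, and the decay towards zero for independent samples.

<syntaxhighlight lang="python">
import numpy as np

def _dc(Z):
    """Double-centered Euclidean distance matrix of the rows of Z."""
    D = np.sqrt(((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1))
    return D - D.mean(axis=0) - D.mean(axis=1, keepdims=True) + D.mean()

def dcov2_n(X, Y):
    """Squared sample distance covariance (V-statistic form)."""
    return (_dc(X) * _dc(Y)).mean()

rng = np.random.default_rng(1)
X, Y = rng.normal(size=(300, 2)), rng.normal(size=(300, 2))

# (i) non-negativity of the squared sample statistic
print(dcov2_n(X, Y) >= 0)                                 # True

# (ii) dCov^2(a1 + b1 C1 X, a2 + b2 C2 Y) = |b1 b2| dCov^2(X, Y)
C1, _ = np.linalg.qr(rng.normal(size=(2, 2)))
C2, _ = np.linalg.qr(rng.normal(size=(2, 2)))
lhs = dcov2_n(1.0 + 3.0 * X @ C1.T, -2.0 + 0.5 * Y @ C2.T)
print(np.isclose(lhs, abs(3.0 * 0.5) * dcov2_n(X, Y)))    # True

# (iv) for independent samples the statistic is close to 0
# and shrinks as the sample size grows
print(dcov2_n(X, Y))
</syntaxhighlight>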
 
 
===Distance variance===
{{Ordered list |list_style_type=lower-roman
| <math>\operatorname{dVar}_n(X) = 0</math> if and only if every sample observation is identical.
| <math>\operatorname{dVar}(X) = 0</math> if and only if <math>X = \operatorname{E}[X]</math> almost surely.
| <math>\operatorname{dVar}(A + b\,\mathbf{C}\,X) = |b|\operatorname{dVar}(X)</math> for all constant vectors {{mvar|A}}, scalars {{mvar|b}}, and orthonormal matrices <math>\mathbf{C}</math>.
| If {{mvar|X}} and {{mvar|Y}} are independent then <math>\operatorname{dVar}(X + Y) \leq \operatorname{dVar}(X) + \operatorname{dVar}(Y)</math>.
}}
Equality holds in (iv) if and only if one of the random variables {{mvar|X}} or {{mvar|Y}} is a constant.
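A short sketch of the distance-variance properties listed above, under the same assumptions (double-centering form of the sample statistic; the helper name <code>dvar_n</code> is illustrative only):

<syntaxhighlight lang="python">
import numpy as np

def _dc(Z):
    """Double-centered Euclidean distance matrix of the rows of Z."""
    D = np.sqrt(((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1))
    return D - D.mean(axis=0) - D.mean(axis=1, keepdims=True) + D.mean()

def dvar_n(X):
    """Sample distance standard deviation, the square root of dCov_n^2(X, X)."""
    return np.sqrt((_dc(X) ** 2).mean())

rng = np.random.default_rng(2)
X = rng.normal(size=(250, 3))

# (i) identical sample observations give dVar_n(X) = 0
print(dvar_n(np.tile([1.0, 2.0, 3.0], (250, 1))))                # 0.0

# (iii) the sample statistic inherits dVar(A + b C X) = |b| dVar(X)
C, _ = np.linalg.qr(rng.normal(size=(3, 3)))                     # orthonormal C
print(np.isclose(dvar_n(5.0 - 2.0 * X @ C.T), 2.0 * dvar_n(X)))  # True

# (iv) sub-additivity for independent X and Y, reflected by the sample statistic
Y = rng.normal(size=(250, 3))
print(dvar_n(X + Y) <= dvar_n(X) + dvar_n(Y))                    # True here
</syntaxhighlight>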
 
==Generalization==