In [[linear algebra]], a branch of [[mathematics]], a '''compound matrix''' is a matrix whose entries are all [[minor (linear algebra)|minors]], of a given size, of another matrix.
==Definition==
Let {{math|''A''}} be an {{math|''m'' × ''n''}} matrix with real or complex entries.{{efn|The definition, and the purely algebraic part of the theory, of compound matrices requires only that the matrix have entries in a [[commutative ring]]. In this case, the matrix corresponds to a homomorphism of finitely generated free modules.}} If {{math|''I''}} is a subset of {{math|{1, ..., ''m''<nowiki>}</nowiki>}} and {{math|''J''}} is a subset of {{math|{1, ..., ''n''<nowiki>}</nowiki>}}, then the '''{{math|(''I'', ''J'')}}-submatrix of {{math|''A''}}''', written {{math|''A''<sub>''I'', ''J''</sub>}}, is the submatrix formed from {{math|''A''}} by retaining only those rows indexed by {{math|''I''}} and those columns indexed by {{math|''J''}}. If {{math|''I''}} and {{math|''J''}} have the same number of elements, then {{math|det ''A''<sub>''I'', ''J''</sub>}} is the '''{{math|(''I'', ''J'')}}-[[minor (linear algebra)|minor]]''' of {{math|''A''}}.
The '''''r''th compound matrix''' of {{math|''A''}}, denoted {{math|''C''<sub>''r''</sub>(''A'')}}, is defined as follows. If {{math|''r'' > min(''m'', ''n'')}}, then {{math|''C''<sub>''r''</sub>(''A'')}} is the unique {{math|0 × 0}} matrix. Otherwise, {{math|''C''<sub>''r''</sub>(''A'')}} has size <math display="inline">\binom{m}{r} \times \binom{n}{r}</math>. Its rows and columns are indexed by {{math|''r''}}-element subsets of {{math|{1, ..., ''m''<nowiki>}</nowiki>}} and {{math|{1, ..., ''n''<nowiki>}</nowiki>}}, respectively, in their lexicographic order. The entry corresponding to subsets {{math|''I''}} and {{math|''J''}} is the minor {{math|det ''A''<sub>''I'', ''J''</sub>}}.
In some applications of compound matrices, the precise ordering of the rows and columns is unimportant. For this reason, some authors do not specify how the rows and columns are to be ordered.<ref>Kung, Rota, and Yan, p. 305.</ref>
For example, consider the matrix
:<math>A = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \end{pmatrix}.</math>
The rows are indexed by {{math|{1, 2, 3<nowiki>}</nowiki>}} and the columns by {{math|{1, 2, 3, 4<nowiki>}</nowiki>}}. Therefore the rows of {{math|''C''<sub>2</sub>(''A'')}} are indexed by the sets
:<math>\{1, 2\} < \{1, 3\} < \{2, 3\}</math>
and the columns are indexed by
:<math>\{1, 2\} < \{1, 3\} < \{1, 4\} < \{2, 3\} < \{2, 4\} < \{3, 4\}.</math>
Using absolute value bars to denote determinants, the second compound matrix is
:<math>\begin{align}
C_2(A)
&= \begin{pmatrix}
\left|\begin{smallmatrix} 1 & 2 \\ 5 & 6 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 1 & 3 \\ 5 & 7 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 1 & 4 \\ 5 & 8 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 2 & 3 \\ 6 & 7 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 2 & 4 \\ 6 & 8 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 3 & 4 \\ 7 & 8 \end{smallmatrix}\right| \\
\left|\begin{smallmatrix} 1 & 2 \\ 9 & 10 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 1 & 3 \\ 9 & 11 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 1 & 4 \\ 9 & 12 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 2 & 3 \\ 10 & 11 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 2 & 4 \\ 10 & 12 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 3 & 4 \\ 11 & 12 \end{smallmatrix}\right| \\
\left|\begin{smallmatrix} 5 & 6 \\ 9 & 10 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 5 & 7 \\ 9 & 11 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 5 & 8 \\ 9 & 12 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 6 & 7 \\ 10 & 11 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 6 & 8 \\ 10 & 12 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 7 & 8 \\ 11 & 12 \end{smallmatrix}\right|
\end{pmatrix} \\
&= \begin{pmatrix}
-4 & -8 & -12 & -4 & -8 & -4 \\
-8 & -16 & -24 & -8 & -16 & -8 \\
-4 & -8 & -12 & -4 & -8 & -4
\end{pmatrix}.
\end{align}</math>
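The construction can be carried out mechanically from the definition. The following is an illustrative sketch (assuming [[NumPy]]; the helper name <code>compound</code> is ours, not a library function) that builds {{math|''C''<sub>''r''</sub>(''A'')}} by listing the {{math|''r''}}-element subsets in lexicographic order and taking the corresponding minors; applied to the matrix above it reproduces the second compound just displayed.
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations

def compound(A, r):
    """r-th compound of A: the (I, J) entry is the minor det A[I, J],
    with the r-element subsets I, J listed in lexicographic order."""
    m, n = A.shape
    row_sets = list(combinations(range(m), r))
    col_sets = list(combinations(range(n), r))
    return np.array([[np.linalg.det(A[np.ix_(I, J)]) for J in col_sets]
                     for I in row_sets])

A = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8],
              [9, 10, 11, 12]], dtype=float)
print(np.round(compound(A, 2)))   # reproduces the 3 x 6 matrix displayed above
</syntaxhighlight>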
==Properties==
Let {{math|''c''}} be a scalar, {{math|''A''}} be an {{math|''m'' × ''n''}} matrix, and {{math|''B''}} be an {{math|''n'' × ''p''}} matrix. If {{math|''k''}} is a positive integer, then {{math|''I''<sub>''k''</sub>}} denotes the {{math|''k'' × ''k''}} identity matrix. The transpose of a matrix {{math|''M''}} will be written {{math|''M''{{i sup|T}}}}, and the conjugate transpose by {{math|''M''{{i sup|*}}}}. Then:<ref>Horn and Johnson, p. 22.</ref>
* {{math|1=''C''<sub>0</sub>(''A'') = ''I''<sub>1</sub>}}, a {{math|1 × 1}} identity matrix.
* {{math|1=''C''<sub>1</sub>(''A'') = ''A''}}.
* {{math|1=''C''<sub>''r''</sub>(''cA'') = ''c''{{i sup|''r''}}''C''<sub>''r''</sub>(''A'')}}.
* If {{math|1=rk ''A'' = ''r''}}, then {{math|1=rk ''C''<sub>''r''</sub>(''A'') = 1}}.
* If {{math|1 ≤ ''r'' ≤ ''n''}}, then <math>C_r(I_n) = I_{\binom{n}{r}}</math>.
* If {{math|1 ≤ ''r'' ≤ min(''m'', ''n'')}}, then {{math|1=''C''<sub>''r''</sub>(''A''{{i sup|T}}) = ''C''<sub>''r''</sub>(''A''){{i sup|T}}}}.
* If {{math|1 ≤ ''r'' ≤ min(''m'', ''n'')}}, then {{math|1=''C''<sub>''r''</sub>(''A''{{i sup|*}}) = ''C''<sub>''r''</sub>(''A''){{i sup|*}}}}.
* {{math|1=''C''<sub>''r''</sub>(''AB'') = ''C''<sub>''r''</sub>(''A'')''C''<sub>''r''</sub>(''B'')}}.
* ([[Cauchy–Binet formula]]) If {{math|1=''p'' = ''m''}} and {{math|''m'' ≤ ''n''}}, then {{math|1=det(''AB'') = ''C''<sub>''m''</sub>(''A'')''C''<sub>''m''</sub>(''B'')}}, the product of a row vector and a column vector of length <math display="inline">\binom{n}{m}</math>.
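The multiplicativity rule {{math|1=''C''<sub>''r''</sub>(''AB'') = ''C''<sub>''r''</sub>(''A'')''C''<sub>''r''</sub>(''B'')}} and the Cauchy–Binet formula can be spot-checked numerically. The sketch below is illustrative only; it assumes NumPy and repeats the minor-based <code>compound</code> helper from the example above so that it runs on its own.
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations

def compound(A, r):
    # minor-based helper: (I, J) entry is det A[I, J], subsets in lexicographic order
    m, n = A.shape
    return np.array([[np.linalg.det(A[np.ix_(I, J)]) for J in combinations(range(n), r)]
                     for I in combinations(range(m), r)])

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, (3, 5)).astype(float)      # 3 x 5
B = rng.integers(-3, 4, (5, 4)).astype(float)      # 5 x 4
r = 2

# multiplicativity: C_r(AB) = C_r(A) C_r(B)
print(np.allclose(compound(A @ B, r), compound(A, r) @ compound(B, r)))       # True

# Cauchy-Binet: for A (3 x 5) and B' (5 x 3), det(AB') is the 1 x 10 row vector
# C_3(A) times the 10 x 1 column vector C_3(B')
Bp = rng.integers(-3, 4, (5, 3)).astype(float)
print(np.isclose(np.linalg.det(A @ Bp), (compound(A, 3) @ compound(Bp, 3)).item()))  # True
</syntaxhighlight>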
Assume in addition that {{math|''A''}} is a square matrix of size {{math|''n''}}. Then:<ref>Horn and Johnson, pp. 22, 93, 147, 233.</ref>
* {{math|1=''C''<sub>''n''</sub>(''A'') = det ''A''}}.
* If {{math|''A''}} has one of the following properties, then so does {{math|''C''<sub>''r''</sub>(''A'')}}:
** Upper triangular,
** Lower triangular,
** Diagonal,
** Orthogonal,
** Unitary,
** Symmetric,
** Hermitian,
** Skew-symmetric (when {{math|''r''}} is odd),
** Skew-Hermitian (when {{math|''r''}} is odd),
** Positive definite,
** Positive semi-definite,
** Normal.
* If {{math|''A''}} is invertible, then so is {{math|''C''<sub>''r''</sub>(''A'')}}, and {{math|1=''C''<sub>''r''</sub>(''A''{{i sup|−1}}) = ''C''<sub>''r''</sub>(''A''){{i sup|−1}}}}.
* (Sylvester–Franke theorem) If {{math|1 ≤ ''r'' ≤ ''n''}}, then <math>\det C_r(A) = (\det A)^{\binom{n-1}{r-1}}</math>.<ref name="Tornheim1952">{{cite journal|last1=Tornheim|first1=Leonard|title=The Sylvester-Franke Theorem|journal=The American Mathematical Monthly|volume=59|issue=6|year=1952|pages=389|issn=00029890|doi=10.2307/2306811}}</ref>
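Several of these statements are easy to verify numerically. The sketch below is illustrative only; it assumes NumPy and re-uses the minor-based <code>compound</code> helper defined earlier, checking the inverse rule and the Sylvester–Franke theorem for a random {{math|5 × 5}} matrix.
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations
from math import comb

def compound(A, r):
    # minor-based helper as in the earlier sketches
    m, n = A.shape
    return np.array([[np.linalg.det(A[np.ix_(I, J)]) for J in combinations(range(n), r)]
                     for I in combinations(range(m), r)])

rng = np.random.default_rng(1)
n, r = 5, 2
A = rng.standard_normal((n, n))

# C_r(A^{-1}) = C_r(A)^{-1}
print(np.allclose(compound(np.linalg.inv(A), r), np.linalg.inv(compound(A, r))))   # True

# Sylvester-Franke theorem: det C_r(A) = (det A)^binom(n-1, r-1)
print(np.isclose(np.linalg.det(compound(A, r)),
                 np.linalg.det(A) ** comb(n - 1, r - 1)))                           # True
</syntaxhighlight>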
==Relation to exterior powers==
{{see also|Exterior algebra}}
Give {{math|'''R'''<sup>''n''</sup>}} the standard coordinate basis {{math|'''e'''<sub>1</sub>, ..., '''e'''<sub>''n''</sub>}}. The {{math|''r''}}th exterior power of {{math|'''R'''<sup>''n''</sup>}} is the vector space
:<math>\wedge^r \mathbf{R}^n</math>
whose basis consists of the formal symbols
:<math>\mathbf{e}_{i_1} \wedge \dots \wedge \mathbf{e}_{i_r},</math>
where
:<math>i_1 < \dots < i_r.</math>
Suppose that {{math|''A''}} is an {{math|''m'' × ''n''}} matrix. Then {{math|''A''}} corresponds to a linear transformation
:<math>A \colon \mathbf{R}^n \to \mathbf{R}^m.</math>
Taking the {{math|''r''}}th exterior power of this linear transformation determines a linear transformation
:<math>\wedge^r A \colon \wedge^r \mathbf{R}^n \to \wedge^r \mathbf{R}^m.</math>
The matrix corresponding to this linear transformation (with respect to the above bases of the exterior powers) is {{math|''C''<sub>''r''</sub>(''A'')}}. Taking exterior powers is a [[functor]], which means that<ref>Joseph P.S. Kung, Gian-Carlo Rota, and [[Catherine Yan|Catherine H. Yan]], ''Combinatorics: the Rota way'', Cambridge University Press, 2009, p. 306. {{isbn|9780521883894}}</ref>
:<math>\wedge^r (AB) = (\wedge^r A)(\wedge^r B).</math>
This corresponds to the formula {{math|1=''C''<sub>''r''</sub>(''AB'') = ''C''<sub>''r''</sub>(''A'')''C''<sub>''r''</sub>(''B'')}}. It is closely related to, and is a strengthening of, the [[Cauchy–Binet formula]].
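This identification can be made concrete by brute force: expanding each wedge product of the images of the basis vectors coordinate by coordinate, with signs coming from sorting the indices, recovers exactly the minors of {{math|''A''}}. The sketch below is illustrative only (it assumes NumPy; the helper names are ours, not standard) and performs this expansion for a small example, comparing the result with the minor-based compound matrix.
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations, product

def sort_sign(seq):
    # sign of the permutation that sorts seq (entries assumed distinct)
    sign = 1
    for i in range(len(seq)):
        for j in range(i + 1, len(seq)):
            if seq[i] > seq[j]:
                sign = -sign
    return sign

def exterior_power_matrix(A, r):
    """Matrix of the r-th exterior power of A in the wedge bases e_I, I lexicographic.
    Column I holds the coordinates of (A e_{i_1}) ^ ... ^ (A e_{i_r})."""
    m, n = A.shape
    rows = list(combinations(range(m), r))
    cols = list(combinations(range(n), r))
    M = np.zeros((len(rows), len(cols)))
    for c, I in enumerate(cols):
        for J in product(range(m), repeat=r):          # expand over all index tuples
            if len(set(J)) < r:
                continue                               # repeated factor: wedge vanishes
            coeff = np.prod([A[J[k], I[k]] for k in range(r)])
            M[rows.index(tuple(sorted(J))), c] += sort_sign(J) * coeff
    return M

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
r = 2
C = np.array([[np.linalg.det(A[np.ix_(I, J)]) for J in combinations(range(3), r)]
              for I in combinations(range(2), r)])     # minor-based C_r(A)
print(np.allclose(exterior_power_matrix(A, r), C))     # True
</syntaxhighlight>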
==Relation to adjugate matrices==
{{see also|Adjugate matrix}}
Let {{math|''A''}} be an {{math|''n'' × ''n''}} matrix. Recall that its '''{{math|''r''}}th higher adjugate matrix''' {{math|adj<sub>''r''</sub>(''A'')}} is the <math display="inline">\binom{n}{r} \times \binom{n}{r}</math> matrix whose {{math|(''I'', ''J'')}} entry is
:<math>(-1)^{\sigma(I) + \sigma(J)} \det A_{J^c, I^c},</math>
where, for any set {{math|''K''}} of integers, {{math|''σ''(''K'')}} is the sum of the elements of {{math|''K''}}. The '''adjugate''' of {{math|''A''}} is its first higher adjugate and is denoted {{math|adj(''A'')}}. The generalized [[Laplace expansion]] formula implies
:<math>C_r(A)\operatorname{adj}_r(A) = \operatorname{adj}_r(A)C_r(A) = (\det A)I_{\binom{n}{r}}.</math>
Adjugates can also be expressed in terms of compounds. Let {{math|''S''}} denote the ''sign matrix'':
:<math>S = \operatorname{diag}(1, -1, 1, -1, \ldots, (-1)^{n-1}),</math>
and let {{math|''J''}} denote the ''[[exchange matrix]]'':
:<math>J = \begin{pmatrix} & & 1 \\ & \cdots & \\ 1 & & \end{pmatrix}.</math>
Then Jacobi's theorem states that the {{math|''r''}}th higher adjugate matrix is:<ref name="NambiarSreevalsan2001">{{cite journal|last1=Nambiar|first1=K.K.|last2=Sreevalsan|first2=S.|title=Compound matrices and three celebrated theorems|journal=Mathematical and Computer Modelling|volume=34|issue=3-4|year=2001|pages=251–255|issn=08957177|doi=10.1016/S0895-7177(01)00058-9}}</ref><ref name="Price1947">{{cite journal|last1=Price|first1=G. B.|title=Some Identities in the Theory of Determinants|journal=The American Mathematical Monthly|volume=54|issue=2|year=1947|pages=75|issn=00029890|doi=10.2307/2304856}}</ref>
:<math>\operatorname{adj}_r(A) = JC_{n-r}(SAS)^TJ.</math>
It follows immediately from Jacobi's theorem that
:<math>C_r(A)\, J(C_{n-r}(SAS))^TJ = \det(A)I_{\binom{n}{r}}.</math>
Taking adjugates and compounds does not commute. However, compounds of adjugates can be expressed using adjugates of compounds, and vice versa. From the identities
:<math>C_r(C_s(A))C_r(\operatorname{adj}_s(A)) = (\det A)^rI,</math>
:<math>C_r(C_s(A))\operatorname{adj}_r(C_s(A)) = (\det C_s(A))I,</math>
and the Sylvester–Franke theorem, we deduce
:<math>\operatorname{adj}_r(C_s(A)) = (\det A)^{\binom{n-1}{s-1}-r} C_r(\operatorname{adj}_s(A)).</math>
The same technique leads to an additional identity,
:<math>\operatorname{adj}(C_r(A)) = (\det A)^{\binom{n-1}{r-1}-r} C_r(\operatorname{adj}(A)).</math>
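The identities of this section can also be spot-checked numerically. The sketch below is illustrative only; it assumes NumPy, repeats the minor-based <code>compound</code> helper, and adds a hypothetical <code>higher_adjugate</code> implementing the definition above, then verifies the Laplace relation and Jacobi's theorem for a random {{math|4 × 4}} matrix.
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations
from math import comb

def compound(A, r):
    m, n = A.shape
    return np.array([[np.linalg.det(A[np.ix_(I, J)]) for J in combinations(range(n), r)]
                     for I in combinations(range(m), r)])

def higher_adjugate(A, r):
    # (I, J) entry: (-1)^(sigma(I)+sigma(J)) det A[J^c, I^c], sums taken with 1-based indices
    n = A.shape[0]
    subsets = list(combinations(range(n), r))
    comp = lambda K: tuple(i for i in range(n) if i not in K)
    sigma = lambda K: sum(i + 1 for i in K)
    return np.array([[(-1) ** (sigma(I) + sigma(J)) * np.linalg.det(A[np.ix_(comp(J), comp(I))])
                      for J in subsets] for I in subsets])

rng = np.random.default_rng(2)
n, r = 4, 2
A = rng.standard_normal((n, n))
N = comb(n, r)
S = np.diag([(-1) ** i for i in range(n)])   # sign matrix diag(1, -1, 1, ...)
J = np.fliplr(np.eye(N))                     # exchange matrix of size binom(n, r)

# Laplace relation: C_r(A) adj_r(A) = (det A) I
print(np.allclose(compound(A, r) @ higher_adjugate(A, r), np.linalg.det(A) * np.eye(N)))  # True
# Jacobi's theorem: adj_r(A) = J C_{n-r}(S A S)^T J
print(np.allclose(higher_adjugate(A, r), J @ compound(S @ A @ S, n - r).T @ J))           # True
</syntaxhighlight>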
== Applications ==
The computation of compound matrices appears in a wide array of problems.<ref>{{cite techreport|first=D.L.|last=Boutin|author2=R.F. Gleeson|author3=R.M. Williams|title=Wedge Theory / Compound Matrices: Properties and Applications.|institution=Office of Naval Research|url=http://handle.dtic.mil/100.2/ADA320264|year=1996|number=NAWCADPAX–96-220-TR}}</ref>
Compound and adjugate matrices appear when computing determinants of linear combinations of matrices. It is elementary to check that, if {{math|''A''}} and {{math|''B''}} are {{math|''n'' × ''n''}} matrices, then
:<math>\det(sA + tB) = C_n\left(\begin{bmatrix} sA & I_n \end{bmatrix}\right)C_n\left(\begin{bmatrix} I_n \\ tB \end{bmatrix}\right)</math>
and
:<math>\det(sA + tB) = \sum_{r=0}^n s^{n-r} t^r \operatorname{tr}(\operatorname{adj}_r(A)C_r(B)).</math>
This has the immediate consequence
:<math>\det(I + A) = \sum_{r=0}^n \operatorname{tr} \operatorname{adj}_r(A) = \sum_{r=0}^n \operatorname{tr} C_r(A).</math>
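These determinant expansions can likewise be checked numerically. The sketch below is illustrative only; it assumes NumPy and repeats the <code>compound</code> and <code>higher_adjugate</code> helpers from the earlier sketches, with the convention that an empty minor equals 1 so that the {{math|1=''r'' = 0}} and {{math|1=''r'' = ''n''}} terms are handled uniformly.
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations

det = lambda M: 1.0 if M.shape == (0, 0) else np.linalg.det(M)   # empty minor = 1

def compound(A, r):
    m, n = A.shape
    return np.array([[det(A[np.ix_(I, J)]) for J in combinations(range(n), r)]
                     for I in combinations(range(m), r)])

def higher_adjugate(A, r):
    n = A.shape[0]
    subsets = list(combinations(range(n), r))
    comp = lambda K: tuple(i for i in range(n) if i not in K)
    sigma = lambda K: sum(i + 1 for i in K)
    return np.array([[(-1) ** (sigma(I) + sigma(J)) * det(A[np.ix_(comp(J), comp(I))])
                      for J in subsets] for I in subsets])

rng = np.random.default_rng(3)
n = 4
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))
s, t = 0.7, -1.3

lhs = np.linalg.det(s * A + t * B)
rhs = sum(s ** (n - r) * t ** r * np.trace(higher_adjugate(A, r) @ compound(B, r))
          for r in range(n + 1))
print(np.isclose(lhs, rhs))                                               # True

print(np.isclose(np.linalg.det(np.eye(n) + A),
                 sum(np.trace(compound(A, r)) for r in range(n + 1))))    # True
</syntaxhighlight>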
== Numerical computation ==
==Notes==
{{notelist}}
{{reflist}}
==References==
* Gantmacher, F. R. and Krein, M. G., ''Oscillation Matrices and Kernels and Small Vibrations of Mechanical Systems'', Revised Edition. American Mathematical Society, 2002. {{isbn|978-0-8218-3171-7}}
[[Category:Matrices]]