{{Short description|Matrix whose entries are all minors of another matrix}}
In [[linear algebra]], a branch of [[mathematics]], a ('''multiplicative''') '''compound matrix''' is a [[matrix (mathematics)|matrix]] whose entries are all [[minor (linear algebra)|minors]], of a given size, of another matrix.<ref>DeAlba, Luz M. ''Determinants and Eigenvalues'' in Hogben, Leslie (ed) ''Handbook of Linear Algebra'', 2nd edition, CRC Press, 2013, {{isbn|978-1-4665-0729-6}}, p. 4-4</ref><ref>Gantmacher, F. R., ''The Theory of Matrices'', volume I, Chelsea Publishing Company, 1959, {{isbn|978-0-8218-1376-8}}p. 20</ref><ref>Horn, Roger A. and Johnson, Charles R., ''Matrix Analysis'', 2nd edition, Cambridge University Press, 2013, {{isbn|978-0-521-54823-6}}, p. 21</ref><ref name=":0">{{Cite journal|last=Muldowney|first=James S.|date=1990|title=Compound matrices and ordinary differential equations|url=http://projecteuclid.org/euclid.rmjm/1181073047|journal=Rocky Mountain Journal of Mathematics|language=en|volume=20|issue=4|pages=857–872|doi=10.1216/rmjm/1181073047|issn=0035-7596|via=|doi-access=free}}</ref> Compound matrices are closely related to [[exterior algebra]]s,<ref>{{cite tech report|first=Boutin|last=D.L.|author2=R.F. Gleeson|author3=R.M. Williams|title=Wedge Theory / Compound Matrices: Properties and Applications.|institution=Office of Naval Research|url=https://apps.dtic.mil/sti/pdfs/ADA320264.pdf|archive-url=https://web.archive.org/web/20210116083905/https://apps.dtic.mil/sti/pdfs/ADA320264.pdf|url-status=live|archive-date=January 16, 2021|year=1996|number=NAWCADPAX–96-220-TR}}</ref> and their computation appears in a wide array of problems, such as in the analysis of nonlinear time-varying dynamical systems and generalizations of positive systems, cooperative systems and contracting systems.<ref name=":0" /><ref>{{Cite journal |last1=Bar-Shalom |first1=Eyal |last2=Dalin |first2=Omri |last3=Margaliot |first3=Michael |date=2023-03-15 |title=Compound matrices in systems and control theory: a tutorial |url=https://link.springer.com/10.1007/s00498-023-00351-8 |journal=Mathematics of Control, Signals, and Systems |volume=35 |issue=3 |pages=467–521 |language=en |doi=10.1007/s00498-023-00351-8 |arxiv=2204.00676 |bibcode=2023MCSS...35..467B |s2cid=247939832 |issn=0932-4194}}</ref>
 
== Definition ==
 
Let {{math|''A''}} be an {{math|''m''&thinsp;×&thinsp;''n''}} matrix with [[real number|real]] or [[complex number|complex]] entries.{{efn|The definition, and the purely algebraic part of the theory, of compound matrices requires only that the matrix have entries in a [[commutative ring]]. In this case, the matrix corresponds to a [[module homomorphism|homomorphism]] of [[finitely generated module|finitely generated]] [[free module]]s.}} If {{math|''I''}} is a [[subset]] of size {{math|''r''}} of {{math|{1, ..., ''m''<nowiki>}</nowiki>}} and {{math|''J''}} is a subset of size {{math|''s''}} of {{math|{1, ..., ''n''<nowiki>}</nowiki>}}, then the '''{{math|(''I'', ''J''{{hairsp}})}}-submatrix of {{math|''A''}}''', written {{math|''A''<sub>''I'', ''J''</sub>}}{{hairsp}}, is the submatrix formed from {{math|''A''}} by retaining only those rows indexed by {{math|''I''}} and those columns indexed by {{math|''J''}}. If {{math|1=''r'' = ''s''}}, then {{math|det&thinsp;''A''<sub>''I'', ''J''</sub>}} is the '''{{math|(''I'', ''J''{{hairsp}})}}-[[minor (linear algebra)|minor]]''' of {{math|''A''}}.
 
The '''''r''{{hairsp}}th compound matrix''' of {{math|''A''}}, denoted {{math|''C''<sub>''r''&thinsp;</sub>(''A'')}}, is defined as follows. If {{math|''r'' > min(''m'', ''n'')}}, then {{math|''C''<sub>''r''&thinsp;</sub>(''A'')}} is the unique {{math|0&thinsp;×&thinsp;0}} matrix. Otherwise, {{math|''C''<sub>''r''&thinsp;</sub>(''A'')}} has size <math display="inline">\binom{m}{r} \!\times\! \binom{n}{r}</math>. Its rows and columns are indexed by {{math|''r''}}-element subsets of {{math|{1, ..., ''m''<nowiki>}</nowiki>}} and {{math|{1, ..., ''n''<nowiki>}</nowiki>}}, respectively, in their [[lexicographic order]]. The entry corresponding to subsets {{math|''I''}} and {{math|''J''}} is the minor {{math|det&thinsp;''A''<sub>''I'', ''J''</sub>}}.
 
In some applications of compound matrices, the precise ordering of the rows and columns is unimportant. For this reason, some authors do not specify how the rows and columns are to be ordered.<ref>Kung, Rota, and Yan, p. 305.</ref>
 
For example, consider the matrix
:<math>A = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \end{pmatrix}.</math>
The rows are indexed by {{math|{1, 2, 3<nowiki>}</nowiki>}} and the columns by {{math|{1, 2, 3, 4<nowiki>}</nowiki>}}. Therefore, the rows of {{math|''C''<sub>2{{hairsp}}</sub>(''A'')}} are indexed by the sets
:<math>\{1, 2\} < \{1, 3\} < \{2, 3\}</math>
and the columns are indexed by
:<math>\{1, 2\} < \{1, 3\} < \{1, 4\} < \{2, 3\} < \{2, 4\} < \{3, 4\}.</math>
Using absolute value bars to denote [[determinant]]s, the second compound matrix is
:<math>\begin{align}
C_2(A)
&= \begin{pmatrix}
\left|\begin{smallmatrix} 1 & 2 \\ 5 & 6 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 1 & 3 \\ 5 & 7 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 1 & 4 \\ 5 & 8 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 2 & 3 \\ 6 & 7 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 2 & 4 \\ 6 & 8 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 3 & 4 \\ 7 & 8 \end{smallmatrix}\right| \\
\left|\begin{smallmatrix} 1 & 2 \\ 9 & 10 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 1 & 3 \\ 9 & 11 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 1 & 4 \\ 9 & 12 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 2 & 3 \\ 10 & 11 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 2 & 4 \\ 10 & 12 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 3 & 4 \\ 11 & 12 \end{smallmatrix}\right| \\
\left|\begin{smallmatrix} 5 & 6 \\ 9 & 10 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 5 & 7 \\ 9 & 11 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 5 & 8 \\ 9 & 12 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 6 & 7 \\ 10 & 11 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 6 & 8 \\ 10 & 12 \end{smallmatrix}\right| &
\left|\begin{smallmatrix} 7 & 8 \\ 11 & 12 \end{smallmatrix}\right|
\end{pmatrix} \\
&= \begin{pmatrix}
-4 & -8 & -12 & -4 & -8 & -4 \\
-8 & -16 & -24 & -8 & -16 & -8 \\
-4 & -8 & -12 & -4 & -8 & -4
\end{pmatrix}.
\end{align}</math>
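The entries of a compound matrix can be computed directly from the definition. The following Python sketch is a deliberately naive illustration (the helper name <code>compound</code> is chosen here for exposition and is not a standard library routine); it reproduces the second compound of the matrix above.
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations

def compound(A, r):
    """Naive r-th compound matrix: all r x r minors of A in lexicographic order."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    row_sets = list(combinations(range(m), r))
    col_sets = list(combinations(range(n), r))
    C = np.empty((len(row_sets), len(col_sets)))
    for i, I in enumerate(row_sets):
        for j, J in enumerate(col_sets):
            C[i, j] = np.linalg.det(A[np.ix_(I, J)])
    return C

A = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8],
              [9, 10, 11, 12]])
print(compound(A, 2))  # 3 x 6 matrix; first row is -4, -8, -12, -4, -8, -4
</syntaxhighlight>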
 
==Properties==
 
Let {{math|''c''}} be a scalar, {{math|''A''}} be an {{math|''m''&thinsp;×&thinsp;''n''}} matrix, and {{math|''B''}} be an {{math|''n''&thinsp;×&thinsp;''p''}} matrix. For {{math|''k''}} a positive [[integer]], let {{math|''I''<sub>''k''</sub>}} denote the {{math|''k''&thinsp;×&thinsp;''k''}} [[identity matrix]]. The [[transpose]] of a matrix {{math|''M''}} will be written {{math|''M''{{i sup|T}}}}, and the [[conjugate transpose]] by {{math|''M''{{i sup|*}}}}. Then:<ref>Horn and Johnson, p. 22.</ref>
 
* {{math|1=''C''<sub>0{{hairsp}}</sub>(''A'') = ''I''<sub>1</sub>}}, a {{math|1&thinsp;×&thinsp;1}} identity matrix.
* {{math|1=''C''<sub>1</sub>(''A'') = ''A''}}.
* {{math|1=''C''<sub>''r''&thinsp;</sub>(''cA'') = ''c''{{i sup|''r''}}''C''<sub>''r''&thinsp;</sub>(''A'')}}.
* If {{math|1=rk ''A'' = ''r''}}, then {{math|1=rk C<sub>''r''&thinsp;</sub>(''A'') = 1}}.
* If {{math|1&thinsp;≤ ''r'' ≤ ''n''}}, then <math>C_r(I_n) = I_{\binom{n}{r}}</math>.
* If {{math|1&thinsp;≤ ''r'' ≤ min(''m'', ''n'')}}, then {{math|1=''C''<sub>''r''&thinsp;</sub>(''A''{{i sup|T}}) = ''C''<sub>''r''&thinsp;</sub>(''A''){{i sup|T}}}}.
* If {{math|1&thinsp;≤ ''r'' ≤ min(''m'', ''n'')}}, then {{math|1=''C''<sub>''r''&thinsp;</sub>(''A''<sup>*</sup>) = ''C''<sub>''r''&thinsp;</sub>(''A'')<sup>*</sup>}}.
* {{math|1=''C''<sub>''r''&thinsp;</sub>(''AB'') = ''C''<sub>''r''&thinsp;</sub>(''A''){{hairsp}}''C''<sub>''r''&thinsp;</sub>(''B'')}}, which is closely related to [[Cauchy–Binet formula]].
 
Assume in addition that {{math|''A''}} is a [[square matrix]] of size {{math|''n''}}. Then:<ref>Horn and Johnson, pp. 22, 93, 147, 233.</ref>
 
* {{math|1=''C''<sub>''n''{{hairsp}}</sub>(''A'') = det&thinsp;''A''}}.
* If {{math|''A''}} has one of the following properties, then so does {{math|''C''<sub>''r''&thinsp;</sub>(''A'')}}:
** [[Upper triangular]],
** [[Lower triangular]],
** [[Diagonal matrix|Diagonal]],
** [[Orthogonal matrix|Orthogonal]],
** [[Unitary matrix|Unitary]],
** [[Symmetric matrix|Symmetric]],
** [[Hermitian matrix|Hermitian]],
** [[Skew-symmetric matrix|Skew-symmetric]] (when r is odd),
** [[Skew-hermitian]] (when r is odd),
** [[Positive definite matrix|Positive definite]],
** [[Positive semi-definite matrix|Positive semi-definite]],
** [[Normal matrix|Normal]].
* If {{math|''A''}} is [[invertible matrix|invertible]], then so is {{math|''C''<sub>''r''&thinsp;</sub>(''A'')}}, and {{math|1=''C''<sub>''r''&thinsp;</sub>(''A''{{i sup|&minus;1}}) = ''C''<sub>''r''&thinsp;</sub>(''A''){{i sup|−1}}}}.
* (Sylvester–Franke theorem) If {{math|1&thinsp;≤ ''r'' ≤ ''n''}}, then <math>\det C_r(A) = (\det A)^{\binom{n-1}{r-1}}</math>.<ref name="Tornheim1952">{{cite journal|last1=Tornheim|first1=Leonard|title=The Sylvester–Franke Theorem|journal=The American Mathematical Monthly|volume=59|issue=6|year=1952|pages=389–391|issn=0002-9890|doi=10.2307/2306811|jstor=2306811}}</ref><ref>[[Harley Flanders]] (1953) "A Note on the Sylvester-Franke Theorem", [[American Mathematical Monthly]] 60: 543–5, {{mr|id=0057835}}</ref>
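Both the multiplicativity property and the Sylvester–Franke theorem are easy to spot-check numerically. The sketch below reuses the illustrative <code>compound</code> helper from the example above; it is a sanity check on random matrices, not an efficient or authoritative implementation.
<syntaxhighlight lang="python">
# Spot-check of C_r(AB) = C_r(A) C_r(B) and of the Sylvester-Franke theorem,
# reusing the illustrative compound() helper defined in the example above.
import numpy as np
from math import comb

rng = np.random.default_rng(0)
n, r = 5, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

print(np.allclose(compound(A @ B, r), compound(A, r) @ compound(B, r)))  # True
print(np.isclose(np.linalg.det(compound(A, r)),
                 np.linalg.det(A) ** comb(n - 1, r - 1)))                # True
</syntaxhighlight>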
 
==Relation to exterior powers==
{{see also|Exterior algebra}}
 
Give {{math|'''R'''<sup>''n''</sup>}} the [[canonical basis|standard coordinate basis]] {{math|'''e'''<sub>1</sub>, ..., '''e'''<sub>''n''</sub>}}. The {{math|''r''}}{{hairsp}}th exterior power of {{math|'''R'''<sup>''n''</sup>}} is the [[vector space]]
:<math>\wedge^r \mathbf{R}^n</math>
whose [[basis (linear algebra)|basis]] consists of the formal symbols
:<math>\mathbf{e}_{i_1} \wedge \dots \wedge \mathbf{e}_{i_r},</math>
where
:<math>i_1 < \dots < i_r.</math>
 
Suppose that {{math|''A''}} is an {{math|''m''&thinsp;×&thinsp;''n''}} matrix. Then {{math|''A''}} corresponds to a [[linear transformation]]
:<math>A \colon \mathbf{R}^n \to \mathbf{R}^m.</math>
Taking the {{math|''r''}}{{hairsp}}th exterior power of this linear transformation determines a linear transformation
:<math>\wedge^r A \colon \wedge^r \mathbf{R}^n \to \wedge^r \mathbf{R}^m.</math>
The matrix corresponding to this linear transformation (with respect to the above bases of the exterior powers) is {{math|''C''<sub>''r''&thinsp;</sub>(''A'')}}. Taking exterior powers is a [[functor]], which means that<ref>Joseph P.S. Kung, Gian-Carlo Rota, and [[Catherine Yan|Catherine H. Yan]], ''[[Combinatorics: The Rota Way]]'', Cambridge University Press, 2009, p. 306. {{isbn|9780521883894}}</ref>
:<math>\wedge^r (AB) = (\wedge^r A)(\wedge^r B).</math>
This corresponds to the formula {{math|1=''C''<sub>''r''&thinsp;</sub>(''AB'') = ''C''<sub>''r''&thinsp;</sub>(''A'')''C''<sub>''r''&thinsp;</sub>(''B'')}}. It is closely related to, and is a strengthening of, the [[Cauchy–Binet formula]].
 
==Relation to adjugate matrices==
{{see also|Adjugate matrix}}
 
Let {{math|''A''}} be an {{math|''n''&thinsp;×&thinsp;''n''}} matrix. Recall that its '''{{mvar|r}}{{hairsp}}th higher adjugate matrix''' {{math|adj<sub>''r''{{hairsp}}</sub>(''A'')}} is the <math display="inline">\binom{n}{r} \!\times\! \binom{n}{r}</math> matrix whose {{math|(''I'', ''J''{{hairsp}})}} entry is
:<math>(-1)^{\sigma(I) + \sigma(J)} \det A_{J^c, I^c},</math>
where, for any set {{math|''K''}} of integers, {{math|''σ''(''K'')}} is the sum of the elements of {{math|''K''}}. The '''adjugate''' of {{math|''A''}} is its 1st higher adjugate and is denoted {{math|adj(''A'')}}. The generalized [[Laplace expansion]] formula implies
:<math>C_r(A)\operatorname{adj}_r(A) = \operatorname{adj}_r(A)C_r(A) = (\det A)I_{\binom{n}{r}}.</math>
 
If {{math|''A''}} is invertible, then
:<math>\operatorname{adj}_r(A^{-1}) = (\det A)^{-1}C_r(A).</math>
A concrete consequence of this is '''Jacobi's formula''' for the minors of an [[inverse matrix]]:
:<math>\det(A^{-1})_{J^c, I^c} = (-1)^{\sigma(I) + \sigma(J)} \frac{\det A_{I,J}}{\det A}.</math>
 
Adjugates can also be expressed in terms of compounds. Let {{math|''S''}} denote the ''sign matrix'':
:<math>S = \operatorname{diag}(1, -1, 1, -1, \ldots, (-1)^{n-1}),</math>
and let {{math|''J''}} denote the ''[[exchange matrix]]'':
:<math>J = \begin{pmatrix} & & 1 \\ & \cdots & \\ 1 & & \end{pmatrix}.</math>
Then '''Jacobi's theorem''' states that the {{math|''r''}}{{hairsp}}th higher adjugate matrix is:<ref name="NambiarSreevalsan2001">{{cite journal|last1=Nambiar|first1=K.K.|last2=Sreevalsan|first2=S.|title=Compound matrices and three celebrated theorems|journal=Mathematical and Computer Modelling|volume=34|issue=3–4|year=2001|pages=251–255|issn=0895-7177|doi=10.1016/S0895-7177(01)00058-9|doi-access=free}}</ref><ref name="Price1947">{{cite journal|last1=Price|first1=G. B.|authorlink=G. B. Price|title=Some Identities in the Theory of Determinants|journal=The American Mathematical Monthly|volume=54|issue=2|year=1947|pages=75–90|issn=0002-9890|doi=10.2307/2304856|jstor=2304856}}</ref>
:<math>\operatorname{adj}_r(A) = JC_{n-r}(SAS)^TJ.</math>
 
It follows immediately from Jacobi's theorem that
:<math>C_r(A)\, J(C_{n-r}(SAS))^TJ = (\det A)I_{\binom{n}{r}}.</math>
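Jacobi's theorem and the identity above can likewise be verified numerically. In the sketch below, <code>higher_adjugate</code> is an illustrative helper implementing the signed-complementary-minor definition of {{math|adj<sub>''r''{{hairsp}}</sub>(''A'')}} given earlier (it is not a library function), and <code>compound</code> is the helper from the example above.
<syntaxhighlight lang="python">
# Check of Jacobi's theorem  adj_r(A) = J C_{n-r}(SAS)^T J  and of
# C_r(A) adj_r(A) = det(A) I, reusing the illustrative compound() helper.
import numpy as np
from itertools import combinations

def higher_adjugate(A, r):
    """Naive r-th higher adjugate from the signed complementary minors."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    subsets = list(combinations(range(n), r))
    adj = np.empty((len(subsets), len(subsets)))
    for i, I in enumerate(subsets):
        for j, J in enumerate(subsets):
            Ic = [k for k in range(n) if k not in I]
            Jc = [k for k in range(n) if k not in J]
            sign = (-1) ** (sum(I) + sum(J))  # 0-based index sums have the same parity
            adj[i, j] = sign * np.linalg.det(A[np.ix_(Jc, Ic)])
    return adj

rng = np.random.default_rng(1)
n, r = 4, 2
A = rng.standard_normal((n, n))
S = np.diag([(-1) ** k for k in range(n)])  # sign matrix diag(1, -1, 1, ...)
N = len(list(combinations(range(n), r)))
J = np.fliplr(np.eye(N))                    # exchange (reversal) matrix

print(np.allclose(higher_adjugate(A, r),
                  J @ compound(S @ A @ S, n - r).T @ J))          # True
print(np.allclose(compound(A, r) @ higher_adjugate(A, r),
                  np.linalg.det(A) * np.eye(N)))                  # True
</syntaxhighlight>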
 
Taking adjugates and compounds does not commute. However, compounds of adjugates can be expressed using adjugates of compounds, and vice versa. From the identities
:<math>C_r(C_s(A))C_r(\operatorname{adj}_s(A)) = (\det A)^rI,</math>
:<math>C_r(C_s(A))\operatorname{adj}_r(C_s(A)) = (\det C_s(A))I,</math>
and the Sylvester–Franke theorem, we deduce
:<math>\operatorname{adj}_r(C_s(A)) = (\det A)^{\binom{n-1}{s-1}-r} C_r(\operatorname{adj}_s(A)).</math>
The same technique leads to an additional identity,
:<math>\operatorname{adj}(C_r(A)) = (\det A)^{\binom{n-1}{r-1}-r} C_r(\operatorname{adj}(A)).</math>
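For an invertible matrix this last identity can also be spot-checked numerically, using {{math|1=adj(''M'') = det(''M''){{hairsp}}''M''{{i sup|−1}}}} and the illustrative <code>compound</code> helper from the example above; this is only a sanity check, not an implementation.
<syntaxhighlight lang="python">
# Check of adj(C_r(A)) = det(A)^(C(n-1, r-1) - r) C_r(adj(A)) for invertible A,
# using adj(M) = det(M) M^{-1} and the illustrative compound() helper above.
import numpy as np
from math import comb

rng = np.random.default_rng(2)
n, r = 4, 2
A = rng.standard_normal((n, n))

adjA = np.linalg.det(A) * np.linalg.inv(A)          # ordinary adjugate of A
M = compound(A, r)
adj_of_compound = np.linalg.det(M) * np.linalg.inv(M)

rhs = np.linalg.det(A) ** (comb(n - 1, r - 1) - r) * compound(adjA, r)
print(np.allclose(adj_of_compound, rhs))            # True
</syntaxhighlight>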
 
Compound and adjugate matrices appear when computing determinants of [[linear combination]]s of matrices. It is elementary to check that if {{math|''A''}} and {{math|''B''}} are {{math|''n''&thinsp;×&thinsp;''n''}} matrices then
:<math>\det(sA + tB) = C_n\!\left(\begin{bmatrix} sA & I_n \end{bmatrix}\right)C_n\!\left(\begin{bmatrix} I_n \\ tB \end{bmatrix}\right).</math>
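This is the [[Cauchy–Binet formula]] applied to the factorization <math>sA + tB = \begin{bmatrix} sA & I_n \end{bmatrix}\begin{bmatrix} I_n \\ tB \end{bmatrix}</math>. A short numerical check, again reusing the illustrative <code>compound</code> helper from the example above, is the following sketch.
<syntaxhighlight lang="python">
# Spot-check of det(sA + tB) = C_n([sA  I_n]) C_n([[I_n], [tB]]),
# reusing the illustrative compound() helper defined earlier.
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
s, t = 0.7, -1.3

left = np.hstack([s * A, np.eye(n)])    # n x 2n block matrix [sA  I_n]
right = np.vstack([np.eye(n), t * B])   # 2n x n block matrix [[I_n], [tB]]
value = (compound(left, n) @ compound(right, n)).item()  # 1 x 1 product

print(np.isclose(value, np.linalg.det(s * A + t * B)))   # True
</syntaxhighlight>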
 
It is also true that:<ref>{{Cite journal|last1=Prells|first1=Uwe|last2=Friswell|first2=Michael I.|last3=Garvey|first3=Seamus D.|date=2003-02-08|title=Use of geometric algebra: compound matrices and the determinant of the sum of two matrices|url=http://rspa.royalsocietypublishing.org/content/459/2030/273|journal=Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences|language=en|volume=459|issue=2030|pages=273–285|doi=10.1098/rspa.2002.1040|bibcode=2003RSPSA.459..273P |s2cid=73593788 |issn=1364-5021|url-access=subscription}}</ref><ref>Horn and Johnson, p. 29</ref>
:<math>\det(sA + tB) = \sum_{r=0}^n s^r t^{n-r} \operatorname{tr}(\operatorname{adj}_r(A)C_r(B)).</math>
This has the immediate consequence
 
:<math>\det(I + A) = \sum_{r=0}^n \operatorname{tr} \operatorname{adj}_r(A) = \sum_{r=0}^n \operatorname{tr} C_r(A).</math>
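This trace expansion of <math>\det(I+A)</math> can be evaluated directly with the illustrative <code>compound</code> helper from the example above (the <math>r=0</math> term contributes 1); the sketch below is a numerical sanity check only.
<syntaxhighlight lang="python">
# Spot-check of det(I + A) = sum_r tr C_r(A), with the r = 0 term equal to 1,
# reusing the illustrative compound() helper defined earlier.
import numpy as np

rng = np.random.default_rng(4)
n = 4
A = rng.standard_normal((n, n))

total = 1.0 + sum(np.trace(compound(A, r)) for r in range(1, n + 1))
print(np.isclose(total, np.linalg.det(np.eye(n) + A)))   # True
</syntaxhighlight>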
 
== Numerical computation ==
In general, the computation of compound matrices is inefficient due to its high complexity. Nonetheless, some efficient algorithms are available for real matrices with special structure.<ref>{{Cite journal|last1=Kravvaritis|first1=Christos|last2=Mitrouli|first2=Marilena|date=2009-02-01|title=Compound matrices: properties, numerical issues and analytical computations|url=http://users.uoa.gr/~mmitroul/mmitroulweb/numalg09.pdf|journal=Numerical Algorithms|language=en|volume=50|issue=2|pages=155|doi=10.1007/s11075-008-9222-7|s2cid=16067358 |issn=1017-1398}}</ref>
 
==Notes==
{{notelist}}
 
== Citations ==
{{reflist}}
 
==References==
* Gantmacher, F. R. and Krein, M. G., ''Oscillation Matrices and Kernels and Small Vibrations of Mechanical Systems'', Revised Edition. American Mathematical Society, 2002. {{isbn|978-0-8218-3171-7}}
 
[[Category:Matrices (mathematics)]]