In [[linear algebra]], a branch of [[mathematics]], a ('''multiplicative''') '''compound matrix''' is a [[matrix (mathematics)|matrix]] whose entries are all [[minor (linear algebra)|minors]], of a given size, of another matrix.<ref>Horn, Roger A. and Johnson, Charles R., ''Matrix Analysis'', 2nd edition, Cambridge University Press, 2013, {{isbn|978-0-521-54823-6}}, p. 21</ref><ref name=":0">{{Cite journal|last=Muldowney|first=James S.|date=1990|title=Compound matrices and ordinary differential equations|url=http://projecteuclid.org/euclid.rmjm/1181073047|journal=Rocky Mountain Journal of Mathematics|language=en|volume=20|issue=4|pages=857–872|doi=10.1216/rmjm/1181073047|issn=0035-7596|via=|doi-access=free}}</ref> Compound matrices are closely related to [[exterior algebra]]s.
== Definition ==
Let {{math|''A''}} be an {{math|''m'' × ''n''}} matrix with real or complex entries. If {{math|''I''}} is a size-{{math|''r''}} subset of {{math|{1, ..., ''m''<nowiki>}</nowiki>}} and {{math|''J''}} is a size-{{math|''r''}} subset of {{math|{1, ..., ''n''<nowiki>}</nowiki>}}, then write {{math|''A''<sub>''I'', ''J''</sub>}} for the {{math|''r'' × ''r''}} submatrix of {{math|''A''}} formed by retaining only the rows indexed by {{math|''I''}} and the columns indexed by {{math|''J''}}.
The '''''r'' th compound matrix''' of {{math|''A''}}, denoted {{math|''C''<sub>''r'' </sub>(''A'')}}, is defined as follows. If {{math|''r'' > min(''m'', ''n'')}}, then {{math|''C''<sub>''r'' </sub>(''A'')}} is the unique {{math|0 × 0}} matrix. Otherwise, {{math|''C''<sub>''r'' </sub>(''A'')}} has size <math>\tbinom{m}{r} \times \tbinom{n}{r}</math>: its rows are indexed by the size-{{math|''r''}} subsets of {{math|{1, ..., ''m''<nowiki>}</nowiki>}} and its columns by the size-{{math|''r''}} subsets of {{math|{1, ..., ''n''<nowiki>}</nowiki>}}, each ordered lexicographically, and its entry in position {{math|(''I'', ''J'')}} is the minor {{math|det ''A''<sub>''I'', ''J''</sub>}}.
In some applications of compound matrices, the precise ordering of the rows and columns is unimportant. For this reason, some authors do not specify how the rows and columns are to be ordered.<ref>Kung, Rota, and Yan, p. 305.</ref>
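The definition translates directly into a short computation. The following Python sketch (the helper names <code>det</code> and <code>compound</code> are ours, not standard) builds {{math|''C''<sub>''r'' </sub>(''A'')}} entry by entry, with index subsets in lexicographic order:

```python
from itertools import combinations

def det(M):
    # Laplace expansion along the first row; exact for integer entries
    if not M:
        return 1  # determinant of the empty (0 x 0) matrix
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def compound(A, r):
    # r-th compound matrix: all r x r minors, subsets in lexicographic order
    m, n = len(A), len(A[0])
    return [[det([[A[i][j] for j in J] for i in I])
             for J in combinations(range(n), r)]
            for I in combinations(range(m), r)]

# Sanity check: C_2 of the 3 x 3 identity matrix is the 3 x 3 identity matrix
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(compound(I3, 2) == I3)  # True
```

Here <code>itertools.combinations</code> conveniently emits the subsets in the lexicographic order used above.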
For example, consider the matrix
:<math>A = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \end{pmatrix}.</math>
The rows are indexed by {{math|{1, 2, 3<nowiki>}</nowiki>}} and the columns by {{math|{1, 2, 3, 4<nowiki>}</nowiki>}}. Therefore, the rows of {{math|''C''<sub>2 </sub>(''A'')}} are indexed by the sets
:<math>\{1, 2\} < \{1, 3\} < \{2, 3\}</math>
and the columns are indexed by
:<math>\{1, 2\} < \{1, 3\} < \{1, 4\} < \{2, 3\} < \{2, 4\} < \{3, 4\}.</math>
Using absolute value bars to denote determinants of the {{math|2 × 2}} submatrices, the second compound matrix is
:<math>\begin{align}
C_2(A) &= \begin{pmatrix}
\begin{vmatrix} 1 & 2 \\ 5 & 6 \end{vmatrix} & \begin{vmatrix} 1 & 3 \\ 5 & 7 \end{vmatrix} & \begin{vmatrix} 1 & 4 \\ 5 & 8 \end{vmatrix} & \begin{vmatrix} 2 & 3 \\ 6 & 7 \end{vmatrix} & \begin{vmatrix} 2 & 4 \\ 6 & 8 \end{vmatrix} & \begin{vmatrix} 3 & 4 \\ 7 & 8 \end{vmatrix} \\
\begin{vmatrix} 1 & 2 \\ 9 & 10 \end{vmatrix} & \begin{vmatrix} 1 & 3 \\ 9 & 11 \end{vmatrix} & \begin{vmatrix} 1 & 4 \\ 9 & 12 \end{vmatrix} & \begin{vmatrix} 2 & 3 \\ 10 & 11 \end{vmatrix} & \begin{vmatrix} 2 & 4 \\ 10 & 12 \end{vmatrix} & \begin{vmatrix} 3 & 4 \\ 11 & 12 \end{vmatrix} \\
\begin{vmatrix} 5 & 6 \\ 9 & 10 \end{vmatrix} & \begin{vmatrix} 5 & 7 \\ 9 & 11 \end{vmatrix} & \begin{vmatrix} 5 & 8 \\ 9 & 12 \end{vmatrix} & \begin{vmatrix} 6 & 7 \\ 10 & 11 \end{vmatrix} & \begin{vmatrix} 6 & 8 \\ 10 & 12 \end{vmatrix} & \begin{vmatrix} 7 & 8 \\ 11 & 12 \end{vmatrix}
\end{pmatrix} \\
&= \begin{pmatrix} -4 & -8 & -12 & -4 & -8 & -4 \\ -8 & -16 & -24 & -8 & -16 & -8 \\ -4 & -8 & -12 & -4 & -8 & -4 \end{pmatrix}.
\end{align}</math>
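The {{math|2 × 2}} minors in this example can be checked mechanically. A minimal Python sketch (the helper names are ours, not standard):

```python
from itertools import combinations

def det(M):
    # Laplace expansion along the first row; exact for integer entries
    if not M:
        return 1
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def compound(A, r):
    # rows/columns of the result follow lexicographic order of index subsets
    m, n = len(A), len(A[0])
    return [[det([[A[i][j] for j in J] for i in I])
             for J in combinations(range(n), r)]
            for I in combinations(range(m), r)]

A = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
for row in compound(A, 2):
    print(row)
# [-4, -8, -12, -4, -8, -4]
# [-8, -16, -24, -8, -16, -8]
# [-4, -8, -12, -4, -8, -4]
```

Note that {{math|''A''}} has rank 2, so its second compound has rank 1: all three rows are proportional.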
==Properties==
Let {{math|''c''}} be a scalar, {{math|''A''}} be an {{math|''m'' × ''n''}} matrix, and {{math|''B''}} be an {{math|''n'' × ''p''}} matrix. Then:
* {{math|1=''C''<sub>0 </sub>(''A'') = ''I''<sub>1</sub>}}, a {{math|1 × 1}} identity matrix.
* {{math|1=''C''<sub>1</sub>(''A'') = ''A''}}.
* {{math|1=''C''<sub>''r'' </sub>(''cA'') = ''c''{{i sup|''r''}}''C''<sub>''r'' </sub>(''A'')}}.
* If {{math|1=rk ''A'' = ''r''}}, then {{math|1=rk C<sub>''r'' </sub>(''A'') = 1}}.
* If {{math|1 ≤ ''r'' ≤ ''n''}}, then <math>C_r(I_n) = I_{\binom{n}{r}}</math>.
* If {{math|1 ≤ ''r'' ≤ min(''m'', ''n'')}}, then {{math|1=''C''<sub>''r'' </sub>(''A''{{i sup|T}}) = ''C''<sub>''r'' </sub>(''A''){{i sup|T}}}}.
* If {{math|1 ≤ ''r'' ≤ min(''m'', ''n'')}}, then <math>C_r(\overline{A}) = \overline{C_r(A)}</math>.
* {{math|1=''C''<sub>''r'' </sub>(''AB'') = ''C''<sub>''r'' </sub>(''A'') ''C''<sub>''r'' </sub>(''B'')}}.
* ([[Cauchy–Binet formula]]) {{math|1=det ''C''<sub>''r'' </sub>(''AB'') = (det ''C''<sub>''r'' </sub>(''A'')) (det ''C''<sub>''r'' </sub>(''B''))}} whenever {{math|''A''}} and {{math|''B''}} are square matrices of the same size.
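The multiplicativity property {{math|1=''C''<sub>''r'' </sub>(''AB'') = ''C''<sub>''r'' </sub>(''A'')''C''<sub>''r'' </sub>(''B'')}} and the scalar rule admit quick numerical spot checks; the Python sketch below (helper names ours) verifies both for one pair of integer matrices:

```python
from itertools import combinations

def det(M):
    # Laplace expansion along the first row; exact for integer entries
    if not M:
        return 1
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def compound(A, r):
    # r-th compound: all r x r minors, index subsets in lexicographic order
    m, n = len(A), len(A[0])
    return [[det([[A[i][j] for j in J] for i in I])
             for J in combinations(range(n), r)]
            for I in combinations(range(m), r)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2, 0], [3, 1, 4], [0, 2, 1]]
B = [[2, 0, 1], [1, 3, 0], [0, 1, 2]]
for r in range(1, 4):
    # C_r(AB) == C_r(A) C_r(B)
    assert compound(matmul(A, B), r) == matmul(compound(A, r), compound(B, r))
# C_r(cA) == c^r C_r(A), checked here for c = 3, r = 2
c, r = 3, 2
cA = [[c * x for x in row] for row in A]
assert compound(cA, r) == [[c ** r * x for x in row] for row in compound(A, r)]
print("ok")  # prints "ok"
```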
Assume in addition that {{math|''A''}} is a [[square matrix]] of size {{math|''n''}}. Then:<ref>Horn and Johnson, pp. 22, 93, 147, 233.</ref>
* {{math|1=''C''<sub>''n'' </sub>(''A'') = det ''A''}}, regarded as a {{math|1 × 1}} matrix.
* If {{math|''A''}} has one of the following properties, then so does {{math|''C''<sub>''r'' </sub>(''A'')}}:
** [[Upper triangular]],
** [[Lower triangular]],
** [[Diagonal matrix|Diagonal]],
** [[Orthogonal matrix|Orthogonal]],
** [[Unitary matrix|Unitary]],
** [[Symmetric matrix|Symmetric]],
** [[Hermitian matrix|Hermitian]],
** [[Skew-symmetric matrix|Skew-symmetric]],
** [[Skew-hermitian]],
** [[Positive definite matrix|Positive definite]],
** [[Positive semi-definite matrix|Positive semi-definite]],
** [[Normal matrix|Normal]].
* If {{math|''A''}} is [[invertible matrix|invertible]], then so is {{math|''C''<sub>''r'' </sub>(''A'')}}, and {{math|1=''C''<sub>''r'' </sub>(''A''{{i sup|−1}}) = ''C''<sub>''r'' </sub>(''A''){{i sup|−1}}}}.
* (Sylvester–Franke theorem) If {{math|1 ≤ ''r'' ≤ ''n''}}, then <math>\det C_r(A) = (\det A)^{\binom{n-1}{r-1}}</math>.
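Two of these facts are easy to spot-check numerically. The Python sketch below (helper names ours) verifies {{math|1=''C''<sub>''n'' </sub>(''A'') = det ''A''}} and the Sylvester–Franke theorem, with exponent <math>\tbinom{n-1}{r-1}</math>, for a {{math|3 × 3}} integer matrix:

```python
from itertools import combinations
from math import comb

def det(M):
    # Laplace expansion along the first row; exact for integer entries
    if not M:
        return 1
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def compound(A, r):
    # r-th compound: all r x r minors, index subsets in lexicographic order
    m, n = len(A), len(A[0])
    return [[det([[A[i][j] for j in J] for i in I])
             for J in combinations(range(n), r)]
            for I in combinations(range(m), r)]

A = [[1, 2, 0], [3, 1, 4], [0, 2, 1]]
n = 3
# C_n(A) is the 1 x 1 matrix whose sole entry is det A
assert compound(A, n) == [[det(A)]]
# Sylvester-Franke: det C_r(A) = (det A)^binom(n-1, r-1)
for r in range(1, n + 1):
    assert det(compound(A, r)) == det(A) ** comb(n - 1, r - 1)
print("ok")  # prints "ok"
```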
==Relation to exterior powers==
{{see also|Exterior algebra}}
Give {{math|'''R'''<sup>''n''</sup>}} the [[canonical basis|standard coordinate basis]] {{math|'''e'''<sub>1</sub>, ..., '''e'''<sub>''n''</sub>}}. The {{math|''r''}} th exterior power of {{math|'''R'''<sup>''n''</sup>}} is the [[vector space]]
:<math>\wedge^r \mathbf{R}^n</math>
whose [[basis (linear algebra)|basis]] consists of the formal symbols
:<math>\mathbf{e}_{i_1} \wedge \dots \wedge \mathbf{e}_{i_r},</math>
where
:<math>i_1 < \dots < i_r.</math>
Suppose that {{math|''A''}} is an {{math|''m'' × ''n''}} matrix. Then {{math|''A''}} determines a linear transformation
:<math>A \colon \mathbf{R}^n \to \mathbf{R}^m.</math>
Taking the {{math|''r''}} th exterior power of this linear transformation determines a linear transformation
:<math>\wedge^r A \colon \wedge^r \mathbf{R}^n \to \wedge^r \mathbf{R}^m.</math>
The matrix corresponding to this linear transformation (with respect to the above bases of the exterior powers) is {{math|''C''<sub>''r'' </sub>(''A'')}}. Taking exterior powers is a [[functor]], which means that<ref>Joseph P.S. Kung, Gian-Carlo Rota, and [[Catherine Yan|Catherine H. Yan]], ''[[Combinatorics: The Rota Way]]'', Cambridge University Press, 2009, p. 306. {{isbn|9780521883894}}</ref>
:<math>\wedge^r (AB) = (\wedge^r A)(\wedge^r B).</math>
This corresponds to the formula {{math|1=''C''<sub>''r'' </sub>(''AB'') = ''C''<sub>''r'' </sub>(''A'')''C''<sub>''r'' </sub>(''B'')}}. It is closely related to, and is a strengthening of, the [[Cauchy–Binet formula]].
==Relation to adjugate matrices==
{{see also|Adjugate matrix}}
Let {{math|''A''}} be an {{math|''n'' × ''n''}} matrix. The '''''r'' th higher adjugate''' of {{math|''A''}}, denoted {{math|adj<sub>''r'' </sub>(''A'')}}, is the <math>\tbinom{n}{r} \times \tbinom{n}{r}</math> matrix whose rows and columns are indexed by the size-{{math|''r''}} subsets of {{math|{1, ..., ''n''<nowiki>}</nowiki>}} in lexicographic order, and whose {{math|(''I'', ''J'')}} entry is
:<math>(-1)^{\sigma(I) + \sigma(J)} \det A_{J^c, I^c},</math>
where, for any set {{math|''K''}} of integers, {{math|''σ''(''K'')}} is the sum of the elements of {{math|''K''}}. The '''adjugate''' of {{math|''A''}} is its 1st higher adjugate and is denoted {{math|adj(''A'')}}. The generalized [[Laplace expansion]] formula implies
:<math>C_r(A) \operatorname{adj}_r(A) = \operatorname{adj}_r(A) C_r(A) = (\det A) I_{\binom{n}{r}}.</math>
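The entrywise definition of the higher adjugate translates directly into code. The Python sketch below (helper names ours) builds {{math|adj<sub>''r'' </sub>(''A'')}} and checks the Laplace-expansion identity {{math|1=adj<sub>''r'' </sub>(''A'') ''C''<sub>''r'' </sub>(''A'') = (det ''A'') ''I''}}, taking {{math|''σ''(''K'')}} to be the sum of 1-based indices in {{math|''K''}}:

```python
from itertools import combinations

def det(M):
    # Laplace expansion along the first row; exact for integer entries
    if not M:
        return 1
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def compound(A, r):
    # r-th compound: all r x r minors, index subsets in lexicographic order
    m, n = len(A), len(A[0])
    return [[det([[A[i][j] for j in J] for i in I])
             for J in combinations(range(n), r)]
            for I in combinations(range(m), r)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def higher_adjugate(A, r):
    # (I, J) entry: (-1)^(sigma(I)+sigma(J)) * det A_{J^c, I^c},
    # where sigma(K) sums the 1-based indices in K
    n = len(A)
    subsets = list(combinations(range(n), r))
    sigma = lambda K: sum(i + 1 for i in K)
    comp = lambda K: [i for i in range(n) if i not in K]
    return [[(-1) ** (sigma(I) + sigma(J))
             * det([[A[i][j] for j in comp(I)] for i in comp(J)])
             for J in subsets] for I in subsets]

A = [[1, 2, 0], [3, 1, 4], [0, 2, 1]]
n, d = 3, det(A)
for r in range(n + 1):  # r = 1 gives the classical adjugate
    N = len(list(combinations(range(n), r)))
    dI = [[d if i == j else 0 for j in range(N)] for i in range(N)]
    assert matmul(higher_adjugate(A, r), compound(A, r)) == dI
print("ok")  # prints "ok"
```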
If {{math|''A''}} is invertible, then
:<math>\operatorname{adj}_r(A^{-1}) = (\det A)^{-1}C_r(A).</math>
A concrete consequence of this is '''Jacobi's formula''' for the minors of an [[inverse matrix]]:
:<math>\det(A^{-1})_{J^c, I^c} = (-1)^{\sigma(I) + \sigma(J)} \frac{\det A_{I,J}}{\det A}.</math>
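Jacobi's formula can be verified with exact rational arithmetic. In the Python sketch below (helper names ours), the inverse is formed from the classical adjugate, and the identity is checked for every pair of size-{{math|''r''}} index sets:

```python
from itertools import combinations
from fractions import Fraction

def det(M):
    # Laplace expansion along the first row; exact for ints and Fractions
    if not M:
        return 1
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def submatrix(M, rows, cols):
    return [[M[i][j] for j in cols] for i in rows]

def inverse(A):
    # A^{-1} = adj(A) / det(A), entrywise exact with Fractions
    n, d = len(A), det(A)
    return [[Fraction((-1) ** (i + j)
                      * det(submatrix(A, [p for p in range(n) if p != j],
                                         [q for q in range(n) if q != i])), d)
             for j in range(n)] for i in range(n)]

A = [[1, 2, 0], [3, 1, 4], [0, 2, 1]]
n, d, Ainv = 3, det(A), inverse(A)
sigma = lambda K: sum(i + 1 for i in K)   # sum of 1-based indices
comp = lambda K: [i for i in range(n) if i not in K]
for r in range(n + 1):
    for I in combinations(range(n), r):
        for J in combinations(range(n), r):
            lhs = det(submatrix(Ainv, comp(J), comp(I)))
            rhs = (-1) ** (sigma(I) + sigma(J)) \
                * Fraction(det(submatrix(A, list(I), list(J))), d)
            assert lhs == rhs
print("ok")  # prints "ok"
```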
Let {{math|''S''}} denote the ''sign matrix'':
:<math>S = \begin{pmatrix} 1 & & & \\ & -1 & & \\ & & 1 & \\ & & & \ddots \end{pmatrix},</math>
and let {{math|''J''}} denote the ''[[exchange matrix]]'':
:<math>J = \begin{pmatrix} & & 1 \\ & \cdots & \\ 1 & & \end{pmatrix}.</math>
Then '''Jacobi's theorem''' states that the {{math|''r''}} th higher adjugate matrix is:<ref name="NambiarSreevalsan2001">{{cite journal|last1=Nambiar|first1=K.K.|last2=Sreevalsan|first2=S.|title=Compound matrices and three celebrated theorems|journal=Mathematical and Computer Modelling|volume=34|issue=3–4|year=2001|pages=251–255|issn=0895-7177|doi=10.1016/S0895-7177(01)00058-9|doi-access=free}}</ref><ref name="Price1947">{{cite journal|last1=Price|first1=G. B.|authorlink=G. B. Price|title=Some Identities in the Theory of Determinants|journal=The American Mathematical Monthly|volume=54|issue=2|year=1947|pages=75–90|issn=0002-9890|doi=10.2307/2304856|jstor=2304856}}</ref>
:<math>\operatorname{adj}_r(A) = JC_{n-r}(SAS)^TJ.</math>
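This, too, can be spot-checked: the Python sketch below (helper names ours) compares {{math|adj<sub>''r'' </sub>(''A'')}} computed from its entrywise definition against {{math|''JC''<sub>''n''−''r'' </sub>(''SAS''){{i sup|T}}''J''}}, with all index sets in lexicographic order:

```python
from itertools import combinations

def det(M):
    # Laplace expansion along the first row; exact for integer entries
    if not M:
        return 1
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def compound(A, r):
    # r-th compound: all r x r minors, index subsets in lexicographic order
    m, n = len(A), len(A[0])
    return [[det([[A[i][j] for j in J] for i in I])
             for J in combinations(range(n), r)]
            for I in combinations(range(m), r)]

def higher_adjugate(A, r):
    # (I, J) entry: (-1)^(sigma(I)+sigma(J)) * det A_{J^c, I^c}
    n = len(A)
    subsets = list(combinations(range(n), r))
    sigma = lambda K: sum(i + 1 for i in K)
    comp = lambda K: [i for i in range(n) if i not in K]
    return [[(-1) ** (sigma(I) + sigma(J))
             * det([[A[i][j] for j in comp(I)] for i in comp(J)])
             for J in subsets] for I in subsets]

A = [[1, 2, 0], [3, 1, 4], [0, 2, 1]]
n = 3
# Conjugating by S = diag(1, -1, 1, ...) flips signs in a checkerboard pattern
SAS = [[(-1) ** (i + j) * A[i][j] for j in range(n)] for i in range(n)]
for r in range(n + 1):
    C = compound(SAS, n - r)
    CT = [list(col) for col in zip(*C)]    # transpose
    rhs = [row[::-1] for row in CT[::-1]]  # conjugation by the exchange matrix
    assert higher_adjugate(A, r) == rhs
print("ok")  # prints "ok"
```

Conjugation by the exchange matrix simply reverses the order of the rows and of the columns, which matches the fact that complementation reverses the lexicographic order of index sets.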
==Applications==
The computation of compound matrices appears in a wide array of problems.<ref>{{cite techreport|first=Boutin|last=D.L.|author2=R.F. Gleeson|author3=R.M. Williams|title=Wedge Theory / Compound Matrices: Properties and Applications.|institution=Office of Naval Research|url=https://apps.dtic.mil/sti/pdfs/ADA320264.pdf|year=1996|number=NAWCADPAX–96-220-TR}}</ref><ref name=":0" />
Compound and adjugate matrices appear when computing determinants of [[linear combination]]s of matrices: if {{math|''A''}} and {{math|''B''}} are {{math|''n'' × ''n''}} matrices, then
:<math>\det(sA + tB) = C_n\!\left(\begin{bmatrix} sA & I_n \end{bmatrix}\right)C_n\!\left(\begin{bmatrix} I_n \\ tB \end{bmatrix}\right).</math>
It is also true that:<ref>{{Cite journal|last=Prells|first=Uwe|last2=Friswell|first2=Michael I.|last3=Garvey|first3=Seamus D.|date=2003-02-08|title=Use of geometric algebra: compound matrices and the determinant of the sum of two matrices|url=http://rspa.royalsocietypublishing.org/content/459/2030/273|journal=Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences|language=en|volume=459|issue=2030|pages=273–285|doi=10.1098/rspa.2002.1040|issn=1364-5021}}</ref><ref>Horn and Johnson, p. 29</ref>
:<math>\det(sA + tB) = \sum_{r=0}^n s^{n-r} t^r \operatorname{tr}(\operatorname{adj}_r(A)C_r(B)).</math>
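With {{math|adj<sub>''r''</sub>}} and {{math|''C''<sub>''r''</sub>}} as above, this expansion can be checked numerically. The Python sketch below (helper names ours) pairs {{math|''s''{{i sup|''n''−''r''}}''t''{{i sup|''r''}}}} with {{math|tr(adj<sub>''r'' </sub>(''A'')''C''<sub>''r'' </sub>(''B''))}}, consistent with {{math|adj<sub>''r'' </sub>(''A'')}} having degree {{math|''n'' − ''r''}} in the entries of {{math|''A''}}:

```python
from itertools import combinations

def det(M):
    # Laplace expansion along the first row; exact for integer entries
    if not M:
        return 1
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def compound(A, r):
    m, n = len(A), len(A[0])
    return [[det([[A[i][j] for j in J] for i in I])
             for J in combinations(range(n), r)]
            for I in combinations(range(m), r)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def higher_adjugate(A, r):
    n = len(A)
    subsets = list(combinations(range(n), r))
    sigma = lambda K: sum(i + 1 for i in K)
    comp = lambda K: [i for i in range(n) if i not in K]
    return [[(-1) ** (sigma(I) + sigma(J))
             * det([[A[i][j] for j in comp(I)] for i in comp(J)])
             for J in subsets] for I in subsets]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2, 0], [3, 1, 4], [0, 2, 1]]
B = [[2, 0, 1], [1, 3, 0], [0, 1, 2]]
n, s, t = 3, 2, 5
sAtB = [[s * A[i][j] + t * B[i][j] for j in range(n)] for i in range(n)]
total = sum(s ** (n - r) * t ** r
            * trace(matmul(higher_adjugate(A, r), compound(B, r)))
            for r in range(n + 1))
assert det(sAtB) == total
print("ok")  # prints "ok"
```

The {{math|1=''r'' = 0}} term contributes {{math|''s''{{i sup|''n''}} det ''A''}} and the {{math|1=''r'' = ''n''}} term contributes {{math|''t''{{i sup|''n''}} det ''B''}}, as one expects from setting {{math|1=''t'' = 0}} or {{math|1=''s'' = 0}}.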
== Numerical computation ==
In general, the computation of compound matrices is non-trivial because of its high complexity: computing {{math|''C''<sub>''r'' </sub>(''A'')}} entry by entry requires evaluating <math>\tbinom{m}{r}\tbinom{n}{r}</math> determinants of {{math|''r'' × ''r''}} submatrices.
==Notes==