{{Short description|Mathematical concept in algebra}}
In [[linear algebra]], two [[matrix (mathematics)|matrices]] <math>A</math> and <math>B</math> are said to '''commute''' if <math>AB=BA</math>, or equivalently if their [[commutator]] <math>[A,B]=AB-BA</math> is zero.
A [[set (mathematics)|set]] of matrices <math>A_1, \ldots, A_k</math> is said to '''commute''' if they commute pairwise, meaning that every pair of matrices in the set commutes.
Commuting matrices over an [[algebraically closed field]] are [[simultaneously triangularizable]]; in other words, they are both [[upper triangular matrix|upper triangular]] with respect to the same basis. This follows from the fact that commuting matrices preserve each other's [[eigenspace]]s. If both matrices are [[diagonalizable matrix|diagonalizable]], then they can be simultaneously diagonalized. Moreover, if one of the matrices has the property that its minimal polynomial coincides with its characteristic polynomial (that is, it has the maximal degree), which happens in particular whenever the characteristic polynomial has only simple roots, then the other matrix can be written as a polynomial in the first.
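The defining identity can be checked directly. The following is an illustrative sketch (the helper function and the sample matrices are arbitrary choices, not part of the article) using a plain-Python matrix product:

```python
# Minimal sketch: test whether two matrices commute by comparing AB with BA.
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def commute(A, B):
    # A and B commute when their commutator [A, B] = AB - BA vanishes,
    # i.e. when AB == BA entrywise.
    return matmul(A, B) == matmul(B, A)

D1 = [[2, 0], [0, 3]]          # diagonal matrices always commute
D2 = [[5, 0], [0, 7]]
print(commute(D1, D2))         # True

A = [[0, 1], [0, 0]]           # a nilpotent shift and its transpose
B = [[0, 0], [1, 0]]
print(commute(A, B))           # False: AB = diag(1, 0), BA = diag(0, 1)
```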
== Characterizations and properties ==
* As a direct consequence of simultaneous triangularizability, the eigenvalues of two commuting complex matrices ''A'', ''B'' with their algebraic multiplicities (the [[multiset]]s of roots of their characteristic polynomials) can be matched up as <math>\alpha_i\leftrightarrow\beta_i</math> in such a way that the multiset of eigenvalues of any polynomial <math>P(A,B)</math> in the two matrices is the multiset of the values <math>P(\alpha_i,\beta_i)</math>.
* Commuting matrices preserve each other's [[eigenspace]]s.<ref>{{Cite book|title=Matrix Analysis|last=Horn|first=Roger A.|last2=Johnson|first2=Charles R.|publisher=Cambridge University Press|year=2012|isbn=9780521839402|pages=70}}</ref> As a consequence, commuting matrices over an [[algebraically closed field]] are [[simultaneously triangularizable]]; that is, there is a [[basis (linear algebra)|basis]] in which they are both [[Upper triangular matrix|upper triangular]]. In other words, if <math>A_1,\ldots,A_k</math> commute, there exists an invertible matrix <math>P</math> such that <math>P^{-1} A_i P</math> is upper triangular for all <math>i \in \{1,\ldots,k\}</math>. The [[converse (logic)|converse]] is not necessarily true, as the following counterexample shows:
*:<math>\begin{bmatrix} 1 & 2 \\ 0 & 3 \end{bmatrix}\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 3 \\ 0 & 3 \end{bmatrix} \ne \begin{bmatrix} 1 & 5 \\ 0 & 3 \end{bmatrix}=\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} 1 & 2 \\ 0 & 3 \end{bmatrix}.</math>
: However, if the square of the commutator of two matrices is zero, that is, <math>[A,B]^2 = 0</math>, then the converse is true.<ref>{{Cite book|title=Matrix Analysis|last=Horn|first=Roger A.|last2=Johnson|first2=Charles R.|publisher=Cambridge University Press|year=2012|isbn=9780521839402|pages=127}}</ref>
* Two diagonalizable matrices <math>A</math> and <math>B</math> commute (<math>AB=BA</math>) if they are [[simultaneously diagonalizable]] (that is, there exists an invertible matrix <math>P</math> such that both <math>P^{-1} A P</math> and <math>P^{-1}B P</math> are [[diagonal matrix|diagonal]]).<ref name="HornJohnson">{{cite book|title=Matrix Analysis, second edition|last1=Horn|first1=Roger A.|last2=Johnson|first2=Charles R.|publisher=Cambridge University Press|year=2013|isbn=9780521839402}}</ref>{{rp|p. 64}} The converse is also true; that is, if two diagonalizable matrices commute, they are simultaneously diagonalizable.<ref>[[Without loss of generality]], one may suppose that the first matrix <math>A=(a_{i,j})</math> is diagonal. In this case, commutativity implies that if an entry <math>b_{i,j}</math> of the second matrix is nonzero, then <math>a_{i,i}=a_{j,j}.</math> After a permutation of rows and columns, the two matrices become simultaneously [[block diagonal]]. In each block, the first matrix is a scalar multiple of the identity matrix, and the second one is a diagonalizable matrix. So, diagonalizing the blocks of the second matrix does not change the first matrix, which allows a simultaneous diagonalization.</ref> Moreover, if two matrices commute and one of them has no multiple eigenvalues, then they are simultaneously diagonalizable, without assuming in advance that both are diagonalizable.<ref>{{Cite web |title=Proofs Homework Set 10 MATH 217 — WINTER 2011 |url=http://www.math.lsa.umich.edu/~tfylam/Math217/proofs10-sol.pdf |access-date=10 July 2022}}</ref>
* If <math>A</math> and <math>B</math> commute, they have a common [[eigenvector]] (over an algebraically closed field). If <math>A</math> has distinct eigenvalues and commutes with <math>B</math>, then the eigenvectors of <math>A</math> are also eigenvectors of <math>B</math>.
* Two [[Hermitian matrix|Hermitian]] matrices commute if their [[eigenspace]]s coincide. In particular, two Hermitian matrices without multiple eigenvalues commute if they share the same set of eigenvectors. This follows by considering the eigenvalue decompositions of both matrices. Let <math>A</math> and <math>B</math> be two Hermitian matrices. <math>A</math> and <math>B</math> have common eigenspaces when they can be written as <math>A = U \Lambda_1 U^\dagger</math> and <math>B = U \Lambda_2 U^\dagger</math>. It then follows that
*: <math>AB = U \Lambda_1 U^\dagger U \Lambda_2 U^\dagger = U \Lambda_1 \Lambda_2 U^\dagger = U \Lambda_2 \Lambda_1 U^\dagger = U \Lambda_2 U^\dagger U \Lambda_1 U^\dagger = BA.</math>
* The property of two matrices commuting is not [[transitive relation|transitive]]: a matrix <math>A</math> may commute with both <math>B</math> and <math>C</math> while <math>B</math> and <math>C</math> do not commute with each other. For example, the [[identity matrix]] commutes with all matrices, although not all matrices commute with one another. If the set of matrices considered is restricted to Hermitian matrices without multiple eigenvalues, then commutativity is transitive, as a consequence of the characterization in terms of eigenvectors.
* [[Lie's theorem]], which shows that any [[Lie algebra representation|representation]] of a [[solvable Lie algebra]] is simultaneously upper triangularizable, may be viewed as a generalization.
* An ''n'' × ''n'' matrix <math>A</math> commutes with every other ''n'' × ''n'' matrix if and only if it is a scalar matrix, that is, a matrix of the form <math>\lambda I</math>, where <math>I</math> is the ''n'' × ''n'' identity matrix and <math>\lambda</math> is a scalar. In other words, the [[center (group theory)|center]] of the [[group (mathematics)|group]] of ''n'' × ''n'' matrices under multiplication is the [[subgroup]] of scalar matrices.
* Fix a finite field <math>\mathbb F_q</math> and let <math>P(n)</math> denote the number of ordered pairs of commuting <math>n\times n</math> matrices over <math>\mathbb F_q</math>. [[Walter Feit|W. Feit]] and N. J. Fine<ref>{{Cite journal |last=Feit |first=Walter |last2=Fine |first2=N. J. |date=1960-03-01 |title=Pairs of commuting matrices over a finite field |url=http://dx.doi.org/10.1215/s0012-7094-60-02709-5 |journal=Duke Mathematical Journal |volume=27 |issue=1 |doi=10.1215/s0012-7094-60-02709-5 |issn=0012-7094|url-access=subscription }}</ref> showed that<math display="block">1 + \sum_{n=1}^\infty \frac{P(n)}{(q^n-1)(q^n-q)\cdots (q^n-q^{n-1})} z^n =
\prod_{i=1}^\infty \prod_{j=0}^\infty \frac {1}{1 - q^{1-j} z^i}.</math>
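As a small illustration of the eigenvector statements above, here is a brute-force sketch (the helper and the range of entries are arbitrary choices, not from the article): every 2 × 2 matrix with small integer entries that commutes with <math>\operatorname{diag}(1,2)</math>, a matrix with distinct eigenvalues, turns out to be diagonal, and so shares that matrix's eigenvectors.

```python
# Brute-force check: any 2x2 integer matrix B (entries in -2..2) commuting
# with A = diag(1, 2) -- which has distinct eigenvalues -- must be diagonal,
# hence shares A's eigenvectors e1 and e2.  Illustrative sketch only.
from itertools import product

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 0], [0, 2]]
commuting = [
    (a, b, c, d)
    for a, b, c, d in product(range(-2, 3), repeat=4)
    if matmul(A, [[a, b], [c, d]]) == matmul([[a, b], [c, d]], A)
]
# Every commuting B has zero off-diagonal entries.
print(all(b == 0 and c == 0 for _, b, c, _ in commuting))  # True
```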
==Examples==
* The identity matrix commutes with all matrices.
* A [[Jordan matrix|Jordan block]] commutes exactly with the upper triangular matrices that are constant along each diagonal (the upper triangular [[Toeplitz matrix|Toeplitz matrices]]); these are precisely the polynomials in the Jordan block.
* If the product of two [[symmetric matrix|symmetric matrices]] is symmetric, then they must commute. In particular, any two diagonal matrices commute, since diagonal matrices are symmetric and their product is diagonal, hence symmetric.<ref>{{cite web |title=Do Diagonal Matrices Always Commute? |url=https://math.stackexchange.com/q/1697991 |author=<!--Not stated--> |date=March 15, 2016 |publisher=Stack Exchange |access-date=August 4, 2018 }}
</ref><ref>{{Cite web |title=Linear Algebra WebNotes part 2 |url=https://math.vanderbilt.edu/sapirmv/msapir/jan22.html |access-date=2022-07-10 |website=math.vanderbilt.edu}}</ref>
* [[Circulant matrices]] commute. They form a [[commutative ring]], since the sum and product of two circulant matrices are again circulant.
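The circulant example can be checked numerically; the helper below and the chosen first rows are illustrative, not part of the article:

```python
# Sketch: build circulant matrices from their first rows and verify that
# they commute.  Each row of a circulant matrix is a cyclic shift of the
# first row.
def circulant(first_row):
    n = len(first_row)
    return [[first_row[(j - i) % n] for j in range(n)] for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

C1 = circulant([1, 2, 3])
C2 = circulant([4, 0, 5])
print(matmul(C1, C2) == matmul(C2, C1))  # True: circulant matrices commute
```

Any two circulant matrices commute because both are polynomials in the single cyclic-shift matrix.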
== History ==
The notion of commuting matrices was introduced by [[Arthur Cayley|Cayley]] in his memoir on the theory of matrices, which also provided the first axiomatization of matrices.
== References ==
{{reflist}}