{{Short description|Matrix equal to its transpose}}
{{Use American English|date=January 2019}}
{{about|a matrix symmetric about its diagonal|a matrix symmetric about its center|Centrosymmetric matrix}}
{{For|matrices with symmetry over the [[complex number]] field|Hermitian matrix}}
 
[[File:Matrix symmetry qtl1.svg|thumb|Symmetry of a 5×5 matrix]]
 
Because equal matrices have equal dimensions, only square matrices can be symmetric.
 
The entries of a symmetric matrix are symmetric with respect to the [[main diagonal]]. So if <math>a_{ij}</math> denotes the entry in the <math>i</math>-th row and <math>j</math>-th column then
 
{{Equation box 1
|indent =:
|equation = <math>a_{ji} = a_{ij}</math>
}}
for all indices <math>i</math> and <math>j.</math>
 
Every square [[diagonal matrix]] is symmetric, since all off-diagonal elements are zero. Similarly in [[characteristic (algebra)|characteristic]] different from 2, each diagonal element of a [[skew-symmetric matrix]] must be zero, since each is its own negative.
 
In linear algebra, a [[real number|real]] symmetric matrix represents a [[self-adjoint operator]]<ref>{{Cite book|author=Jesús Rojo García|title=Álgebra lineal |language=es |edition=2nd |publisher=Editorial AC |year=1986 |isbn=84-7288-120-2}}</ref> expressed in an [[orthonormal basis]] over a [[real number|real]] [[inner product space]]. The corresponding object for a [[complex number|complex]] inner product space is a [[Hermitian matrix]] with complex-valued entries, which is equal to its [[conjugate transpose]]. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one which has real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.
 
== Example ==
The following <math>3 \times 3</math> matrix is symmetric:
<math display="block">A =
\begin{bmatrix}
1 & 7 & 3 \\
7 & 4 & -5 \\
3 & -5 & 6
\end{bmatrix}</math>
since <math>A = A^\textsf{T}</math>.
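
As a quick check, a minimal Python/NumPy sketch that verifies this by comparing the matrix with its transpose:
<syntaxhighlight lang="python">
import numpy as np

A = np.array([[1,  7,  3],
              [7,  4, -5],
              [3, -5,  6]])

# A matrix is symmetric exactly when it equals its transpose.
assert np.array_equal(A, A.T)
</syntaxhighlight>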
 
== Properties ==
===Basic properties===
* The sum and difference of two symmetric matrices are again symmetric.
* This is not always true for the [[matrix multiplication|product]]: given symmetric matrices <math>A</math> and <math>B</math>, then <math>AB</math> is symmetric if and only if <math>A</math> and <math>B</math> [[commutativity|commute]], i.e., if <math>AB = BA</math> (a numerical illustration follows this list).
* For any integer <math>n</math>, <math>A^n</math> is symmetric if <math>A</math> is symmetric.
* If <math>A^{-1}</math> exists, it is symmetric if and only if <math>A</math> is symmetric.
* The [[rank (linear algebra)|rank]] of a symmetric matrix <math>A</math> is equal to the number of non-zero [[eigenvalue]]s of <math>A</math>.
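
The product property can be illustrated numerically; the following NumPy sketch (with illustrative matrices) shows a non-commuting pair whose product fails to be symmetric:
<syntaxhighlight lang="python">
import numpy as np

def is_symmetric(M, tol=1e-12):
    """Return True if M equals its transpose up to floating-point tolerance."""
    return np.allclose(M, M.T, atol=tol)

A = np.diag([1.0, 2.0, 3.0])        # symmetric (diagonal)
B = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])     # symmetric permutation matrix

# A and B do not commute, and indeed AB is not symmetric ...
assert not np.allclose(A @ B, B @ A)
assert not is_symmetric(A @ B)

# ... while A trivially commutes with the identity, giving a symmetric product.
assert is_symmetric(A @ np.eye(3))
</syntaxhighlight>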
 
===Decomposition into symmetric and skew-symmetric===
Any square matrix can uniquely be written as the sum of a symmetric and a skew-symmetric matrix. This decomposition is known as the Toeplitz decomposition. Let <math>\mbox{Mat}_n</math> denote the space of <math>n \times n</math> matrices. If <math>\mbox{Sym}_n</math> denotes the space of <math>n \times n</math> symmetric matrices and <math>\mbox{Skew}_n</math> the space of <math>n \times n</math> skew-symmetric matrices then <math>\mbox{Mat}_n = \mbox{Sym}_n + \mbox{Skew}_n</math> and <math>\mbox{Sym}_n \cap \mbox{Skew}_n = \{0\}</math>, i.e.
<math display="block">\mbox{Mat}_n = \mbox{Sym}_n \oplus \mbox{Skew}_n,</math>
 
where <math>\oplus</math> denotes the [[direct sum of modules|direct sum]]. If <math>X \in \mbox{Mat}_n</math>, then
<math display="block">X = \frac{1}{2}\left(X + X^\textsf{T}\right) + \frac{1}{2}\left(X - X^\textsf{T}\right).</math>
 
Notice that <math display="inline">\frac{1}{2}\left(X + X^\textsf{T}\right) \in \mbox{Sym}_n</math> and <math display="inline">\frac{1}{2}\left(X - X^\textsf{T}\right) \in \mbox{Skew}_n</math>. This is true for every [[square matrix]] <math>X</math> with entries from any [[field (mathematics)|field]] whose [[characteristic (algebra)|characteristic]] is different from 2.
 
A symmetric <math>n \times n</math> matrix is determined by <math>\tfrac{1}{2}n(n+1)</math> scalars (the number of entries on or above the [[main diagonal]]). Similarly, a [[skew-symmetric matrix]] is determined by <math>\tfrac{1}{2}n(n-1)</math> scalars (the number of entries above the main diagonal).
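
A minimal NumPy sketch of this decomposition (the function name is illustrative):
<syntaxhighlight lang="python">
import numpy as np

def toeplitz_decomposition(X):
    """Split a square matrix X into its symmetric and skew-symmetric parts."""
    sym = (X + X.T) / 2
    skew = (X - X.T) / 2
    return sym, skew

X = np.array([[1.0, 2.0],
              [4.0, 3.0]])
sym, skew = toeplitz_decomposition(X)

assert np.allclose(sym, sym.T)      # symmetric part
assert np.allclose(skew, -skew.T)   # skew-symmetric part
assert np.allclose(sym + skew, X)   # the parts sum back to X
</syntaxhighlight>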
 
=== Matrix congruent to a symmetric matrix ===
Any matrix [[matrix congruence|congruent]] to a symmetric matrix is again symmetric: if <math>X</math> is a symmetric matrix, then so is <math>A X A^{\mathrm T}</math> for any matrix <math>A</math>.
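
A short NumPy check of this fact, with random (purely illustrative) matrices; note that <math>A</math> need not be square:
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))
X = X + X.T                        # a random symmetric 3x3 matrix
A = rng.standard_normal((2, 3))    # A may even be rectangular

C = A @ X @ A.T                    # the matrix congruent to X ...
assert np.allclose(C, C.T)         # ... is symmetric again
</syntaxhighlight>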
 
=== Symmetry implies normality ===
<!--If A is a skew-symmetric matrix, then ''iA'' (where ''i'' is an [[imaginary unit]]) is symmetric.-->
Denote by <math>\langle \cdot,\cdot \rangle</math> the standard [[inner product]] on <math>\mathbb{R}^n</math>. The real <math>n \times n</math> matrix <math>A</math> is symmetric if and only if
<math display="block">\langle Ax, y \rangle = \langle x, Ay \rangle \quad \forall x, y \in \mathbb{R}^n.</math>
 
Since this definition is independent of the choice of [[basis (linear algebra)|basis]], symmetry is a property that depends only on the [[linear operator]] <math>A</math> and a choice of [[inner product]]. This characterization of symmetry is useful, for example, in [[differential geometry]]: each [[tangent space]] to a [[manifold]] may be endowed with an inner product, giving rise to what is called a [[Riemannian manifold]]. Another area where this formulation is used is in [[Hilbert space]]s.
 
The finite-dimensional [[spectral theorem]] says that any symmetric matrix whose entries are [[real number|real]] can be [[diagonal matrix|diagonalized]] by an [[orthogonal matrix]]. More explicitly: for every real symmetric matrix <math>A</math> there exists a real orthogonal matrix <math>Q</math> such that <math>D = Q^{\mathrm T} A Q</math> is a [[diagonal matrix]]. Every real symmetric matrix is thus, [[up to]] a choice of [[orthonormal basis]], a diagonal matrix.
 
If <math>A</math> and <math>B</math> are <math>n \times n</math> real symmetric matrices that commute, then they can be simultaneously diagonalized by an orthogonal matrix:<ref>{{Cite book|first=Richard |last=Bellman|title=Introduction to Matrix Analysis |language= en|edition=2nd|publisher=SIAM|year=1997|isbn=08-9871-399-4}}</ref> there exists a basis of <math>\mathbb{R}^n</math> such that every element of the basis is an [[eigenvector]] for both <math>A</math> and <math>B</math>.
 
Every real symmetric matrix is [[Hermitian matrix|Hermitian]], and therefore all its [[eigenvalues]] are real. (In fact, the eigenvalues are the entries in the diagonal matrix <math>D</math> (above), and therefore <math>D</math> is uniquely determined by <math>A</math> up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.
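
A minimal NumPy sketch of the spectral theorem, using the standard routine <code>numpy.linalg.eigh</code> for symmetric (or Hermitian) matrices:
<syntaxhighlight lang="python">
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])    # real symmetric

eigenvalues, Q = np.linalg.eigh(A) # eigh exploits symmetry; Q is orthogonal
D = np.diag(eigenvalues)

assert np.allclose(Q.T @ Q, np.eye(3))  # Q^T Q = I
assert np.allclose(Q.T @ A @ Q, D)      # Q^T A Q = D is diagonal
assert np.allclose(A, Q @ D @ Q.T)      # equivalently, A = Q D Q^T
</syntaxhighlight>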
 
=== Complex symmetric matrices {{anchor|Complex}}===
A complex symmetric matrix can be 'diagonalized' using a [[unitary matrix]]: thus if <math>A</math> is a complex symmetric matrix, there is a unitary matrix <math>U</math> such that <math>U A U^{\mathrm T}</math> is a real diagonal matrix with non-negative entries. This result is referred to as the '''Autonne–Takagi factorization'''. It was originally proved by [[Léon Autonne]] (1915) and [[Teiji Takagi]] (1925) and rediscovered with different proofs by several other mathematicians.<ref>{{harvnb|Horn|Johnson|2013|pp=263, 278}}</ref><ref>See:
*{{citation|first=L.|last= Autonne|title= Sur les matrices hypohermitiennes et sur les matrices unitaires|journal= Ann. Univ. Lyon|volume= 38|year=1915|pages= 1–77|url=https://gallica.bnf.fr/ark:/12148/bpt6k69553b}}
*{{citation|first=T.|last= Takagi|title= On an algebraic problem related to an analytic theorem of Carathéodory and Fejér and on an allied theorem of Landau|journal= Jpn. J. Math.|volume= 1 |year=1925|pages= 83–93|doi= 10.4099/jjm1924.1.0_83|doi-access= free}}
*{{citation|title=Symplectic Geometry|first=Carl Ludwig|last= Siegel|journal= American Journal of Mathematics|volume= 65|issue=1 |year=1943|pages=1–86|jstor= 2371774|doi=10.2307/2371774|id=Lemma 1, page 12}}
*{{citation|first=L.-K.|last= Hua|title= On the theory of automorphic functions of a matrix variable I–geometric basis|journal= Amer. J. Math.|volume= 66 |issue= 3|year=1944|pages= 470–488|doi=10.2307/2371910|jstor= 2371910}}
*{{citation|first=I.|last= Schur|title= Ein Satz über quadratische Formen mit komplexen Koeffizienten|journal=Amer. J. Math. |volume=67 |issue= 4|year=1945|pages=472–480|doi=10.2307/2371974|jstor= 2371974}}
*{{citation|first1=R.|last1= Benedetti|first2=P.|last2= Cragnolini|title=On simultaneous diagonalization of one Hermitian and one symmetric form|journal= Linear Algebra Appl. |volume=57 |year=1984| pages=215–226|doi=10.1016/0024-3795(84)90189-7|doi-access=free}}
</ref> In fact, the matrix <math>B = A^{\dagger} A</math> is Hermitian and [[Definiteness of a matrix|positive semi-definite]], so there is a unitary matrix <math>V</math> such that <math>V^{\dagger} B V</math> is diagonal with non-negative real entries. Thus <math>C = V^{\mathrm T} A V</math> is complex symmetric with <math>C^{\dagger} C</math> real. Writing <math>C = X + iY</math> with <math>X</math> and <math>Y</math> real symmetric matrices, <math>C^{\dagger} C = X^2 + Y^2 + i(XY - YX)</math>. Thus <math>XY = YX</math>. Since <math>X</math> and <math>Y</math> commute, there is a real orthogonal matrix <math>W</math> such that both <math>W X W^{\mathrm T}</math> and <math>W Y W^{\mathrm T}</math> are diagonal. Setting <math>U = W V^{\mathrm T}</math> (a unitary matrix), the matrix <math>U A U^{\mathrm T}</math> is complex diagonal. Pre-multiplying <math>U</math> by a suitable diagonal unitary matrix (which preserves unitarity of <math>U</math>), the diagonal entries of <math>U A U^{\mathrm T}</math> can be made to be real and non-negative as desired. To construct this matrix, we express the diagonal matrix as <math>U A U^\mathrm{T} = \operatorname{diag}(r_1 e^{i\theta_1}, r_2 e^{i\theta_2}, \dots, r_n e^{i\theta_n})</math>. The matrix we seek is simply given by <math>D = \operatorname{diag}(e^{-i\theta_1/2}, e^{-i\theta_2/2}, \dots, e^{-i\theta_n/2})</math>. Clearly <math>D U A U^\mathrm{T} D = \operatorname{diag}(r_1, r_2, \dots, r_n)</math> as desired, so we make the modification <math>U' = DU</math>. Since the squares of the diagonal entries <math>r_i</math> are the eigenvalues of <math>A^{\dagger} A</math>, they coincide with the [[singular value]]s of <math>A</math>. (Note that, regarding the eigendecomposition of a complex symmetric matrix <math>A</math>, the [[Jordan normal form]] of <math>A</math> may not be diagonal, so <math>A</math> may not be diagonalizable by any similarity transformation.)
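
The construction above translates into a short numerical procedure. The following NumPy sketch is illustrative only: the simultaneous diagonalization of <math>X</math> and <math>Y</math> is done through a random linear combination, which is valid generically (when that combination has distinct eigenvalues); a careful implementation would handle repeated singular values explicitly.
<syntaxhighlight lang="python">
import numpy as np

def takagi(A, seed=0):
    """Autonne-Takagi sketch: return (U, r) with U unitary and r >= 0
    such that U @ A @ U.T = np.diag(r), for complex symmetric A."""
    A = np.asarray(A, dtype=complex)
    assert np.allclose(A, A.T), "A must be complex symmetric"
    # B = A^dagger A is Hermitian positive semi-definite; diagonalize it.
    _, V = np.linalg.eigh(A.conj().T @ A)
    # C = V^T A V is complex symmetric with C^dagger C real, so its real
    # and imaginary parts X, Y are commuting real symmetric matrices.
    C = V.T @ A @ V
    X, Y = C.real, C.imag
    # Simultaneously diagonalize X and Y: eigenvectors of a generic real
    # combination X + t*Y serve for both (genericity assumption).
    t = np.random.default_rng(seed).standard_normal()
    _, W = np.linalg.eigh(X + t * Y)
    U = W.T @ V.T                  # unitary; U A U^T is complex diagonal
    d = np.diag(U @ A @ U.T)
    # Absorb the phases so the diagonal becomes real and non-negative.
    D = np.diag(np.exp(-0.5j * np.angle(d)))
    return D @ U, np.abs(d)

# The diagonal entries coincide with the singular values of A.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M + M.T                        # a random complex symmetric matrix
U, r = takagi(A)
assert np.allclose(U @ A @ U.T, np.diag(r))
assert np.allclose(np.sort(r), np.sort(np.linalg.svd(A, compute_uv=False)))
</syntaxhighlight>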
 
== Decomposition ==
 
[[Cholesky decomposition]] states that every real positive-definite symmetric matrix <math>A</math> is a product of a lower-triangular matrix <math>L</math> and its transpose,
<math display="block">A = LL^\textsf{T}.</math>
 
If the matrix is symmetric indefinite, it may still be decomposed as <math>PAP^\textsf{T} = LDL^\textsf{T}</math> where <math>P</math> is a permutation matrix (arising from the need to [[pivot element|pivot]]), <math>L</math> a lower unit triangular matrix, and <math>D</math> is a direct sum of symmetric <math>1 \times 1</math> and <math>2 \times 2</math> blocks; this is called the Bunch–Kaufman decomposition.<ref>{{cite book |author-link1=Gene H. Golub |last1=Golub |first1=G.H. |author-link2=Charles F. Van Loan |last2=van Loan |first2=C.F. |title=Matrix Computations |publisher=The Johns Hopkins University Press, Baltimore, London |year=1996 |isbn=0-8018-5413-X |oclc=34515797}}</ref>
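
Both factorizations are available in standard libraries; a minimal sketch using <code>numpy.linalg.cholesky</code> and <code>scipy.linalg.ldl</code> (the latter based on Bunch–Kaufman pivoting):
<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import ldl

# A symmetric positive-definite matrix admits a Cholesky factorization.
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])
L = np.linalg.cholesky(A)          # lower triangular factor
assert np.allclose(L @ L.T, A)

# A symmetric indefinite matrix still admits a pivoted LDL^T factorization.
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])         # indefinite: eigenvalues 1 and -1
lu, d, perm = ldl(B, lower=True)   # d holds 1x1 and 2x2 symmetric blocks
assert np.allclose(lu @ d @ lu.T, B)
</syntaxhighlight>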
 
A general (complex) symmetric matrix may be [[defective matrix|defective]] and thus not be [[diagonalizable]]. If <math>A</math> is diagonalizable it may be decomposed as
<math display="block">A = Q \Lambda Q^\textsf{T}</math>
where <math>Q</math> is an [[orthogonal matrix]] <math>Q Q^\textsf{T} = I</math>, and <math>\Lambda</math> is a diagonal matrix of the eigenvalues of <math>A</math>. In the special case that <math>A</math> is real symmetric, then <math>Q</math> and <math>\Lambda</math> are also real. To see orthogonality, suppose <math>\mathbf x</math> and <math>\mathbf y</math> are eigenvectors corresponding to distinct eigenvalues <math>\lambda_1</math>, <math>\lambda_2</math>. Then
<math display="block">\lambda_1 \langle \mathbf x, \mathbf y \rangle = \langle A \mathbf x, \mathbf y \rangle = \langle \mathbf x, A \mathbf y \rangle = \lambda_2 \langle \mathbf x, \mathbf y \rangle.</math>
 
Since <math>\lambda_1</math> and <math>\lambda_2</math> are distinct, we have <math>\langle \mathbf x, \mathbf y \rangle = 0</math>.
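
A concrete instance of such a defective complex symmetric matrix, verified numerically with NumPy (a standard kind of example):
<syntaxhighlight lang="python">
import numpy as np

# A equals its transpose but is nilpotent: A @ A = 0 while A != 0, so its
# only eigenvalue is 0 with a one-dimensional eigenspace, i.e. A is defective.
A = np.array([[1.0,  1.0j],
              [1.0j, -1.0]])
assert np.allclose(A, A.T)
assert np.allclose(A @ A, np.zeros((2, 2)))
</syntaxhighlight>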
 
== Hessian ==
Symmetric <math>n \times n</math> matrices of real functions appear as the [[Hessian matrix|Hessians]] of twice continuously differentiable functions of <math>n</math> real variables (the continuity of the second derivative is not needed, despite a common belief to the contrary<ref>{{Cite book |last=Dieudonné |first=Jean A. |title=Foundations of Modern Analysis |publisher=Academic Press |year=1969 |chapter=Theorem (8.12.2) |page=180 |isbn=0-12-215550-5 |oclc=576465}}</ref>).
 
Every [[quadratic form]] <math>q</math> on <math>\mathbb{R}^n</math> can be uniquely written in the form <math>q(\mathbf{x}) = \mathbf{x}^\textsf{T} A \mathbf{x}</math> with a symmetric <math>n \times n</math> matrix <math>A</math>. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of <math>\mathbb{R}^n</math>, "looks like"
<math display="block">q\left(x_1, \ldots, x_n\right) = \sum_{i=1}^n \lambda_i x_i^2</math>
with real numbers <math>\lambda_i</math>. This considerably simplifies the study of quadratic forms, as well as the study of the level sets <math>\left\{ \mathbf{x} : q(\mathbf{x}) = 1 \right\}</math> which are generalizations of [[conic section]]s.
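
The change of coordinates in this statement is precisely the orthogonal diagonalization given by the spectral theorem; a minimal NumPy sketch:
<syntaxhighlight lang="python">
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])         # symmetric matrix of the quadratic form q
lam, Q = np.linalg.eigh(A)         # eigenvalues and an orthonormal eigenbasis

rng = np.random.default_rng(0)
x = rng.standard_normal(2)
y = Q.T @ x                        # coordinates of x in the eigenbasis

# q(x) = x^T A x equals sum_i lambda_i * y_i^2 in the new coordinates.
assert np.isclose(x @ A @ x, np.sum(lam * y**2))
</syntaxhighlight>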
 
Other types of [[symmetry]] or pattern in square matrices have special names; see for example:
{{Div col|colwidth=25em}}
* [[Skew-symmetric matrix]] (also called ''antisymmetric'' or ''antimetric'')
* [[Centrosymmetric matrix]]
* [[Circulant matrix]]
* [[Covariance matrix]]
* [[Coxeter matrix]]
* [[GCD matrix]]
* [[Hankel matrix]]
* [[Hilbert matrix]]
* [[Persymmetric matrix]]
* [[Sylvester's law of inertia]]
* [[Toeplitz matrix]]
* [[Transpositions matrix]]
{{Div col end}}
 
 
== References ==
{{refbegin}}
*{{citation|last1=Horn|first1=Roger A.|last2=Johnson|first2=Charles R.|title=Matrix analysis|edition=2nd|publisher=Cambridge University Press|year=2013|isbn=978-0-521-54823-6}}
{{refend}}
 
== External links ==
* [https://fylux.github.io/2017/03/07/Symmetric-Triangular-Matrix/ How to implement a Symmetric Matrix in C++]
{{Matrix classes}}
{{Authority control}}
[[Category:Matrices (mathematics)]]