===Decomposition into symmetric and skew-symmetric===
Any square matrix can be written uniquely as the sum of a symmetric and a skew-symmetric matrix. This decomposition is known as the Toeplitz decomposition. Let <math>\mbox{Mat}_n</math> denote the space of <math>n \times n</math> matrices. If <math>\mbox{Sym}_n</math> denotes the space of <math>n \times n</math> symmetric matrices and <math>\mbox{Skew}_n</math> the space of <math>n \times n</math> skew-symmetric matrices, then <math>\mbox{Mat}_n = \mbox{Sym}_n + \mbox{Skew}_n</math> and <math>\mbox{Sym}_n \cap \mbox{Skew}_n = \{0\}</math>, i.e.
:<math>\mbox{Mat}_n = \mbox{Sym}_n \oplus \mbox{Skew}_n.</math>
:<math>X = \frac{1}{2}\left(X + X^\textsf{T}\right) + \frac{1}{2}\left(X - X^\textsf{T}\right)</math>.
Notice that <math display="inline">\frac{1}{2}\left(X + X^\textsf{T}\right) \in \mbox{Sym}_n</math> and <math display="inline">\frac{1}{2}\left(X - X^\textsf{T}\right) \in \mbox{Skew}_n</math>. This is true for every square matrix <math>X</math> with entries from any [[field (mathematics)|field]] whose [[characteristic (algebra)|characteristic]] is different from 2.
A symmetric <math>n \times n</math> matrix is determined by <math>\tfrac{1}{2}n(n+1)</math> scalars (the number of entries on or above the [[main diagonal]]). Similarly, a [[skew-symmetric matrix]] is determined by <math>\tfrac{1}{2}n(n-1)</math> scalars (the number of entries above the main diagonal).
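The Toeplitz decomposition above can be sketched numerically. The following is a minimal illustration (not part of the article) using numpy; the matrix <code>X</code> is an arbitrary random example:

```python
# Toeplitz decomposition: split a square matrix X into its symmetric
# and skew-symmetric parts, X = (X + X^T)/2 + (X - X^T)/2.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))  # arbitrary square matrix

sym = (X + X.T) / 2   # symmetric part:       sym == sym.T
skew = (X - X.T) / 2  # skew-symmetric part:  skew == -skew.T

assert np.allclose(sym, sym.T)
assert np.allclose(skew, -skew.T)
assert np.allclose(sym + skew, X)  # the two parts recover X exactly
```

The uniqueness of the decomposition corresponds to the fact that <code>sym</code> and <code>skew</code> are the orthogonal projections of <code>X</code> onto <math>\mbox{Sym}_n</math> and <math>\mbox{Skew}_n</math>.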
*{{citation|first=I.|last=Schur|title=Ein Satz über quadratische Formen mit komplexen Koeffizienten|journal=Amer. J. Math.|volume=67|issue=4|year=1945|pages=472–480|doi=10.2307/2371974|jstor=2371974}}
*{{citation|first1=R.|last1= Benedetti|first2=P.|last2= Cragnolini|title=On simultaneous diagonalization of one Hermitian and one symmetric form|journal= Linear Algebra Appl. |volume=57 |year=1984| pages=215–226|doi=10.1016/0024-3795(84)90189-7|doi-access=free}}
</ref> In fact, the matrix <math>B=A^{\dagger} A</math> is Hermitian and [[Definiteness of a matrix|positive semi-definite]], so there is a unitary matrix <math>V</math> such that <math>V^{\dagger} B V</math> is diagonal with non-negative real entries. Thus <math>C=V^{\mathrm T} A V</math> is complex symmetric with <math>C^{\dagger}C</math> real. Writing <math>C=X+iY</math> with <math>X</math> and <math>Y</math> real symmetric matrices, <math>C^{\dagger}C=X^2+Y^2+i(XY-YX)</math>. Thus <math>XY=YX</math>. Since <math>X</math> and <math>Y</math> commute, there is a real orthogonal matrix <math>W</math> such that both <math>W X W^{\mathrm T}</math> and <math>W Y W^{\mathrm T}</math> are diagonal. Setting <math>U=W V^{\mathrm T}</math> (a unitary matrix), the matrix <math>UAU^{\mathrm T}</math> is complex diagonal. Pre-multiplying <math>U</math> by a suitable diagonal unitary matrix (which preserves unitarity of <math>U</math>), the diagonal entries of <math>UAU^{\mathrm T}</math> can be made to be real and non-negative as desired. To construct this matrix, we express the diagonal matrix as <math>UAU^\mathrm T = \operatorname{diag}\left(r_1 e^{i\theta_1}, r_2 e^{i\theta_2}, \dots, r_n e^{i\theta_n}\right)</math>. The matrix we seek is simply given by <math>D = \operatorname{diag}\left(e^{-i\theta_1/2}, e^{-i\theta_2/2}, \dots, e^{-i\theta_n/2}\right)</math>. Then <math>DUAU^\mathrm T D = \operatorname{diag}\left(r_1, r_2, \dots, r_n\right)</math> is diagonal with real non-negative entries, as desired.
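The resulting factorization <math>A = U \Sigma U^{\mathrm T}</math> (the Autonne–Takagi factorization) can be computed from a singular value decomposition. The following numpy sketch is not from the article and assumes the generic case of distinct nonzero singular values, in which the SVD's right singular vectors satisfy <math>V = \bar W D</math> for a diagonal unitary <math>D</math>:

```python
# Takagi factorization of a complex symmetric matrix via the SVD,
# assuming distinct nonzero singular values (the generic case).
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M + M.T  # complex symmetric: A == A.T (not Hermitian)

# SVD: A = W diag(s) V^H.  Symmetry of A forces V = conj(W) D for a
# diagonal unitary D, so absorbing a square root of D^H into W gives
# A = U diag(s) U^T with U unitary.
W, s, Vh = np.linalg.svd(A)
D = W.T @ Vh.conj().T                    # diagonal unitary (up to rounding)
phases = np.angle(np.diag(D))
U = W @ np.diag(np.exp(-0.5j * phases))  # U = W D^{H/2}

assert np.allclose(U @ np.diag(s) @ U.T, A)    # Takagi form A = U Σ U^T
assert np.allclose(U.conj().T @ U, np.eye(4))  # U is unitary
```

Repeated or zero singular values require grouping the corresponding singular subspaces, which this sketch does not handle.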
== Decomposition ==
[[Cholesky decomposition]] states that every real positive-definite symmetric matrix <math>A</math> is a product of a lower-triangular matrix <math>L</math> and its transpose,
:<math>A = LL^\textsf{T}.</math>
If the matrix is symmetric indefinite, it may still be decomposed as <math>PAP^\textsf{T} = LDL^\textsf{T}</math> where <math>P</math> is a permutation matrix (arising from the need to [[pivot element|pivot]]), <math>L</math> is a lower unit triangular matrix, and <math>D</math> is a direct sum of symmetric <math>1 \times 1</math> and <math>2 \times 2</math> blocks; this is called the Bunch–Kaufman decomposition.<ref>{{cite book | author=G.H. Golub, C.F. van Loan. | title=Matrix Computations | publisher=The Johns Hopkins University Press, Baltimore, London | year=1996}}</ref>
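The positive-definite case can be checked directly with numpy; the matrix <code>A</code> below is a random example made positive definite by construction (a sketch, not part of the article):

```python
# Cholesky factorization of a symmetric positive-definite matrix.
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)  # symmetric positive definite by construction

L = np.linalg.cholesky(A)    # lower-triangular Cholesky factor

assert np.allclose(L, np.tril(L))  # L is lower triangular
assert np.allclose(L @ L.T, A)     # A = L L^T
```

For the indefinite case, <code>scipy.linalg.ldl</code> computes the pivoted <math>LDL^\textsf{T}</math> (Bunch–Kaufman style) factorization.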
:<math>A = Q \Lambda Q^\textsf{T}</math>
where <math>Q</math> is an orthogonal matrix (<math>Q Q^\textsf{T} = I</math>) and <math>\Lambda</math> is a diagonal matrix of the eigenvalues of <math>A</math>. In the special case that <math>A</math> is real symmetric, <math>Q</math> and <math>\Lambda</math> are also real. To see orthogonality, suppose <math>\mathbf x</math> and <math>\mathbf y</math> are eigenvectors corresponding to distinct eigenvalues <math>\lambda_1</math>, <math>\lambda_2</math>. Then
:<math>\lambda_1 \langle \mathbf x, \mathbf y \rangle = \langle A \mathbf x, \mathbf y \rangle = \langle \mathbf x, A \mathbf y \rangle = \lambda_2 \langle \mathbf x, \mathbf y \rangle.</math>
Since <math>\lambda_1</math> and <math>\lambda_2</math> are distinct, we have <math>\langle \mathbf x, \mathbf y \rangle = 0</math>.
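The spectral decomposition above can be computed with numpy's <code>eigh</code> routine for symmetric/Hermitian matrices; the random symmetric matrix below is only an illustration:

```python
# Spectral decomposition A = Q Λ Q^T of a real symmetric matrix.
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2  # real symmetric

# eigh returns eigenvalues in ascending order and an orthonormal
# matrix of eigenvectors (columns of Q).
lam, Q = np.linalg.eigh(A)

assert np.allclose(Q @ Q.T, np.eye(4))         # Q is orthogonal
assert np.allclose(Q @ np.diag(lam) @ Q.T, A)  # A = Q Λ Q^T
```

Because <code>A</code> is real symmetric, both <code>lam</code> and <code>Q</code> are real, matching the special case noted above.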
== Hessian ==