Diagonal matrix: Difference between revisions

==Definition==
 
As stated above, a diagonal matrix is a matrix in which all off-diagonal entries are zero. That is, the matrix {{math|1=''D'' = (''d''<sub>''i'',''j''</sub>)}} with {{mvar|n}} columns and {{mvar|n}} rows is diagonal if
<math display="block">\forall i,j \in \{1, 2, \ldots, n\}, i \ne j \implies d_{i,j} = 0.</math>
 
However, the main diagonal entries are unrestricted.
 
The term ''diagonal matrix'' may sometimes refer to a '''{{visible anchor|rectangular diagonal matrix}}''', which is an {{mvar|m}}-by-{{mvar|n}} matrix with all the entries not of the form {{math|''d''<sub>''i'',''i''</sub>}} being zero. For example:
<math display=block>\begin{bmatrix}
1 & 0 & 0\\
0 & 4 & 0\\
0 & 0 & -3\\
0 & 0 & 0\\
\end{bmatrix} \quad \text{or} \quad \begin{bmatrix}
1 & 0 & 0 & 0 & 0\\
0 & 4 & 0 & 0 & 0\\
0 & 0 & -3 & 0 & 0
\end{bmatrix}</math>
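The definition can be checked directly. The following is an illustrative sketch in plain Python (not part of any library), testing that every off-diagonal entry of a possibly rectangular matrix is zero:

```python
# Sketch: a (possibly rectangular) matrix is diagonal exactly when every
# entry off the main diagonal, i.e. every entry with i != j, is zero.

def is_diagonal(M):
    return all(M[i][j] == 0
               for i in range(len(M))
               for j in range(len(M[0]))
               if i != j)

print(is_diagonal([[1, 0, 0], [0, 4, 0], [0, 0, -3], [0, 0, 0]]))  # True
print(is_diagonal([[1, 2], [0, 3]]))                               # False
```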
 
 
== Matrix operations ==
The operations of matrix addition and [[matrix multiplication]] are especially simple for diagonal matrices. Write {{math|diag(''a''<sub>1</sub>, ..., ''a''<sub>''n''</sub>)}} for a diagonal matrix whose diagonal entries starting in the upper left corner are {{math|''a''<sub>1</sub>, ..., ''a''<sub>''n''</sub>}}. Then, for [[matrix addition|addition]], we have
 
<math display=block>\operatorname{diag}(a_1,\, \ldots,\, a_n) + \operatorname{diag}(b_1,\, \ldots,\, b_n) = \operatorname{diag}(a_1 + b_1,\, \ldots,\, a_n + b_n)</math>
 
and for [[matrix multiplication]],
 
<math display=block>\operatorname{diag}(a_1,\, \ldots,\, a_n) \operatorname{diag}(b_1,\, \ldots,\, b_n) = \operatorname{diag}(a_1 b_1,\, \ldots,\, a_n b_n).</math>
 
The diagonal matrix {{math|diag(''a''<sub>1</sub>, ..., ''a''<sub>''n''</sub>)}} is [[invertible matrix|invertible]] [[if and only if]] the entries {{math|''a''<sub>1</sub>, ..., ''a''<sub>''n''</sub>}} are all nonzero. In this case, we have
 
<math display=block>\operatorname{diag}(a_1,\, \ldots,\, a_n)^{-1} = \operatorname{diag}(a_1^{-1},\, \ldots,\, a_n^{-1}).</math>
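Because a diagonal matrix is determined by its diagonal entries alone, addition, multiplication, and inversion all reduce to entrywise operations on those entries. A minimal plain-Python sketch (representing each diagonal matrix by the list of its diagonal entries):

```python
# Sketch: diag(a) stored as the list a of diagonal entries.

def diag_add(a, b):
    """diag(a) + diag(b) = diag(a_1 + b_1, ..., a_n + b_n)."""
    return [x + y for x, y in zip(a, b)]

def diag_mul(a, b):
    """diag(a) diag(b) = diag(a_1 b_1, ..., a_n b_n)."""
    return [x * y for x, y in zip(a, b)]

def diag_inv(a):
    """diag(a)^(-1) = diag(1/a_1, ..., 1/a_n); needs all entries nonzero."""
    if any(x == 0 for x in a):
        raise ValueError("diagonal matrix is singular")
    return [1 / x for x in a]

print(diag_add([1, 2, 3], [4, 5, 6]))  # [5, 7, 9]
print(diag_mul([1, 2, 3], [4, 5, 6]))  # [4, 10, 18]
print(diag_inv([1, 2, 4]))             # [1.0, 0.5, 0.25]
```

Each operation costs only {{math|''O''(''n'')}} work, versus {{math|''O''(''n''<sup>2</sup>)}} or more for general matrices.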
 
In particular, the diagonal matrices form a [[subring]] of the ring of all {{mvar|n}}-by-{{mvar|n}} matrices.
 
Multiplying an {{mvar|n}}-by-{{mvar|n}} matrix {{mvar|A}} from the ''left'' with {{math|diag(''a''<sub>1</sub>, ..., ''a''<sub>''n''</sub>)}} amounts to multiplying the {{mvar|i}}-th ''row'' of {{mvar|A}} by {{math|''a''<sub>''i''</sub>}} for all {{mvar|i}}; multiplying the matrix {{mvar|A}} from the ''right'' with {{math|diag(''a''<sub>1</sub>, ..., ''a''<sub>''n''</sub>)}} amounts to multiplying the {{mvar|i}}-th ''column'' of {{mvar|A}} by {{math|''a''<sub>''i''</sub>}} for all {{mvar|i}}.
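This row/column scaling can be sketched directly, again in plain Python with a matrix stored as a list of rows:

```python
# Sketch: diag(a) A scales row i of A by a[i];
#         A diag(a) scales column j of A by a[j].

def scale_rows(a, A):
    """Left multiplication diag(a) A."""
    return [[a[i] * x for x in row] for i, row in enumerate(A)]

def scale_cols(a, A):
    """Right multiplication A diag(a)."""
    return [[row[j] * a[j] for j in range(len(row))] for row in A]

A = [[1, 1], [1, 1]]
print(scale_rows([2, 3], A))  # [[2, 2], [3, 3]]  (rows scaled)
print(scale_cols([2, 3], A))  # [[2, 3], [2, 3]]  (columns scaled)
```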
 
== Operator matrix in eigenbasis ==
** A matrix is diagonal if and only if it is both [[triangular matrix|upper-]] and [[triangular matrix|lower-triangular]].
** A diagonal matrix is [[symmetric matrix|symmetric]].
* The [[identity matrix]] {{math|''I''<sub>''n''</sub>}} and [[zero matrix]] are diagonal.
* A 1×1 matrix is always diagonal.
* The square of a 2×2 matrix with zero [[trace (linear algebra)|trace]] is always diagonal.
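The last property follows from the [[Cayley–Hamilton theorem]]: a 2×2 matrix satisfies {{math|1=''A''<sup>2</sup> = tr(''A'')''A'' − det(''A'')''I''}}, so zero trace forces {{math|1=''A''<sup>2</sup> = −det(''A'')''I''}}, a scalar (hence diagonal) matrix. A quick numerical check in plain Python:

```python
# Sketch: for a 2x2 matrix with tr(A) = 0, Cayley-Hamilton gives
# A^2 = tr(A) A - det(A) I = -det(A) I, which is diagonal.

def square_2x2(A):
    (a, b), (c, d) = A
    return [[a*a + b*c, a*b + b*d],
            [c*a + d*c, c*b + d*d]]

A = [[3, 5], [2, -3]]    # trace = 3 + (-3) = 0, det = -9 - 10 = -19
print(square_2x2(A))     # [[19, 0], [0, 19]], i.e. -det(A) * I
```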
Diagonal matrices occur in many areas of linear algebra. Because of the simple description of the matrix operations and eigenvalues/eigenvectors given above, it is typically desirable to represent a given matrix or [[linear operator|linear map]] by a diagonal matrix.
 
In fact, a given {{mvar|n}}-by-{{mvar|n}} matrix {{mvar|A}} is [[similar matrix|similar]] to a diagonal matrix (meaning that there is a matrix {{mvar|X}} such that {{math|''X''<sup>−1</sup>''AX''}} is diagonal) if and only if it has {{mvar|n}} [[linearly independent]] eigenvectors. Such matrices are said to be [[diagonalizable matrix|diagonalizable]].
 
Over the [[field (mathematics)|field]] of [[real number|real]] or [[complex number|complex]] numbers, more is true. The [[spectral theorem]] says that every [[normal matrix]] is [[matrix similarity|unitarily similar]] to a diagonal matrix (if {{math|1=''AA''<sup>∗</sup> = ''A''<sup>∗</sup>''A''}} then there exists a [[unitary matrix]] {{mvar|U}} such that {{math|''UAU''<sup>∗</sup>}} is diagonal). Furthermore, the [[singular value decomposition]] implies that for any matrix {{mvar|A}}, there exist unitary matrices {{mvar|U}} and {{mvar|V}} such that {{math|''U''<sup>∗</sup>''AV''}} is diagonal with nonnegative entries.
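Diagonalization can be verified concretely. In the plain-Python sketch below, the symmetric matrix {{mvar|A}} (chosen for illustration) has eigenvectors {{math|(1, −1)}} and {{math|(1, 1)}} with eigenvalues 1 and 3; placing the eigenvectors as columns of {{mvar|X}} makes {{math|''X''<sup>−1</sup>''AX''}} diagonal:

```python
# Sketch: A has n = 2 linearly independent eigenvectors, so it is similar
# to a diagonal matrix: X^(-1) A X = diag(eigenvalues).

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1], [1, 2]]
X = [[1, 1], [-1, 1]]              # columns are eigenvectors of A
X_inv = [[0.5, -0.5], [0.5, 0.5]]  # inverse of X, computed by hand

D = matmul(X_inv, matmul(A, X))
print(D)  # [[1.0, 0.0], [0.0, 3.0]] -- the eigenvalues on the diagonal
```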