Diagonal matrix

{{Use American English|date = March 2019}}
{{Short description|Matrix whose only nonzero elements are on its main diagonal}}
In [[linear algebra]], a '''diagonal matrix''' is a [[matrix (mathematics)|matrix]] in which the entries outside the [[main diagonal]] are all zero; the term usually refers to [[square matrices]]. An example of a 2×2 diagonal matrix is <math>\left[\begin{smallmatrix}
3 & 0 \\
0 & 2 \end{smallmatrix}\right]</math>, while an example of a 3×3 diagonal matrix is <math>
\left[\begin{smallmatrix}
6 & 0 & 0 \\
==Definition==
 
As stated above, a diagonal matrix is a matrix in which all off-diagonal entries are zero. That is, the matrix {{nowrap|1=''D'' = (''d''<sub>''i'',''j''</sub>)}} with ''n'' columns and ''n'' rows is diagonal if
 
:<math>\forall i,j \in \{1, 2, \ldots, n\}, i \ne j \implies d_{i,j} = 0.</math>
 
However, the main diagonal entries are unrestricted.
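The definition can be checked numerically; the following is a minimal sketch using NumPy, where the helper <code>is_diagonal</code> is an illustrative name, not a standard library function:

```python
import numpy as np

def is_diagonal(D):
    """Return True if every entry off the main diagonal of D is zero."""
    # np.diag(np.diag(D)) keeps only the main diagonal, zeroing the rest.
    return np.array_equal(D, np.diag(np.diag(D)))

D = np.diag([3, 2])          # the 2x2 example from the lead section
A = np.array([[3, 1],
              [0, 2]])       # upper triangular but not diagonal

print(is_diagonal(D))  # True
print(is_diagonal(A))  # False
```

Note that the diagonal entries themselves may be zero; only the off-diagonal entries are constrained.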
 
== Vector operations ==
Multiplying a vector by a diagonal matrix multiplies each of the terms by the corresponding diagonal entry. Given a diagonal matrix <math>D = \operatorname{diag}(a_1, \dots, a_n)</math> and a vector <math>\mathbf{v} = \begin{bmatrix}x_1 & \dots & x_n\end{bmatrix}^\textsf{T}</math>, the product is:
:<math>D\mathbf{v} = \operatorname{diag}(a_1, \dots, a_n)\begin{bmatrix}x_1 \\ \vdots \\ x_n\end{bmatrix} =
\begin{bmatrix}
a_1 \\
 & \ddots \\
 & & a_n
\end{bmatrix}
\begin{bmatrix}x_1 \\ \vdots \\ x_n\end{bmatrix} =
\begin{bmatrix}a_1 x_1 \\ \vdots \\ a_n x_n\end{bmatrix}.
</math>
 
This can be expressed more compactly by using a vector instead of a diagonal matrix, <math>\mathbf{d} = \begin{bmatrix}a_1 & \dots & a_n\end{bmatrix}^\textsf{T}</math>, and taking the [[Hadamard product (matrices)|Hadamard product]] of the vectors (entrywise product), denoted <math>\mathbf{d} \circ \mathbf{v}</math>:
 
:<math>D\mathbf{v} = \mathbf{d} \circ \mathbf{v} =
\begin{bmatrix}a_1 \\ \vdots \\ a_n\end{bmatrix} \circ \begin{bmatrix}x_1 \\ \vdots \\ x_n\end{bmatrix} =
\begin{bmatrix}a_1 x_1 \\ \vdots \\ a_n x_n\end{bmatrix}.
</math>
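The identity <math>D\mathbf{v} = \mathbf{d} \circ \mathbf{v}</math> can be verified numerically; a minimal NumPy sketch with illustrative values:

```python
import numpy as np

a = np.array([2.0, -1.0, 5.0])   # diagonal entries a_1, ..., a_n
v = np.array([1.0, 4.0, 3.0])    # vector entries x_1, ..., x_n

Dv = np.diag(a) @ v   # full matrix-vector product with diag(a)
hadamard = a * v      # entrywise (Hadamard) product

print(np.allclose(Dv, hadamard))  # True: both give [2., -4., 15.]
```

The entrywise form avoids materializing the n×n matrix, reducing the cost of the product from O(n²) to O(n).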
{{Main|Transformation_matrix#Finding the matrix of a transformation|Eigenvalues and eigenvectors|l1=Finding the matrix of a transformation}}
 
As explained in [[transformation matrix#Finding the matrix of a transformation|determining coefficients of operator matrix]], there is a special basis, '''e'''<sub>1</sub>, ..., '''e'''<sub>''n''</sub>, for which the matrix <math>A</math> takes the diagonal form. Hence, in the defining equation <math display="inline">A \mathbf e_j = \sum a_{i,j} \mathbf e_i</math>, all coefficients <math>a_{i,j} </math> with ''i'' ≠ ''j'' are zero, leaving only one term per sum. The surviving diagonal elements, <math>a_{i,i}</math>, are known as '''eigenvalues''' and designated with <math>\lambda_i</math> in the equation, which reduces to <math>A \mathbf e_i = \lambda_i \mathbf e_i</math>. The resulting equation is known as the '''eigenvalue equation'''<ref>{{cite book |last=Nearing |first=James |year=2010 |title=Mathematical Tools for Physics |url=http://www.physics.miami.edu/nearing/mathmethods |chapter=Chapter 7.9: Eigenvalues and Eigenvectors |chapter-url= http://www.physics.miami.edu/~nearing/mathmethods/operators.pdf |access-date=January 1, 2012|isbn=048648212X}}</ref> and used to derive the [[characteristic polynomial]] and, further, [[eigenvalues and eigenvectors]].
 
In other words, the [[eigenvalue]]s of {{nowrap|diag(''λ''<sub>1</sub>, ..., ''λ''<sub>''n''</sub>)}} are ''λ''<sub>1</sub>, ..., ''λ''<sub>''n''</sub> with associated [[eigenvectors]] of '''e'''<sub>1</sub>, ..., '''e'''<sub>''n''</sub>.
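This property can be illustrated numerically; a minimal NumPy sketch with an illustrative 3×3 example:

```python
import numpy as np

# The eigenvalues of diag(6, 3, 2) are 6, 3, 2, and the standard basis
# vectors e_1, e_2, e_3 are corresponding eigenvectors: D e_i = lambda_i e_i.
D = np.diag([6.0, 3.0, 2.0])

eigvals = np.linalg.eigvals(D)
print(np.allclose(sorted(eigvals), [2.0, 3.0, 6.0]))  # True

e1 = np.array([1.0, 0.0, 0.0])
print(np.allclose(D @ e1, 6.0 * e1))  # True
```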
 
== Properties ==