Triangular matrix

{{Short description|Special kind of square matrix}}
{{Citation style}}
{{distinguish|text=a [[triangular array]], a related concept}}
{{for|the rings|triangular matrix ring}}
{{Redirects here|Triangularization|the geometric process|Triangulation}}
In the [[mathematics|mathematical]] discipline of [[linear algebra]], a '''triangular matrix''' is a special kind of [[square matrix]]. A square matrix is called '''{{visible anchor|lower triangular}}''' if all the entries ''above'' the [[main diagonal]] are zero. Similarly, a square matrix is called '''{{visible anchor|upper triangular}}''' if all the entries ''below'' the [[main diagonal]] are zero.
 
Because matrix equations with triangular matrices are easier to solve, they are very important in [[numerical analysis]]. By the [[LU decomposition]] algorithm, an [[invertible matrix]] may be written as the product of a lower triangular matrix ''L'' and an upper triangular matrix ''U'' [[if and only if]] all its leading principal [[minor (linear algebra)|minors]] are non-zero.
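
The factorization can be computed with standard libraries. A minimal SciPy sketch (the matrix <code>A</code> is an arbitrary illustrative choice; note that <code>scipy.linalg.lu</code> pivots for numerical stability, so it returns <math>A = PLU</math> with a permutation matrix <math>P</math> even when a factorization <math>A = LU</math> without the permutation exists):

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import lu

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])

P, L, U = lu(A)  # A = P @ L @ U with L unit lower triangular, U upper triangular
print(np.allclose(A, P @ L @ U))   # True
print(np.allclose(L, np.tril(L)))  # True: L is lower triangular
print(np.allclose(U, np.triu(U)))  # True: U is upper triangular
</syntaxhighlight>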
 
== Description ==
===Examples===
 
The matrix

:<math>\begin{bmatrix}
1 & 0 & 0 \\
2 & 96 & 0 \\
4 & 9 & 69 \\
\end{bmatrix}</math>

is lower triangular: it is the lower triangular part of the non-symmetric matrix

:<math>\begin{bmatrix}
1 & 5 & 8 \\
2 & 96 & 9 \\
4 & 9 & 69 \\
\end{bmatrix}</math>

Similarly, the matrix
 
:<math>\begin{bmatrix}
1 & 4 & 1 \\
0 & 6 & 9 \\
0 & 0 & 1
\end{bmatrix}</math>
 
is upper triangular: it is the upper triangular part of the non-symmetric matrix
 
:<math>\begin{bmatrix}
1 & 4 & 1 \\
99 & 6 & 9 \\
40 & 88 & 1
\end{bmatrix}</math>
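
A quick NumPy illustration of these examples (<code>np.tril</code> and <code>np.triu</code> extract the lower and upper triangular parts of a matrix by zeroing the entries above and below the main diagonal, respectively):

<syntaxhighlight lang="python">
import numpy as np

B = np.array([[1,  5,  8],
              [2, 96,  9],
              [4,  9, 69]])

print(np.tril(B))  # the lower triangular matrix from the first example above
print(np.triu(B))  # an upper triangular matrix (zeros below the diagonal)
</syntaxhighlight>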
 
==Forward and back substitution==
<!-- Section is linked from several redirects (Back substitution etc.) – please update if you change the section title -->
 
A matrix equation in the form <math>\mathbf{L}\mathbf{x} = \mathbf{b}</math> or <math>\mathbf{U}\mathbf{x} = \mathbf{b}</math> is very easy to solve by an iterative process called '''forward substitution''' for lower triangular matrices and analogously '''back substitution''' for upper triangular matrices. The process is so called because for lower triangular matrices, one first computes <math>x_1</math>, then substitutes that ''forward'' into the ''next'' equation to solve for <math>x_2</math>, and repeats through to <math>x_n</math>. In an upper triangular matrix, one works ''backwards,'' first computing <math>x_n</math>, then substituting that ''back'' into the ''previous'' equation to solve for <math>x_{n-1}</math>, and repeating through <math>x_1</math>.
 
Notice that this does not require inverting the matrix.
 
===Forward substitution===
The matrix equation <math>\mathbf{L}\mathbf{x} = \mathbf{b}</math> can be written as a system of linear equations
 
:<math>\begin{matrix}
\ell_{1,1} x_1 & & & & & & & = & b_1 \\
\ell_{2,1} x_1 & + & \ell_{2,2} x_2 & & & & & = & b_2 \\
\vdots & & \vdots & & \ddots & & & & \vdots \\
\ell_{m,1} x_1 & + & \ell_{m,2} x_2 & + & \dotsb & + & \ell_{m,m} x_m & = & b_m \\
\end{matrix}</math>
 
Observe that the first equation (<math>\ell_{1,1} x_1 = b_1</math>) only involves <math>x_1</math>, and thus one can solve for <math>x_1</math> directly. The second equation only involves <math>x_1</math> and <math>x_2</math>, and thus can be solved once one substitutes in the already solved value for <math>x_1</math>. Continuing in this way, the <math>k</math>-th equation only involves <math>x_1,\dots,x_k</math>, and one can solve for <math>x_k</math> using the previously solved values for <math>x_1,\dots,x_{k-1}</math>. The resulting formulas are:
 
:<math>\begin{align}
x_1 &= \frac{b_1}{\ell_{1,1}}, \\
x_2 &= \frac{b_2 - \ell_{2,1} x_1}{\ell_{2,2}}, \\
    &\ \ \vdots \\
x_m &= \frac{b_m - \sum_{i=1}^{m-1} \ell_{m,i} x_i}{\ell_{m,m}}.
\end{align}</math>
 
A matrix equation with an upper triangular matrix '''U''' can be solved in an analogous way, only working backwards.
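
A minimal Python sketch of forward substitution as described above (the matrix <code>L</code> and vector <code>b</code> are arbitrary illustrative choices; in practice one would call a library routine such as <code>scipy.linalg.solve_triangular</code>):

<syntaxhighlight lang="python">
import numpy as np

def forward_substitution(L, b):
    """Solve L x = b for a square lower triangular L with non-zero diagonal."""
    m = len(b)
    x = np.zeros(m)
    for k in range(m):
        # x_k = (b_k - sum_{i<k} l_{k,i} x_i) / l_{k,k}
        x[k] = (b[k] - L[k, :k] @ x[:k]) / L[k, k]
    return x

L = np.array([[2.0, 0.0, 0.0],
              [3.0, 1.0, 0.0],
              [1.0, 4.0, 5.0]])
b = np.array([4.0, 7.0, 21.0])

x = forward_substitution(L, b)   # [2., 1., 3.]
print(np.allclose(L @ x, b))     # True
</syntaxhighlight>

Back substitution is the same loop run from the last row upwards; <code>scipy.linalg.solve_triangular(L, b, lower=True)</code> performs the same computation.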
 
===Applications===
Forward substitution is used in financial [[bootstrapping (finance)|bootstrapping]] to construct a [[yield curve]].

==Properties==
The [[determinant]] and [[Permanent (mathematics)|permanent]] of a triangular matrix equal the product of the diagonal entries, as can be checked by direct computation.
 
In fact more is true: the [[eigenvalue]]s of a triangular matrix are exactly its diagonal entries. Moreover, each eigenvalue occurs exactly ''k'' times on the diagonal, where ''k'' is its [[algebraic multiplicity]], that is, its [[Multiplicity of a root of a polynomial|multiplicity as a root]] of the [[characteristic polynomial]] <math>p_A(x)=\det(xI-A)</math> of ''A''. In other words, the characteristic polynomial of a triangular ''n''×''n'' matrix ''A'' is exactly
: <math>p_A(x) = (x-a_{11})(x-a_{22})\cdots(x-a_{nn})</math>,
that is, the unique degree ''n'' polynomial whose roots are the diagonal entries of ''A'' (with multiplicities).
To see this, observe that <math>xI-A</math> is also triangular and hence its determinant <math>\det(xI-A)</math> is the product of its diagonal entries <math>(x-a_{11})(x-a_{22})\cdots(x-a_{nn})</math>.<ref name="axler">{{Cite book |last=Axler |first=Sheldon Jay |title=Linear Algebra Done Right |date=1997 |publisher=Springer |isbn=0-387-22595-1 |edition=2nd |___location=New York |oclc=54850562 |pages=86&ndash;87, 169}}</ref>
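
These properties are easy to check numerically; a small NumPy example with an illustrative matrix:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[1.0, 4.0, 1.0],
              [0.0, 6.0, 9.0],
              [0.0, 0.0, 2.0]])

# the determinant equals the product of the diagonal entries (1 * 6 * 2 = 12)
print(np.isclose(np.linalg.det(A), np.prod(np.diag(A))))  # True

# the eigenvalues are exactly the diagonal entries
print(np.sort(np.linalg.eigvals(A)))  # [1. 2. 6.]
</syntaxhighlight>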
 
==Special forms==
=== Unitriangular matrix ===
 
If the entries on the [[main diagonal]] of a (upper or lower) triangular matrix are all 1, the matrix is called (upper or lower) '''unitriangular'''.
 
Other names used for these matrices are '''unit''' (upper or lower) '''triangular''', or very rarely '''normed''' (upper or lower) '''triangular'''. However, a ''unit'' triangular matrix is not the same as the ''[[identity matrix|unit matrix]]'', and a ''normed'' triangular matrix has nothing to do with the notion of [[matrix norm]].
 
All finite unitriangular matrices are [[unipotent]].
 
=== Strictly triangular matrix ===
 
If all of the entries on the main diagonal of a (upper or lower) triangular matrix are also 0, the matrix is called '''strictly''' (upper or lower) '''triangular'''.

All finite strictly triangular matrices are [[nilpotent matrix|nilpotent]] of index at most ''n'' as a consequence of the [[Cayley–Hamilton theorem]].
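
A quick numerical check with a strictly upper triangular <math>3 \times 3</math> matrix (arbitrary entries), whose third power vanishes:

<syntaxhighlight lang="python">
import numpy as np

N = np.array([[0, 1, 2],
              [0, 0, 3],
              [0, 0, 0]])

print(np.linalg.matrix_power(N, 2))  # still strictly triangular, one diagonal higher
print(np.linalg.matrix_power(N, 3))  # the zero matrix: N is nilpotent of index <= 3
</syntaxhighlight>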
 
=== Atomic triangular matrix ===
{{Main|Frobenius matrix}}
An '''atomic''' (upper or lower) '''triangular matrix''' is a special form of unitriangular matrix, where all of the [[off-diagonal element]]s are zero, except for the entries in a single column. Such a matrix is also called a '''Frobenius matrix''', a '''Gauss matrix''', or a '''Gauss transformation matrix'''.
 
=== Block triangular matrix ===
{{Main|Block matrix}}
A block triangular matrix is a [[block matrix]] (partitioned matrix) that is a triangular matrix.
 
====Upper block triangular====
A matrix <math>A</math> is '''upper block triangular''' if
 
:<math>A = \begin{bmatrix}
A_{11} & A_{12} & \cdots & A_{1k} \\
0 & A_{22} & \cdots & A_{2k} \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & A_{kk}
\end{bmatrix}</math>,
 
where <math>A_{ij} \in \mathbb{F}^{n_i \times n_j}</math> for all <math>i, j = 1, \ldots, k</math>.<ref name="bernstein2009">{{Cite book |last=Bernstein |first=Dennis S. |title=Matrix mathematics: theory, facts, and formulas |publisher=Princeton University Press |year=2009 |isbn=978-0-691-14039-1 |edition=2 |___location=Princeton, NJ |pages=168 |language=en}}</ref>
 
====Lower block triangular====
A matrix <math>A</math> is '''lower block triangular''' if
 
:<math>A = \begin{bmatrix}
A_{11} & 0 & \cdots & 0 \\
A_{21} & A_{22} & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
A_{k1} & A_{k2} & \cdots & A_{kk}
\end{bmatrix}</math>,
 
where <math>A_{ij} \in \mathbb{F}^{n_i \times n_j}</math> for all <math>i, j = 1, \ldots, k</math>.<ref name="bernstein2009" />
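
A block triangular matrix can be assembled from its blocks; a minimal NumPy sketch (the blocks <code>A11</code>, <code>A12</code>, <code>A22</code> are arbitrary illustrative choices, with <math>k = 2</math>, <math>n_1 = 2</math>, <math>n_2 = 1</math>):

<syntaxhighlight lang="python">
import numpy as np

A11 = np.array([[1.0, 2.0],
                [3.0, 4.0]])
A12 = np.array([[5.0],
                [6.0]])
A22 = np.array([[7.0]])

# upper block triangular: a zero block below the diagonal blocks
A = np.block([[A11,              A12],
              [np.zeros((1, 2)), A22]])
print(A)

# the determinant factors over the diagonal blocks
print(np.isclose(np.linalg.det(A), np.linalg.det(A11) * np.linalg.det(A22)))  # True
</syntaxhighlight>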
 
==Triangularisability{{Anchor|Triangularizability}}==
A matrix that is [[similar matrix|similar]] to a triangular matrix is referred to as '''triangularizable'''. Abstractly, this is equivalent to stabilizing a [[flag (linear algebra)|flag]]: upper triangular matrices are precisely those that preserve the [[standard flag]], which is given by the standard ordered basis <math>(e_1,\ldots,e_n)</math> and the resulting flag <math>0 < \left\langle e_1\right\rangle < \left\langle e_1,e_2\right\rangle < \cdots < \left\langle e_1,\ldots,e_n \right\rangle = K^n.</math> All flags are conjugate (as the [[general linear group]] acts transitively on bases), so any matrix that stabilizes a flag is similar to one that stabilizes the standard flag.
 
Any complex square matrix is triangularizable.<ref name="axler"/> In fact, a matrix ''A'' over a [[field (mathematics)|field]] containing all of the eigenvalues of ''A'' (for example, any matrix over an [[algebraically closed field]]) is similar to a triangular matrix. This can be proven by using induction on the fact that ''A'' has an eigenvector, by taking the quotient space by the eigenvector and inducting to show that ''A'' stabilizes a flag, and is thus triangularizable with respect to a basis for that flag.
 
A more precise statement is given by the [[Jordan normal form]] theorem, which states that in this situation, ''A'' is similar to an upper triangular matrix of a very particular form. The simpler triangularization result is often sufficient, however, and is in any case used in proving the Jordan normal form theorem.<ref name="axler"/><ref name="herstein">{{Cite book |last=Herstein |first=I. N. |title=Topics in Algebra |date=1975 |publisher=Wiley |isbn=0-471-01090-1 |edition=2nd |___location=New York |oclc=3307396 |pages=285&ndash;290}}</ref>
 
In the case of complex matrices, it is possible to say more about triangularization, namely, that any square matrix ''A'' has a [[Schur decomposition]]. This means that ''A'' is unitarily equivalent (i.e. similar, using a [[unitary matrix]] as change of basis) to an upper triangular matrix; this follows by taking an Hermitian basis for the flag.
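
The Schur decomposition is available in standard libraries; a minimal SciPy sketch (random matrix for illustration; <code>output='complex'</code> requests the triangular complex form rather than the real quasi-triangular form):

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

T, Z = schur(A, output='complex')  # A = Z T Z*, with Z unitary and T upper triangular
print(np.allclose(A, Z @ T @ Z.conj().T))  # True
print(np.allclose(T, np.triu(T)))          # True
</syntaxhighlight>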

===Simultaneous triangularisability===
The basic result is that (over an algebraically closed field), the [[commuting matrices]] <math>A,B</math> or more generally <math>A_1,\ldots,A_k</math> are simultaneously triangularizable. This can be proven by first showing that commuting matrices have a common eigenvector, and then inducting on dimension as before. This was proven by Frobenius, starting in 1878 for a commuting pair, as discussed at [[commuting matrices]]. As for a single matrix, over the complex numbers these can be triangularized by unitary matrices.
 
The fact that commuting matrices have a common eigenvector can be interpreted as a result of [[Hilbert's Nullstellensatz]]: commuting matrices form a commutative algebra <math>K[A_1,\ldots,A_k]</math> over <math>K[x_1,\ldots,x_k]</math> which can be interpreted as a variety in ''k''-dimensional affine space, and the existence of a (common) eigenvalue (and hence a common eigenvector) corresponds to this variety having a point (being non-empty), which is the content of the (weak) Nullstellensatz.{{Citation needed|reason=The existence of a common eigenvector is not clear, see https://mathoverflow.net/questions/43298/commuting-matrices-and-the-weak-nullstellensatz|date=March 2021}} In algebraic terms, these operators correspond to an [[algebra representation]] of the polynomial algebra in ''k'' variables.
 
This is generalized by [[Lie's theorem]], which shows that any representation of a [[solvable Lie algebra]] is simultaneously upper triangularizable, the case of commuting matrices being the [[abelian Lie algebra]] case, abelian being a fortiori solvable.
 
More generally and precisely, a set of matrices <math>A_1,\ldots,A_k</math> is simultaneously triangularisable if and only if the matrix <math>p(A_1,\ldots,A_k)[A_i,A_j]</math> is [[nilpotent]] for all polynomials ''p'' in ''k'' ''non''-commuting variables, where <math>[A_i,A_j]</math> is the [[commutator]]; for commuting <math>A_i</math> the commutator vanishes, so this holds. This was proven by Drazin, Dungey, and Gruenberg in 1951;<ref>{{Cite journal |last1=Drazin |first1=M. P. |last2=Dungey |first2=J. W. |last3=Gruenberg |first3=K. W. |date=1951 |title=Some Theorems on Commutative Matrices |url=http://jlms.oxfordjournals.org/cgi/pdf_extract/s1-26/3/221 |journal=Journal of the London Mathematical Society |language=en |volume=26 |issue=3 |pages=221–228 |doi=10.1112/jlms/s1-26.3.221}}</ref> a brief proof is given by Prasolov in 1994.<ref>{{Cite book |last=Prasolov |first=V. V. |title=Problems and Theorems in Linear Algebra |date=1994 |publisher=American Mathematical Society |others=Simeon Ivanov |isbn=9780821802366 |___location=Providence, R.I. |oclc=30076024 |pages=178&ndash;179}}</ref> One direction is clear: if the matrices are simultaneously triangularisable, then <math>[A_i, A_j]</math> is ''strictly'' upper triangularizable (hence nilpotent), which is preserved by multiplication by any <math>A_k</math> or combination thereof – it will still have 0s on the diagonal in the triangularizing basis.
 
== Algebras of triangular matrices ==
Upper triangularity is preserved by many operations:
* The sum of two upper triangular matrices is upper triangular.
* The product of two upper triangular matrices is upper triangular.
* The [[inverse matrix|inverse]] of an upper triangular matrix, if it exists, is upper triangular.
* The product of an upper triangular matrix and a scalar is upper triangular.
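
These closure properties can be illustrated numerically; a small sketch with arbitrary upper triangular matrices:

<syntaxhighlight lang="python">
import numpy as np

def is_upper(M):
    return np.allclose(M, np.triu(M))

U1 = np.triu(np.arange(1.0, 10.0).reshape(3, 3))  # diagonal 1, 5, 9 (invertible)
U2 = np.triu(np.ones((3, 3)))

print(is_upper(U1 + U2))            # sum
print(is_upper(U1 @ U2))            # product
print(is_upper(np.linalg.inv(U1)))  # inverse
print(is_upper(2.5 * U1))           # scalar multiple
</syntaxhighlight>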
 
===Borel subgroups and Borel subalgebras===
{{main|Borel subgroup|Borel subalgebra}}
The set of invertible triangular matrices of a given kind (upper or lower) forms a [[group (mathematics)|group]], indeed a [[Lie group]], which is a subgroup of the [[general linear group]] of all invertible matrices. A triangular matrix is invertible precisely when its diagonal entries are invertible (non-zero).
 
Over the real numbers, this group is disconnected, having <math>2^n</math> components according to whether each diagonal entry is positive or negative. The identity component is the group of invertible triangular matrices with positive entries on the diagonal, and the group of all invertible triangular matrices is a [[semidirect product]] of this group and the group of [[Diagonal matrix|diagonal matrices]] with <math>\pm 1</math> on the diagonal, corresponding to the components.
 
The [[Lie algebra]] of the Lie group of invertible upper triangular matrices is the set of all upper triangular matrices, not necessarily invertible, and is a [[solvable Lie algebra]]. These are, respectively, the standard [[Borel subgroup]] ''B'' of the Lie group GL<sub>''n''</sub> and the standard [[Borel subalgebra]] <math>\mathfrak{b}</math> of the Lie algebra gl<sub>''n''</sub>.
 
The upper triangular matrices are precisely those that stabilize the [[Flag (linear algebra)|standard flag]]. The invertible ones among them form a subgroup of the general linear group, whose conjugate subgroups are those defined as the stabilizer of some (other) complete flag. These subgroups are [[Borel subgroup]]s. The group of invertible lower triangular matrices is such a subgroup, since it is the stabilizer of the standard flag associated to the standard basis in reverse order.
 
The stabilizer of a partial flag obtained by forgetting some parts of the standard flag can be described as a set of block upper triangular matrices (but its elements are ''not'' all triangular matrices). The conjugates of such a group are the subgroups defined as the stabilizer of some partial flag. These subgroups are called [[parabolic subgroup]]s.
 
=== Examples ===
The group of 2×2 upper unitriangular matrices is [[isomorphic]] to the [[Abelian group|additive group]] of the field of scalars; in the case of complex numbers it corresponds to a group formed of parabolic [[Möbius transformation]]s; the 3×3 upper unitriangular matrices form the [[Heisenberg group]].
 
== See also ==
== References ==
{{reflist}}
 
{{Matrix classes}}
{{Numerical linear algebra}}
 
{{DEFAULTSORT:Triangular Matrix}}
[[Category:Numerical linear algebra]]
[[Category:Matrices (mathematics)]]