{{Short description|Study of matrices and their algebraic properties}}
In [[mathematics]], particularly in [[linear algebra]] and applications, '''matrix analysis''' is the study of [[matrix (mathematics)|matrices]] and their algebraic properties.<ref>{{cite book|title=Matrix Analysis|author=R. A. Horn, C. R. Johnson|year=2012|publisher=Cambridge University Press|isbn=978-052-183-940-2|edition=2nd|url=https://books.google.com/books?id=5I5AYeeh0JUC&q=matrix+analysis}}
</ref> Some particular topics out of many include: operations defined on matrices (such as [[matrix addition]], [[matrix multiplication]] and operations derived from these), functions of matrices (such as [[matrix exponentiation]], the [[matrix logarithm]], and even [[sine]]s and cosines of matrices), and the [[eigenvalue]]s of matrices ([[eigendecomposition of a matrix]], [[eigenvalue perturbation]] theory).<ref>{{cite book|title=Functions of Matrices: Theory and Computation|author=N. J. Higham|year=2000 |publisher=SIAM|isbn=089-871-777-9|url=https://books.google.com/books?id=S6gpNn1JmbgC&q=matrix+functions}}
</ref>
 
==Matrix spaces==
 
The set of all ''m''&thinsp;×&thinsp;''n'' matrices over a [[field (mathematics)|field]] ''F'', denoted in this article ''M''<sub>''mn''</sub>(''F''), forms a [[vector space]]. Examples of ''F'' include the set of [[rational number]]s <math>\mathbb{Q}</math>, the [[real number]]s <math>\mathbb{R}</math>, and the set of [[complex number]]s <math>\mathbb{C}</math>. The spaces ''M''<sub>''mn''</sub>(''F'') and ''M''<sub>''pq''</sub>(''F'') are different spaces if ''m'' and ''p'' are unequal or ''n'' and ''q'' are unequal; for instance ''M''<sub>32</sub>(''F'') ≠ ''M''<sub>23</sub>(''F''). Two ''m''&thinsp;×&thinsp;''n'' matrices '''A''' and '''B''' in ''M''<sub>''mn''</sub>(''F'') can be added together to form another matrix in the space ''M''<sub>''mn''</sub>(''F''):
 
:<math>\mathbf{A},\mathbf{B} \in M_{mn}(F)\,,\quad \mathbf{A} + \mathbf{B} \in M_{mn}(F) </math>
 
and multiplied by a scalar ''α'' in ''F'', to obtain another matrix in ''M''<sub>''mn''</sub>(''F''):
 
:<math>\alpha \in F \,,\quad \alpha \mathbf{A} \in M_{mn}(F) </math>
 
Combining these two properties, a [[linear combination]] of matrices '''A''' and '''B''' in ''M''<sub>''mn''</sub>(''F'') is another matrix in ''M''<sub>''mn''</sub>(''F''):
 
:<math>\alpha \mathbf{A} + \beta\mathbf{B} \in M_{mn}(F) </math>
 
where ''α'' and ''β'' are numbers in ''F''.
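
As a purely illustrative example (the particular matrices and scalars here are chosen arbitrarily), taking ''α'' = 2 and ''β'' = 3 in <math>M_{22}(\mathbb{R})</math>:

:<math>2\begin{pmatrix}1&0\\2&1\end{pmatrix}+3\begin{pmatrix}0&1\\1&-1\end{pmatrix}=\begin{pmatrix}2&3\\7&-1\end{pmatrix}\,,</math>

which is again a matrix in <math>M_{22}(\mathbb{R})</math>.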
 
Any matrix can be expressed as a linear combination of basis matrices, which play the role of the [[basis vector]]s for the matrix space. For example, for the set of 2&thinsp;×&thinsp;2 matrices over the field of real numbers, <math>M_{22}(\mathbb{R})</math>, one legitimate basis set of matrices is:
 
:<math>\begin{pmatrix}1&0\\0&0\end{pmatrix}\,,\quad
\begin{pmatrix}0&1\\0&0\end{pmatrix}\,,\quad
\begin{pmatrix}0&0\\1&0\end{pmatrix}\,,\quad
\begin{pmatrix}0&0\\0&1\end{pmatrix}\,,</math>
 
because any 2&thinsp;×&thinsp;2 matrix can be expressed as:
 
:<math>\begin{pmatrix}a&b\\c&d\end{pmatrix}=a \begin{pmatrix}1&0\\0&0\end{pmatrix}
+b\begin{pmatrix}0&1\\0&0\end{pmatrix}
+c\begin{pmatrix}0&0\\1&0\end{pmatrix}
+d\begin{pmatrix}0&0\\0&1\end{pmatrix}\,,</math>
 
where ''a'', ''b'', ''c'', ''d'' are all real numbers. This idea applies to other fields and matrices of higher dimensions.
 
==Determinants==
 
{{main|Determinant}}
 
The '''determinant''' of a [[square matrix]] is an important property. The determinant indicates whether a matrix is [[invertible]]: the [[inverse matrix|inverse of a matrix]] exists if and only if the determinant is nonzero. Determinants are used for finding the eigenvalues of matrices (see below), and for solving a [[system of linear equations]] (see [[Cramer's rule]]).
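
For example, the determinant of a 2&thinsp;×&thinsp;2 matrix is ''ad''&thinsp;−&thinsp;''bc''; with arbitrarily chosen entries for illustration:

:<math>\det\begin{pmatrix}1&2\\3&4\end{pmatrix}=1\cdot4-2\cdot3=-2\neq 0\,,</math>

so this matrix is invertible, whereas

:<math>\det\begin{pmatrix}1&2\\2&4\end{pmatrix}=1\cdot4-2\cdot2=0\,,</math>

so that matrix has no inverse.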
 
==Eigenvalues and eigenvectors of matrices==
 
{{main|Eigenvalues and eigenvectors}}
 
===Definitions===
 
An '''eigenvector''' '''x''' (a nonzero vector) and corresponding '''eigenvalue''' ''λ'' of an ''n''&thinsp;×&thinsp;''n'' matrix '''A''' are defined by the relation:
 
:<math>\mathbf{A}\mathbf{x} = \lambda \mathbf{x}</math>
 
In words, [[matrix multiplication]] of the eigenvector '''x''' (here an ''n''-dimensional [[column matrix]]) by '''A''' gives the same result as multiplying the eigenvector by the eigenvalue. For an ''n''&thinsp;×&thinsp;''n'' matrix over an algebraically closed field such as <math>\mathbb{C}</math>, there are ''n'' eigenvalues, counted with multiplicity. The eigenvalues are the [[root of a polynomial|roots]] of the [[characteristic polynomial]]:
 
:<math>p_\mathbf{A}(\lambda) = \det(\mathbf{A} - \lambda \mathbf{I}) = 0</math>
 
where '''I''' is the ''n''&thinsp;×&thinsp;''n'' [[identity matrix]].
 
Roots of polynomials, in this context the eigenvalues, can all be different, or some may be equal (in which case an eigenvalue has a [[Multiplicity (mathematics)#Multiplicity of a root of a polynomial|multiplicity]] greater than one, the multiplicity being the number of times the eigenvalue occurs as a root). After solving for the eigenvalues, the eigenvectors corresponding to each eigenvalue can be found from the defining equation.
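
As an illustrative example (the matrix is chosen arbitrarily), for

:<math>\mathbf{A}=\begin{pmatrix}2&1\\1&2\end{pmatrix}</math>

the characteristic polynomial is

:<math>\det(\mathbf{A}-\lambda\mathbf{I})=(2-\lambda)^2-1=(\lambda-1)(\lambda-3)\,,</math>

so the eigenvalues are ''λ'' = 1 and ''λ'' = 3, with corresponding eigenvectors proportional to (1, −1)<sup>T</sup> and (1, 1)<sup>T</sup> respectively, as can be checked by substituting them into the defining relation above.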
 
===Perturbations of eigenvalues===
 
{{main|Eigenvalue perturbation}}
 
==Matrix similarity==
 
{{main|Matrix similarity|Change of basis}}
 
Two ''n''&thinsp;×&thinsp;''n'' matrices '''A''' and '''B''' are similar if they are related by a '''similarity transformation''':
 
:<math>\mathbf{B} = \mathbf{P}\mathbf{A}\mathbf{P}^{-1}</math>
 
The matrix '''P''' is called a '''similarity matrix''', and is necessarily [[matrix inverse|invertible]].
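
One standard consequence, included here as a short illustrative calculation, is that similar matrices have the same [[characteristic polynomial]] and hence the same eigenvalues:

:<math>\det(\mathbf{B}-\lambda\mathbf{I})=\det\left(\mathbf{P}(\mathbf{A}-\lambda\mathbf{I})\mathbf{P}^{-1}\right)=\det(\mathbf{P})\det(\mathbf{A}-\lambda\mathbf{I})\det(\mathbf{P})^{-1}=\det(\mathbf{A}-\lambda\mathbf{I})\,.</math>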
 
===Unitary similarity===
 
{{main|Unitary matrix}}
 
==Canonical forms==
 
{{other uses|Canonical form}}
 
===Row echelon form===
 
{{main|Row echelon form}}
 
===Jordan normal form===
 
{{main|Jordan normal form}}
 
===Weyr canonical form===
 
{{main|Weyr canonical form}}
 
===Frobenius normal form===
 
{{main|Frobenius normal form}}
 
==Triangular factorization==
 
===LU decomposition===
 
{{main|LU decomposition}}
 
'''LU decomposition''' factors a matrix into the product of a lower [[triangular matrix]] and an upper triangular matrix.
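
As a small illustrative example (the matrix is chosen arbitrarily):

:<math>\begin{pmatrix}4&3\\6&3\end{pmatrix}=\begin{pmatrix}1&0\\ \tfrac{3}{2}&1\end{pmatrix}\begin{pmatrix}4&3\\0&-\tfrac{3}{2}\end{pmatrix}\,,</math>

where the first factor is lower triangular and the second is upper triangular.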
 
==Matrix norms==
 
{{Main|Matrix norm}}
 
Since matrices form vector spaces, one can state axioms (analogous to those of vector norms) to define a "size" of a particular matrix. The norm of a matrix is a nonnegative real number.
 
===Definition and axioms===
 
For all matrices '''A''' and '''B''' in ''M''<sub>''mn''</sub>(''F''), and all numbers ''α'' in ''F'', a matrix norm, delimited by double vertical bars || ... ||, fulfills:<ref group="note">Some authors, e.g. Horn and Johnson, use triple vertical bars instead of double: |||'''A'''|||.</ref>
 
*[[Nonnegative]]:
::<math>\| \mathbf{A} \| \ge 0</math>
:with equality only for '''A''' = '''0''', the [[zero matrix]].
*[[Scalar multiplication]]:
::<math>\|\alpha \mathbf{A}\|=|\alpha| \|\mathbf{A}\|</math>
*The [[triangle inequality]]:
::<math>\|\mathbf{A}+\mathbf{B}\| \leq \|\mathbf{A}\|+\|\mathbf{B}\|</math>
 
===Frobenius norm===
 
The '''Frobenius norm''' is defined analogously to the [[dot product]] of Euclidean vectors: multiply the matrix elements entry-wise with themselves, add up the results, then take the positive [[square root]]:
 
:<math>\|\mathbf{A}\| = \sqrt{\mathbf{A}:\mathbf{A}} = \sqrt{\sum_{i=1}^m \sum_{j=1}^n (A_{ij})^2}</math>
 
It is defined for matrices of any dimension (i.e. no restriction to square matrices).
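
For example, for an arbitrarily chosen 2&thinsp;×&thinsp;2 real matrix:

:<math>\left\|\begin{pmatrix}1&2\\3&4\end{pmatrix}\right\|=\sqrt{1^2+2^2+3^2+4^2}=\sqrt{30}\,.</math>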
 
==Positive definite and semidefinite matrices==
 
{{main|Positive definite matrix}}
 
==Functions==
 
{{main|Function (mathematics)}}
 
Matrix elements are not restricted to constant numbers; they can be [[mathematical variable]]s.
 
===Functions of matrices===
 
A function of a matrix takes in a matrix and returns something else (a number, vector, matrix, etc.).
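
For example, the [[determinant]] and the [[trace (linear algebra)|trace]] each take in a square matrix and return a single number, while the [[matrix exponential]] takes in a square matrix and returns another matrix of the same size. As an illustrative special case, the exponential of a diagonal matrix is obtained by exponentiating the diagonal entries:

:<math>\exp\begin{pmatrix}a&0\\0&b\end{pmatrix}=\begin{pmatrix}e^a&0\\0&e^b\end{pmatrix}\,.</math>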
 
===Matrix-valued functions===
 
A matrix-valued function takes in something (a number, vector, matrix, etc.) and returns a matrix.
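
A familiar example is the 2&thinsp;×&thinsp;2 [[rotation matrix]], which takes in an angle ''θ'' and returns a matrix:

:<math>\mathbf{R}(\theta)=\begin{pmatrix}\cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\end{pmatrix}\,.</math>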
 
==See also==
 
<!---rather than simply deleting, please include these in the article somewhere wherever relevant after the real content is written--->
 
===Other branches of analysis===
 
*[[Mathematical analysis]]
*[[Matrix calculus]]
*[[Numerical analysis]]
 
===Other concepts of linear algebra===
*[[Determinant]]
*[[Matrix norm]]
*[[Matrix similarity]]
*[[Tensor product]]
*[[Spectrum of an operator]]
*[[Matrix geometrical series]]

===Types of matrix===
*[[Orthogonal matrix]], [[unitary matrix]]
*[[Symmetric matrix]], [[antisymmetric matrix]]
*[[Stochastic matrix]]
 
===Matrix functions===
*[[Matrix polynomial]]
*[[Matrix exponential]]

==Footnotes==

{{Reflist|group="note"|1}}

==References==

{{reflist}}

===Further reading===
*{{cite book|title=Matrix Analysis and Applied Linear Algebra Book and Solutions Manual|author=C. Meyer|year=2000|publisher=SIAM|isbn=089-871-454-0|volume=2|url=https://books.google.com/books?id=Zg4M0iFlbGcC&q=Matrix+Analysis}}
*{{cite book|title=Applied Linear Algebra and Matrix Analysis|author=T. S. Shores|year=2007|publisher=Springer|isbn=978-038-733-195-9|series=[[Undergraduate Texts in Mathematics]]|url=https://books.google.com/books?id=8qwTb9P-iW8C&q=Matrix+Analysis}}
*{{cite book|title=Matrix Analysis|author=Rajendra Bhatia|year=1997|volume=169|series=Matrix Analysis Series|publisher=Springer|isbn=038-794-846-5|url=https://books.google.com/books?id=F4hRy1F1M6QC&q=matrix+analysis}}
*{{cite book|title=Computational Matrix Analysis|author=Alan J. Laub|year=2012|publisher=SIAM|isbn=978-161-197-221-4|url=https://books.google.com/books?id=RJBZBuHpVjEC&q=Matrix+Analysis}}

{{mathematics-stub}}
 
[[Category:Linear algebra]]
[[Category:Matrices (mathematics)]]
[[Category:Numerical analysis]]