{{Short description|Study of matrices and their algebraic properties}}
In [[mathematics]], particularly in [[linear algebra]] and applications, '''matrix analysis''' is the study of [[matrix (mathematics)|matrices]] and their algebraic properties.<ref>{{cite book|title=Matrix Analysis|author=R. A. Horn, C. R. Johnson|year=2012|publisher=Cambridge University Press|isbn=978-052-183-940-2|edition=2nd|url=https://books.google.com/books?id=5I5AYeeh0JUC&q=matrix+analysis}}</ref> Particular topics include: operations defined on matrices (such as [[matrix addition]], [[matrix multiplication]] and operations derived from these), functions of matrices (such as [[matrix exponentiation]] and [[matrix logarithm]], and even [[sines and cosines]] etc. of matrices),<ref>{{cite book|title=Functions of Matrices: Theory and Computation|author=N. J. Higham|year=2000|publisher=SIAM|isbn=089-871-777-9|url=https://books.google.com/books?id=S6gpNn1JmbgC&q=matrix+functions}}</ref> and the [[eigenvalue]]s of matrices ([[eigendecomposition of a matrix]], [[eigenvalue perturbation]] theory).
 
==Matrix spaces==
 
The set of all ''m''&thinsp;×&thinsp;''n'' matrices over a [[field (mathematics)|field]] ''F'', denoted in this article ''M''<sub>''mn''</sub>(''F''), forms a [[vector space]]. Examples of ''F'' include the set of [[rational number]]s <math>\mathbb{Q}</math>, the [[real number]]s <math>\mathbb{R}</math>, and the set of [[complex number]]s <math>\mathbb{C}</math>. The spaces ''M''<sub>''mn''</sub>(''F'') and ''M''<sub>''pq''</sub>(''F'') are different spaces if ''m'' and ''p'' are unequal, or if ''n'' and ''q'' are unequal; for instance ''M''<sub>32</sub>(''F'') ≠ ''M''<sub>23</sub>(''F''). Two ''m''&thinsp;×&thinsp;''n'' matrices '''A''' and '''B''' in ''M''<sub>''mn''</sub>(''F'') can be added together to form another matrix in the space ''M''<sub>''mn''</sub>(''F''):
 
:<math>\mathbf{A},\mathbf{B} \in M_{mn}(F)\,,\quad \mathbf{A} + \mathbf{B} \in M_{mn}(F) </math>
:<math>\alpha \mathbf{A} + \beta\mathbf{B} \in M_{mn}(F) </math>
 
where ''α'' and ''β'' are numbers in ''F''.
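The closure of the matrix space under linear combinations can be sketched in Python (an illustrative example only, using plain nested lists rather than any particular matrix library):

```python
# Sketch: a linear combination alpha*A + beta*B of two m x n matrices over
# the reals is again an m x n real matrix, so M_mn(R) is closed under
# addition and scalar multiplication.

def lin_comb(alpha, A, beta, B):
    """Return alpha*A + beta*B, computed entry by entry."""
    return [[alpha * A[i][j] + beta * B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.0, 1.0], [1.0, 0.0]]
C = lin_comb(2.0, A, -1.0, B)   # 2A - B, still a 2 x 2 real matrix
```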
 
Any matrix can be expressed as a linear combination of basis matrices, which play the role of the [[basis vector]]s for the matrix space. For example, for the set of 2&thinsp;×&thinsp;2 matrices over the field of real numbers, <math>M_{22}(\mathbb{R})</math>, one legitimate basis set of matrices is:
 
:<math>\begin{pmatrix}1&0\\0&0\end{pmatrix}\,,\quad
\begin{pmatrix}0&1\\0&0\end{pmatrix}\,,\quad
\begin{pmatrix}0&0\\1&0\end{pmatrix}\,,\quad
\begin{pmatrix}0&0\\0&1\end{pmatrix}\,,</math>
 
because any 2&thinsp;×&thinsp;2 matrix can be expressed as:
 
:<math>\begin{pmatrix}a&b\\c&d\end{pmatrix}=a \begin{pmatrix}1&0\\0&0\end{pmatrix}
+ b \begin{pmatrix}0&1\\0&0\end{pmatrix}
+ c \begin{pmatrix}0&0\\1&0\end{pmatrix}
+ d \begin{pmatrix}0&0\\0&1\end{pmatrix}</math>
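This decomposition into basis matrices can be sketched in Python (an illustrative example; the entries ''a'', ''b'', ''c'', ''d'' are simply the coordinates of the matrix in the standard basis):

```python
# Sketch: build a 2 x 2 matrix as a*E11 + b*E12 + c*E21 + d*E22, where
# E_ij is the basis matrix with a 1 in position (i, j) and 0 elsewhere.

def basis_matrix(i, j):
    E = [[0, 0], [0, 0]]
    E[i][j] = 1
    return E

def from_coeffs(a, b, c, d):
    coeffs = {(0, 0): a, (0, 1): b, (1, 0): c, (1, 1): d}
    M = [[0, 0], [0, 0]]
    for (i, j), coeff in coeffs.items():
        E = basis_matrix(i, j)
        for r in range(2):
            for s in range(2):
                M[r][s] += coeff * E[r][s]
    return M
```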

==Determinants==
{{main|Determinant}}
 
The '''determinant''' of a [[square matrix]] is an important property. The determinant indicates whether a matrix is [[invertible]]: the [[inverse matrix|inverse of a matrix]] exists if and only if the determinant is nonzero. Determinants are used for finding the eigenvalues of matrices (see below) and for solving a [[system of linear equations]] (see [[Cramer's rule]]).
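Both uses can be sketched for the 2&thinsp;×&thinsp;2 case in Python (an illustrative example only; the formulas are the standard closed forms for 2&thinsp;×&thinsp;2 matrices):

```python
# Sketch: the determinant decides invertibility, and Cramer's rule solves
# A x = b whenever det(A) != 0.

def det2(A):
    """Determinant of a 2 x 2 matrix."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def cramer2(A, b):
    """Solve the 2 x 2 system A x = b by Cramer's rule; None if singular."""
    d = det2(A)
    if d == 0:
        return None          # zero determinant: A is not invertible
    x = det2([[b[0], A[0][1]], [b[1], A[1][1]]]) / d
    y = det2([[A[0][0], b[0]], [A[1][0], b[1]]]) / d
    return (x, y)

A = [[2, 1], [1, 3]]
sol = cramer2(A, [5, 10])    # solves 2x + y = 5, x + 3y = 10
```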
 
==Eigenvalues and eigenvectors of matrices==
===Definitions===
 
An ''n''&thinsp;×&thinsp;''n'' matrix '''A''' has '''eigenvectors''' '''x''' and '''eigenvalues''' ''λ'' defined by the relation:
 
:<math>\mathbf{A}\mathbf{x} = \lambda \mathbf{x}</math>
 
In words, multiplying the eigenvector '''x''' (here an ''n''-dimensional [[column matrix]]) by the matrix '''A''' gives the same result as multiplying it by the eigenvalue ''λ''. For an ''n''&thinsp;×&thinsp;''n'' matrix, there are ''n'' eigenvalues, counted with multiplicity. The eigenvalues are the [[root of a polynomial|roots]] of the [[characteristic polynomial]]:
 
:<math>p_\mathbf{A}(\lambda) = \det(\mathbf{A} - \lambda \mathbf{I}) = 0</math>
 
where '''I''' is the ''n''&thinsp;×&thinsp;''n'' [[identity matrix]].
 
[[Properties of polynomial roots|Roots of polynomials]], in this context the eigenvalues, can all be different, or some may be equal (in which case an eigenvalue has a [[Multiplicity (mathematics)#Multiplicity of a root of a polynomial|multiplicity]], the number of times it occurs). After solving for the eigenvalues, the eigenvectors corresponding to the eigenvalues can be found from the defining equation.
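For a 2&thinsp;×&thinsp;2 matrix the characteristic polynomial is ''λ''² − tr('''A''')&thinsp;''λ'' + det('''A'''), so its roots can be written down directly. A hedged Python sketch (illustrative only; it handles just the 2&thinsp;×&thinsp;2 case, and complex roots are allowed via `cmath`):

```python
import cmath

# Sketch: eigenvalues of a 2 x 2 matrix as roots of the characteristic
# polynomial lambda^2 - tr(A)*lambda + det(A), via the quadratic formula.

def eig2(A):
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = cmath.sqrt(tr * tr - 4 * det)
    return ((tr + disc) / 2, (tr - disc) / 2)

lams = eig2([[2, 1], [1, 2]])   # tr = 4, det = 3, so the roots are 3 and 1
```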
 
===Perturbations of eigenvalues===

==Matrix similarity==
{{main|Matrix similarity|Change of basis}}
 
Two ''n''&thinsp;×&thinsp;''n'' matrices '''A''' and '''B''' are similar if they are related by a '''similarity transformation''':
 
:<math>\mathbf{B} = \mathbf{P}\mathbf{A}\mathbf{P}^{-1}</math>
 
The matrix '''P''', which is necessarily [[matrix inverse|invertible]], is called a '''similarity matrix'''.
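A key consequence is that similar matrices share invariants such as the trace and determinant. This can be checked numerically with a Python sketch (illustrative only, restricted to 2&thinsp;×&thinsp;2 matrices):

```python
# Sketch: B = P A P^(-1) has the same trace and determinant as A,
# illustrating that both are similarity invariants.

def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(P):
    d = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    return [[P[1][1] / d, -P[0][1] / d], [-P[1][0] / d, P[0][0] / d]]

def trace2(M):
    return M[0][0] + M[1][1]

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[2.0, 1.0], [0.0, 3.0]]
P = [[1.0, 1.0], [0.0, 1.0]]
B = matmul2(matmul2(P, A), inv2(P))   # similar to A
```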
 
===Unitary similarity===

==Matrix norms==
For all matrices '''A''' and '''B''' in ''M''<sub>''mn''</sub>(''F''), and all numbers ''α'' in ''F'', a matrix norm, delimited by double vertical bars || ... ||, fulfills:<ref group="note">Some authors, e.g. Horn and Johnson, use triple vertical bars instead of double: |||'''A'''|||.</ref>
 
*Non-negative:
::<math>\| \mathbf{A} \| \ge 0</math>
:with equality only for '''A''' = '''0''', the [[zero matrix]].
*[[Scalar multiplication]]:
::<math>\|\alpha \mathbf{A}\|=|\alpha| \|\mathbf{A}\|</math>
*The [[triangle inequality]]:
::<math>\|\mathbf{A}+\mathbf{B}\| \leq \|\mathbf{A}\|+\|\mathbf{B}\|</math>
 
===Frobenius norm===
 
The '''Frobenius norm''' is analogous to the [[dot product]] of Euclidean vectors; multiply matrix elements entry-wise, add up the results, then take the positive [[square root]]:
 
:<math>\|\mathbf{A}\| = \sqrt{\mathbf{A}:\mathbf{A}} = \sqrt{\sum_{i=1}^m \sum_{j=1}^n (A_{ij})^2}</math>
 
It is defined for matrices of any dimension (i.e. no restriction to square matrices).
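The definition above translates directly into a short Python sketch (illustrative only; it works for a matrix of any shape, as noted):

```python
import math

# Sketch of the Frobenius norm: square every entry, sum the squares, and
# take the positive square root. No restriction to square matrices.

def frobenius(A):
    return math.sqrt(sum(a * a for row in A for a in row))
```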
 
==Positive definite and semidefinite matrices==
 
{{main|Positive definite matrix}}

==Functions==
{{main|Function (mathematics)}}
 
Matrix elements are not restricted to constant numbers; they can be [[mathematical variable]]s.
 
===Functions of matrices===

A function of a matrix takes in a matrix and returns something else (a number, vector, matrix, etc.).
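For instance, the [[matrix exponential]] takes a matrix and returns a matrix. A hedged Python sketch via the truncated power series exp('''A''') = Σ '''A'''<sup>''k''</sup>/''k''! (illustrative only, for 2&thinsp;×&thinsp;2 matrices; practical implementations use more robust algorithms):

```python
import math

# Sketch: matrix exponential of a 2 x 2 matrix from its power series,
# truncated after a fixed number of terms.

def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm2(A, terms=20):
    result = [[1.0, 0.0], [0.0, 1.0]]       # running sum, starts at I
    term = [[1.0, 0.0], [0.0, 1.0]]         # current term A^k / k!
    for k in range(1, terms):
        term = matmul2(term, A)
        term = [[term[i][j] / k for j in range(2)] for i in range(2)]
        result = [[result[i][j] + term[i][j] for j in range(2)]
                  for i in range(2)]
    return result

# For the nilpotent matrix N = [[0,1],[0,0]], exp(N) = I + N exactly,
# since N^2 = 0 kills every later term of the series.
E = expm2([[0.0, 1.0], [0.0, 0.0]])
```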
 
===Matrix-valued functions===

A matrix-valued function takes in something (a number, vector, matrix, etc.) and returns a matrix.
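The standard example is the 2&thinsp;×&thinsp;2 [[rotation matrix]], a matrix-valued function of the angle ''θ''. A brief Python sketch (illustrative only):

```python
import math

# Sketch: R(theta) maps a number (an angle) to a 2 x 2 matrix that
# rotates the plane counterclockwise by theta.

def rotation(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

R = rotation(math.pi / 2)   # quarter-turn rotation
```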

==See also==
*[[Tensor product]]
*[[Spectrum of an operator]]
*[[Matrix geometrical series]]
 
===Types of matrix===
*[[Symmetric matrix]], [[antisymmetric matrix]]
*[[Stochastic matrix]]
 
===Matrix functions===
 
*[[Matrix polynomial]]
*[[Matrix exponential]]
 
==Footnotes==

==References==
===Further reading===
 
*{{cite book|title=Matrix Analysis and Applied Linear Algebra Book and Solutions Manual|author=C. Meyer|year=2000|publisher=SIAM|isbn=089-871-454-0|volume=2|series=Matrix Analysis and Applied Linear Algebra|url=https://books.google.com/books?id=Zg4M0iFlbGcC&q=Matrix+Analysis}}
*{{cite book|title=Applied Linear Algebra and Matrix Analysis|author=T. S. Shores|year=2007|publisher=Springer|isbn=978-038-733-195-9|series=[[Undergraduate Texts in Mathematics]]|url=https://books.google.com/books?id=8qwTb9P-iW8C&q=Matrix+Analysis}}
*{{cite book|title=Matrix Analysis|author=Rajendra Bhatia|year=1997|volume=169|series=[[Graduate Texts in Mathematics]]|publisher=Springer|isbn=038-794-846-5|url=https://books.google.com/books?id=F4hRy1F1M6QC&q=matrix+analysis}}
*{{cite book|title=Computational Matrix Analysis|author=Alan J. Laub|year=2012|publisher=SIAM|isbn=978-161-197-221-4|url=https://books.google.com/books?id=RJBZBuHpVjEC&q=Matrix+Analysis}}
 
{{mathematics-stub}}
 
[[Category:Linear algebra]]
[[Category:Matrices (mathematics)]]
[[Category:Numerical analysis]]