{{Short description|Study of matrices and their algebraic properties}}
In [[mathematics]], particularly in [[linear algebra]] and its applications, '''matrix analysis''' is the study of [[matrix (mathematics)|matrices]] and their algebraic properties. Some particular topics out of many include: operations defined on matrices (such as [[matrix addition]], [[matrix multiplication]] and operations derived from these), functions of matrices (such as [[matrix exponentiation]], the [[matrix logarithm]], and even [[sines and cosines]] of matrices), and the [[eigenvalue]]s of matrices ([[eigendecomposition of a matrix]], [[eigenvalue perturbation]] theory).<ref>{{cite book|title=Functions of Matrices: Theory and Computation|author=N. J. Higham|year=2000 |publisher=SIAM|isbn=0-89871-777-9|url=https://books.google.com/books?id=S6gpNn1JmbgC&q=matrix+functions}}
</ref>
==Matrix spaces==
The set of all ''m'' × ''n'' matrices over a [[field (mathematics)|field]] ''F'', denoted ''M''<sub>''mn''</sub>(''F''), forms a [[vector space]]. The space is closed under matrix addition:
:<math>\mathbf{A},\mathbf{B} \in M_{mn}(F)\,,\quad \mathbf{A} + \mathbf{B} \in M_{mn}(F) </math>
and, more generally, under linear combinations:
:<math>\alpha \mathbf{A} + \beta\mathbf{B} \in M_{mn}(F) </math>
where ''α'' and ''β'' are numbers in ''F''.
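For instance, the sum of two 2 × 2 real matrices is again a 2 × 2 real matrix:
:<math>\begin{pmatrix}1&2\\3&4\end{pmatrix} + \begin{pmatrix}5&6\\7&8\end{pmatrix} = \begin{pmatrix}6&8\\10&12\end{pmatrix} \in M_{22}(\mathbb{R})\,.</math>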
Any matrix can be expressed as a linear combination of basis matrices, which play the role of the [[basis vector]]s for the matrix space. For example, for the set of 2 × 2 matrices over ''F'', a basis is given by the four matrices
:<math>\begin{pmatrix}1&0\\0&0\end{pmatrix}\,,\quad
Line 26 ⟶ 27:
\begin{pmatrix}0&0\\0&1\end{pmatrix}\,,</math>
because any 2 × 2 matrix can be expressed as
:<math>\begin{pmatrix}a&b\\c&d\end{pmatrix}=a \begin{pmatrix}1&0\\0&0\end{pmatrix}
Line 39 ⟶ 40:
{{main|Determinant}}
The '''determinant''' of a [[square matrix]] is an important property. The determinant indicates whether a matrix is [[invertible matrix|invertible]]: the [[inverse matrix|inverse]] of a matrix exists if and only if its determinant is nonzero. Determinants are used for finding the eigenvalues of matrices (see below), and for solving a [[system of linear equations]] (see [[Cramer's rule]]).
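For example, in the 2 × 2 case the determinant is given by the explicit formula
:<math>\det\begin{pmatrix}a&b\\c&d\end{pmatrix} = ad - bc\,,</math>
so such a matrix has an inverse exactly when ''ad'' − ''bc'' ≠ 0.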
==Eigenvalues and eigenvectors of matrices==
===Definitions===
An ''n'' × ''n'' matrix '''A''' has '''eigenvectors''' '''x''' and '''eigenvalues''' ''λ'' defined by the relation:
:<math>\mathbf{A}\mathbf{x} = \lambda \mathbf{x}</math>
In words, multiplying an eigenvector '''x''' (here an ''n''-dimensional [[column matrix]]) by the matrix '''A''' yields the same result as multiplying the eigenvector by the eigenvalue ''λ''. For an ''n'' × ''n'' matrix, there are ''n'' eigenvalues (counted with multiplicity). The eigenvalues are the [[root of a polynomial|roots]] of the [[characteristic polynomial]]:
:<math>p_\mathbf{A}(\lambda) = \det(\mathbf{A} - \lambda \mathbf{I}) = 0</math>
where '''I''' is the ''n'' × ''n'' [[identity matrix]].
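For example, the matrix
:<math>\mathbf{A} = \begin{pmatrix}2&1\\1&2\end{pmatrix}</math>
has characteristic polynomial
:<math>p_\mathbf{A}(\lambda) = (2-\lambda)^2 - 1 = (\lambda - 1)(\lambda - 3)\,,</math>
so its eigenvalues are ''λ'' = 1 and ''λ'' = 3, with corresponding eigenvectors (1, −1)<sup>T</sup> and (1, 1)<sup>T</sup>.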
===Perturbations of eigenvalues===
==Matrix similarity==
{{main|Matrix similarity|Change of basis}}
Two ''n'' × ''n'' matrices '''A''' and '''B''' are similar if they are related by a '''similarity transformation''':
:<math>\mathbf{B} = \mathbf{P}\mathbf{A}\mathbf{P}^{-1}</math>
==Matrix norms==
For all matrices '''A''' and '''B''' in ''M''<sub>''mn''</sub>(''F''), and all numbers ''α'' in ''F'', a matrix norm, delimited by double vertical bars || ... ||, fulfills:<ref group="note">Some authors, e.g. Horn and Johnson, use triple vertical bars instead of double: |||'''A'''|||.</ref>
*Non-negativity:
::<math>\| \mathbf{A} \| \ge 0</math>
:with equality only for '''A''' = '''0''', the [[zero matrix]].
*[[Scalar multiplication]]:
::<math>\|\alpha \mathbf{A}\|=|\alpha| \|\mathbf{A}\|</math>
*The [[triangle inequality]]:
::<math>\|\mathbf{A}+\mathbf{B}\| \leq \|\mathbf{A}\|+\|\mathbf{B}\|</math>
===Frobenius norm===
The '''Frobenius norm''' is defined using a matrix analogue of the [[dot product]] of Euclidean vectors:
:<math>\|\mathbf{A}\| = \sqrt{\mathbf{A}:\mathbf{A}} = \sqrt{\sum_{i=1}^m \sum_{j=1}^n (A_{ij})^2}</math>
It is defined for matrices of any dimension (i.e. no restriction to square matrices).
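For example,
:<math>\left\|\begin{pmatrix}1&2\\3&4\end{pmatrix}\right\| = \sqrt{1^2 + 2^2 + 3^2 + 4^2} = \sqrt{30}\,.</math>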
==Positive definite and semidefinite matrices==
{{main|Positive definite matrix}}
==Functions==
{{main|Function (mathematics)}}
Matrix elements are not restricted to constant numbers; they can also be [[mathematical variable]]s.
===Functions of matrices===
A function of a matrix takes a matrix as input and returns something else (a number, vector, matrix, etc.).
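For example, the [[trace (linear algebra)|trace]] maps a square matrix to a single number,
:<math>\operatorname{tr}(\mathbf{A}) = \sum_{i=1}^n A_{ii}\,,</math>
while the [[determinant]] and the [[rank (linear algebra)|rank]] are other scalar-valued functions of a matrix.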
===Matrix-valued functions===
A matrix-valued function takes something as input (a number, vector, matrix, etc.) and returns a matrix.
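For example, the 2 × 2 [[rotation matrix]] is a matrix-valued function of the rotation angle ''θ'':
:<math>\mathbf{R}(\theta) = \begin{pmatrix}\cos\theta & -\sin\theta \\ \sin\theta & \cos\theta\end{pmatrix}\,.</math>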
==See also==
*[[Tensor product]]
*[[Spectrum of an operator]]
*[[Matrix geometrical series]]
===Types of matrix===
*[[Symmetric matrix]], [[antisymmetric matrix]]
*[[Stochastic matrix]]
===Matrix functions===
*[[Matrix polynomial]]
*[[Matrix exponential]]
==Footnotes==
{{Reflist|group=note}}

==References==
{{Reflist}}
===Further reading===
*{{cite book|title=Matrix Analysis and Applied Linear Algebra Book and Solutions Manual|author=C. Meyer|year=2000 |publisher=SIAM|isbn=0-89871-454-0|volume=2}}
*{{cite book|title=Applied Linear Algebra and Matrix Analysis|author=T. S. Shores|year=2007|publisher=Springer|isbn=978-0-387-33195-9|series=[[Undergraduate Texts in Mathematics]]|url=https://books.google.com/books?id=8qwTb9P-iW8C&q=Matrix+Analysis}}
*{{cite book|title=Computational Matrix Analysis|author=Alan J. Laub|year=2012|publisher=SIAM|isbn=978-1-61197-221-4|url=https://books.google.com/books?id=RJBZBuHpVjEC&q=Matrix+Analysis}}
[[Category:Linear algebra]]
[[Category:Matrices (mathematics)]]
[[Category:Numerical analysis]]