In [[mathematics]], particularly in [[linear algebra]] and applications, '''matrix analysis''' is the study of [[matrix (mathematics)|matrices]] and their algebraic properties.<ref>{{cite book|title=Matrix Analysis|author=R. A. Horn, C. R. Johnson|year=2012|publisher=Cambridge University Press|isbn=0-521-83940-8|edition=2nd|url=http://books.google.co.uk/books?id=5I5AYeeh0JUC&printsec=frontcover&dq=matrix+analysis&hl=en&sa=X&ei=bC91Ut2rCPKO7Qbh8IBI&redir_esc=y#v=onepage&q=matrix%20analysis&f=false}}
</ref> Some particular topics out of many include: operations defined on matrices (such as [[matrix addition]], [[matrix multiplication]] and operations derived from these), functions of matrices (such as [[matrix exponentiation]] and the [[matrix logarithm]], and even the [[sine]]s and cosines of matrices), and the [[eigenvalue]]s of matrices ([[eigendecomposition of a matrix]], [[eigenvalue perturbation]] theory).
==Matrix spaces==
where ''α'' and ''β'' are numbers in ''F''.
Any matrix can be expressed as a linear combination of basis matrices, which play the role of the [[basis vector]]s for the matrix space. For example, for the set of 2×2 matrices over the field of real numbers, ''M''<sup>22</sup>(ℝ), one possible basis is:
:<math>\begin{pmatrix}1&0\\0&0\end{pmatrix}\,,\quad
\begin{pmatrix}0&1\\0&0\end{pmatrix}\,,\quad
\begin{pmatrix}0&0\\1&0\end{pmatrix}\,,\quad
\begin{pmatrix}0&0\\0&1\end{pmatrix}\,,</math>
because any 2×2 matrix can be expressed as:
:<math>\begin{pmatrix}a&b\\c&d\end{pmatrix}=a \begin{pmatrix}1&0\\0&0\end{pmatrix}
+b\begin{pmatrix}0&1\\0&0\end{pmatrix}
+c\begin{pmatrix}0&0\\1&0\end{pmatrix}
+d\begin{pmatrix}0&0\\0&1\end{pmatrix}\,,</math>
where ''a'', ''b'', ''c'', ''d'' are all real numbers. This idea applies to other fields and to matrices of higher dimensions.
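For instance, with respect to this basis the particular matrix
:<math>\begin{pmatrix}1&2\\3&4\end{pmatrix}
=1\begin{pmatrix}1&0\\0&0\end{pmatrix}
+2\begin{pmatrix}0&1\\0&0\end{pmatrix}
+3\begin{pmatrix}0&0\\1&0\end{pmatrix}
+4\begin{pmatrix}0&0\\0&1\end{pmatrix}</math>
has coordinates (1, 2, 3, 4), and since four basis matrices are needed, ''M''<sup>22</sup>(ℝ) is a four-dimensional [[vector space]] over ℝ.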
==Determinants==
==Eigenvalues and eigenvectors==
:<math>\mathbf{A}\mathbf{x} = \lambda \mathbf{x}</math>
In words, the [[matrix multiplication]] of '''A''' with an eigenvector '''x''' (here an ''n''-dimensional [[column matrix]]) gives the same result as multiplying the eigenvector by the eigenvalue ''λ''. For an ''n''×''n'' matrix there are ''n'' eigenvalues, counted with multiplicity. The eigenvalues are the roots of the [[characteristic polynomial]]:
:<math>p_\mathbf{A}(\lambda) = \det(\mathbf{A} - \lambda \mathbf{I}) = 0</math>
where '''I''' is the ''n''×''n'' [[identity matrix]].
[[Properties of polynomial roots|Roots of the characteristic polynomial]], that is, the eigenvalues, need not all be distinct, and may be [[complex number|complex]] even when the entries of '''A''' are real.
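As a simple worked example, the characteristic polynomial of the symmetric matrix
:<math>\mathbf{A} = \begin{pmatrix}2&1\\1&2\end{pmatrix}</math>
is
:<math>p_\mathbf{A}(\lambda) = \det\begin{pmatrix}2-\lambda&1\\1&2-\lambda\end{pmatrix} = (2-\lambda)^2 - 1 = (\lambda-1)(\lambda-3)\,,</math>
so the eigenvalues are ''λ'' = 1 and ''λ'' = 3, with corresponding eigenvectors proportional to (1, −1)<sup>T</sup> and (1, 1)<sup>T</sup>.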
===Perturbations of eigenvalues===
==Matrix similarity==
{{main|Matrix similarity|Change of basis}}
Two ''n''×''n'' matrices '''A''' and '''B''' are similar if they are related by a '''similarity transformation''':
:<math>\mathbf{B} = \mathbf{P}\mathbf{A}\mathbf{P}^{-1}</math>
The matrix '''P''', which is necessarily [[matrix inverse|invertible]], is called a '''similarity matrix'''.
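For instance, a small worked example is
:<math>\mathbf{A} = \begin{pmatrix}1&0\\0&2\end{pmatrix}\,,\quad
\mathbf{P} = \begin{pmatrix}1&1\\0&1\end{pmatrix}\,,\quad
\mathbf{P}^{-1} = \begin{pmatrix}1&-1\\0&1\end{pmatrix}\,,\quad
\mathbf{B} = \mathbf{P}\mathbf{A}\mathbf{P}^{-1} = \begin{pmatrix}1&1\\0&2\end{pmatrix}\,,</math>
so '''A''' and '''B''' are similar, and both have eigenvalues 1 and 2: similar matrices describe the same [[linear map]] in different bases, and therefore share the same eigenvalues, [[determinant]] and [[trace (linear algebra)|trace]].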
===Unitary similarity===
==See also==
<!---rather than simply deleting, please include these in the article somewhere wherever relevant after the real content is written--->
===Other branches of analysis===
*[[Mathematical analysis]]
*[[Matrix calculus]]
*[[Numerical analysis]]
===Other concepts of linear algebra===
*[[Tensor product]]
*[[Spectrum of an operator]]
===Types of matrix===
*[[Orthogonal matrix]], [[unitary matrix]]
*[[Symmetric matrix]], [[antisymmetric matrix]]
*[[Stochastic matrix]]
===Further reading===
*{{cite book|title=Matrix Analysis and Applied Linear Algebra Book and Solutions Manual|author=C. Meyer|year=2000|publisher=SIAM|isbn=0-89871-454-0|volume=2|series=Matrix Analysis and Applied Linear Algebra|url=http://books.google.co.uk/books?id=Zg4M0iFlbGcC&printsec=frontcover&dq=Matrix+Analysis&hl=en&sa=X&ei=SCd1UryWD_LG7Aag_4HwBg&ved=0CGoQ6AEwCQ#v=onepage&q=Matrix%20Analysis&f=false}}