Inverse eigenvalues theorem

{{Unreferenced|date=December 2009}}
{{Orphan|date=December 2009}}
 
In [[numerical analysis]] and [[linear algebra]], the '''Inverse eigenvalues theorem''' states that if a matrix <math>A</math> is [[nonsingular]], then a scalar <math>\lambda</math> (necessarily nonzero) is an [[eigenvalue]] of <math>A</math> if and only if <math>\lambda^{-1}</math> is an eigenvalue of <math>A^{-1}</math>.
 
==Proof of the Inverse Eigenvalues Theorem==
Suppose that <math>\lambda</math> is an [[eigenvalue]] of <math>A</math>. Then there exists a non-zero vector <math>x \in \mathbb{R}^n</math> such that <math>Ax = \lambda x</math>. Therefore:
 
<math>x =I_{n}x = A^{-1}Ax = A^{-1} \lambda x = \lambda A^{-1} x</math>
 
Since <math>A</math> is [[nonsingular]], <math>\operatorname{null}(A) = \{0\}</math>; if <math>\lambda = 0</math>, then <math>Ax = 0</math> would force <math>x = 0</math>, contradicting the choice of <math>x</math>, so <math>\lambda \neq 0</math>. We may therefore multiply both sides of the above equation by <math>\lambda^{-1}</math> to obtain <math>A^{-1}x = \lambda^{-1} x</math>; that is, <math>\lambda^{-1}</math> is an eigenvalue of <math>A^{-1}</math>. Repeating the argument with <math>A</math> replaced by <math>A^{-1}</math> (and <math>\lambda</math> by <math>\lambda^{-1}</math>) shows the converse: if <math>\lambda^{-1}</math> is an eigenvalue of <math>A^{-1}</math>, then <math>\lambda</math> is an eigenvalue of <math>A</math>.
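
The theorem can be illustrated numerically; the following sketch (not part of the original article, and assuming NumPy is available) compares the eigenvalues of the inverse of a nonsingular matrix with the reciprocals of the eigenvalues of the matrix itself:

```python
import numpy as np

# Build a deterministic, well-conditioned (hence nonsingular) test matrix:
# a random 4x4 matrix shifted by 4*I to keep it away from singularity.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)

# Eigenvalues of A and of A^{-1}.
eig_A = np.linalg.eigvals(A)
eig_A_inv = np.linalg.eigvals(np.linalg.inv(A))

# As multisets, eig(A^{-1}) should equal {1/lambda : lambda in eig(A)},
# so sorting both the same way lines the two lists up for comparison.
reciprocals = np.sort_complex(1.0 / eig_A)
inverse_eigs = np.sort_complex(eig_A_inv)

print(np.allclose(reciprocals, inverse_eigs))
```

Up to floating-point rounding, the two sorted lists agree, as the theorem predicts.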
 
{{DEFAULTSORT:Inverse Eigenvalues Theorem}}
[[Category:Linear algebra]]
[[Category:Mathematical theorems]]
[[Category:Articles containing proofs]]
{{algebra-stub}}