Some authors allow <math>v</math> to be the [[zero vector]] in the definition of eigenvector.<ref>{{Citation|last=Axler|first= Sheldon |title=Linear Algebra Done Right|edition=2nd |chapter=Ch. 5|page= 77}}</ref> This is reasonable as long as we define eigenvalues and eigenvectors carefully: If we would like the zero vector to be an eigenvector, then we must first define an eigenvalue of <math> T </math> as a scalar <math> \lambda </math> in <math>K</math> such that there is a ''nonzero'' vector <math> v </math> in <math>V</math> with <math> T(v) = \lambda v </math>. We then define an eigenvector to be a vector <math> v </math> in <math>V</math> such that there is an eigenvalue <math> \lambda </math> in <math>K</math> with <math> T(v) = \lambda v </math>. This way, we ensure that it is not the case that every scalar is an eigenvalue corresponding to the zero vector.
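The need for the ''nonzero'' qualifier can be checked numerically. A minimal sketch (the matrix below is an arbitrary example, not taken from the text): the zero vector satisfies <math>T(v) = \lambda v</math> for every scalar <math>\lambda</math>, so without the restriction no particular eigenvalue could be attached to it.

```python
import numpy as np

# An arbitrary diagonal matrix, used purely for illustration.
T = np.array([[2.0, 0.0],
              [0.0, 3.0]])
z = np.zeros(2)

# The zero vector satisfies T z = lam * z for *every* scalar lam,
# which is why eigenvalues are defined via nonzero vectors.
for lam in (-1.0, 0.0, 4.5):
    assert np.allclose(T @ z, lam * z)
```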
===Eigenspace and spectrum=== <!-- Geometric multiplicity links here -->
If <math>v</math> is an eigenvector of <math>T</math>, with eigenvalue <math>\lambda</math>, then any [[scalar multiplication|scalar multiple]] <math>\alpha v </math> of <math>v</math> with nonzero <math>\alpha</math> is also an eigenvector with eigenvalue <math>\lambda</math>, since <math>T(\alpha v) = \alpha T(v) = \alpha(\lambda v) = \lambda(\alpha v)</math>. Moreover, if <math>u</math> and <math>v</math> are eigenvectors with the same eigenvalue <math>\lambda</math>, then <math>u+v</math> is also an eigenvector with the same eigenvalue <math>\lambda</math>. Therefore, the set of all eigenvectors with the same eigenvalue <math>\lambda</math>, together with the zero vector, is a [[linear subspace]] of <math>V</math>, called the '''eigenspace''' of <math>T</math> associated to <math>\lambda</math>.<ref>{{Harvnb|Shilov|1977|loc=p. 109}}</ref><ref>[[b:The Book of Mathematical Proofs/Algebra/Linear Transformations#Lemma for the eigenspace|Lemma for the eigenspace]]</ref> If that subspace has dimension 1, it is sometimes called an '''eigenline'''.<ref>''[http://books.google.com/books?id=pkESXAcIiCQC&pg=PA111 Schaum's Easy Outline of Linear Algebra]'', p. 111</ref>
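The closure of an eigenspace under scaling and addition can be verified on a concrete matrix; the following sketch uses an illustrative matrix whose eigenspace for eigenvalue 2 is two-dimensional (the matrix and vectors are assumptions for the example, not from the text):

```python
import numpy as np

# Illustrative matrix with a two-dimensional eigenspace for eigenvalue 2.
T = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

u = np.array([1.0, 0.0, 0.0])  # eigenvector for eigenvalue 2
v = np.array([0.0, 1.0, 0.0])  # another eigenvector for eigenvalue 2

# Any linear combination of eigenvectors sharing an eigenvalue is
# again an eigenvector for that eigenvalue (or the zero vector).
w = 5.0 * u + v
assert np.allclose(T @ w, 2.0 * w)
```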
The '''geometric multiplicity''' <math>\gamma_T(\lambda)</math> of an eigenvalue <math>\lambda</math> is the dimension of the eigenspace associated to <math>\lambda</math>, i.e., the number of [[linear independence|linearly independent]] eigenvectors with that eigenvalue.
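Since the eigenspace of <math>\lambda</math> is the kernel of <math>T - \lambda I</math>, the geometric multiplicity can be computed by the rank–nullity theorem as <math>n - \operatorname{rank}(T - \lambda I)</math>. A minimal sketch, with an illustrative matrix chosen so that the eigenvalue 2 has algebraic multiplicity 2 but geometric multiplicity 1:

```python
import numpy as np

def geometric_multiplicity(A, lam):
    # dim ker(A - lam*I) = n - rank(A - lam*I), by rank-nullity.
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

# Upper-triangular example: the characteristic polynomial is (x-2)^2,
# but the eigenspace of 2 is only one-dimensional.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
print(geometric_multiplicity(A, 2.0))  # 1
```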
Conversely, any eigenvector with eigenvalue <math>\lambda</math> must be linearly independent from all eigenvectors that are associated to a different eigenvalue <math>\lambda'</math>. Therefore, a linear transformation <math>T</math> that operates on an <math>n</math>-[[dimension (mathematics)|dimensional space]] cannot have more than <math>n</math> distinct eigenvalues (or eigenspaces).<ref name="Shilov_lemma">For a proof of this lemma, see {{Harvnb|Roman|2008|loc=Theorem 8.2 on p. 186}}; {{Harvnb|Shilov|1977|loc=p. 109}}; {{Harvnb|Hefferon|2001|loc=p. 364}}; {{Harvnb|Beezer|2006|loc=Theorem EDELI on p. 469}}; and [[b:Famous Theorems of Mathematics/Algebra/Linear Transformations#Lemma for linear independence of eigenvectors|Lemma for linear independence of eigenvectors]]</ref>
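The independence of eigenvectors belonging to distinct eigenvalues can be observed numerically; in the sketch below (the symmetric matrix is an assumption for the example), the matrix whose columns are the computed eigenvectors has full rank:

```python
import numpy as np

# A symmetric 2x2 matrix whose eigenvalues are 1 and 3 (both distinct).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)

# There are at most n = 2 distinct eigenvalues, and eigenvectors for
# distinct eigenvalues are linearly independent: the matrix of
# eigenvector columns has full rank.
assert len(set(np.round(eigvals, 8))) == 2
assert np.linalg.matrix_rank(eigvecs) == 2
```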
The set of eigenvalues of <math>T</math> is sometimes called the [[Spectrum of a matrix|spectrum]] of <math>T</math>.
===Eigenbasis===