The Lanczos algorithm is most often brought up in the context of finding the [[eigenvalue]]s and [[eigenvector]]s of a matrix, but whereas an ordinary [[Matrix diagonalization|diagonalization of a matrix]] would make eigenvectors and eigenvalues apparent from inspection, the same is not true for the tridiagonalization performed by the Lanczos algorithm; nontrivial additional steps are needed to compute even a single eigenvalue or eigenvector. Nonetheless, applying the Lanczos algorithm is often a significant step forward in computing the eigendecomposition.
First observe that when <math>T</math> is <math>n \times n</math>, it is [[Matrix similarity|similar]] to <math>A</math>: if <math>\lambda</math> is an eigenvalue of <math>T</math> then it is also an eigenvalue of <math>A</math>, and if <math> T x = \lambda x </math> (<math>x</math> is an eigenvector of <math>T</math>) then <math> y = V x </math> is the corresponding eigenvector of <math>A</math> (since <math> A y = A V x = V T V^* V x = V T I x = V T x = V (\lambda x) = \lambda V x = \lambda y</math>). Thus the Lanczos algorithm transforms the eigendecomposition problem for <math>A</math> into the eigendecomposition problem for <math>T</math>.
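The similarity argument above can be checked numerically. The following is an illustrative sketch (not from the article) of a full <math>m = n</math> Lanczos run with explicit re-orthogonalization; the function name <code>lanczos</code> and the test matrix are chosen for the example only. It verifies that an eigenpair <math>(\lambda, x)</math> of <math>T</math> maps to the eigenpair <math>(\lambda, Vx)</math> of <math>A</math>:

```python
import numpy as np

def lanczos(A, v0, m):
    """Minimal Lanczos sketch with full re-orthogonalization.
    Returns V (n x m, orthonormal columns) and symmetric tridiagonal
    T (m x m) such that V^T A V = T; for m = n, T is similar to A."""
    n = A.shape[0]
    V = np.zeros((n, m))
    alpha = np.zeros(m)       # diagonal of T
    beta = np.zeros(m - 1)    # off-diagonal of T
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        # Re-orthogonalize against all previous vectors (for stability).
        w -= V[:, : j + 1] @ (V[:, : j + 1].T @ w)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return V, T

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
A = (A + A.T) / 2                 # random symmetric test matrix
V, T = lanczos(A, rng.standard_normal(6), 6)

lam, X = np.linalg.eigh(T)        # eigenpairs of T
y = V @ X[:, 0]                   # mapped eigenvector of A
assert np.allclose(A @ y, lam[0] * y, atol=1e-8)
```

Since <math>m = n</math> here, the eigenvalues of <math>T</math> coincide with those of <math>A</math>; for <math>m < n</math> one only obtains Ritz approximations.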
# For tridiagonal matrices, there exist a number of specialised algorithms, often with better computational complexity than general-purpose algorithms. For example, if <math>T</math> is an <math>m \times m</math> symmetric tridiagonal matrix, then:
#* The [[Tridiagonal matrix#Determinant|continuant recursion]] allows computing the [[characteristic polynomial]] in <math>O(m^2)</math> operations, and evaluating it at a point in <math>O(m)</math> operations.
#* The Multiple Relatively Robust Representations (MRRR)<ref>This probably refers to the method described in: {{cite journal|last1=Dhillon|first1=Inderjit S.|last2=Parlett|first2=Beresford N.|title=Multiple representations to compute orthogonal eigenvectors of symmetric tridiagonal matrices|journal=Linear Algebra and its Applications|volume=387|year=2004|pages=1–28|issn=0024-3795|doi=10.1016/j.laa.2003.12.028}}</ref> method allows computing all eigenvalues and eigenvectors of <math>T</math> in as little as <math>O(m^2)</math> operations.
# Some general eigendecomposition algorithms, notably the [[QR algorithm]], are known to converge faster for tridiagonal matrices than for general matrices.
# Even algorithms whose convergence rates are unaffected by unitary transformations, notably the [[power method]], may in practice benefit from being applied to the tridiagonal matrix <math>T</math> rather than the original matrix <math>A</math>. Since <math>T</math> is very sparse, with all nonzero elements in highly predictable positions, it permits compact storage with excellent performance with respect to [[Cache (computing)|caching]]. Likewise, <math>T</math> is a [[real number|real]] matrix with all eigenvectors and eigenvalues real, whereas <math>A</math> in general may have complex elements and eigenvectors, so real arithmetic is sufficient for finding the eigenvectors and eigenvalues of <math>T</math>.
# If <math>n</math> is very large, then reducing <math>m</math> so that <math>T</math> is of a manageable size will still allow finding the more extreme eigenvalues and eigenvectors of <math>A</math>; in the <math>m \ll n</math> region, the Lanczos algorithm can be viewed as a [[lossy compression]] scheme for Hermitian matrices, that emphasises preserving the extreme eigenvalues.
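As a concrete illustration of the first point, the continuant recursion <math>p_0 = 1</math>, <math>p_1(x) = \alpha_1 - x</math>, <math>p_k(x) = (\alpha_k - x)\,p_{k-1}(x) - \beta_{k-1}^2\,p_{k-2}(x)</math> evaluates the characteristic polynomial <math>\det(T - xI)</math> in <math>O(m)</math> operations. This sketch (illustrative names, not from the article) checks the recursion against a dense determinant:

```python
import numpy as np

def charpoly_at(alpha, beta, x):
    """Evaluate det(T - x I) for the symmetric tridiagonal matrix T
    with diagonal alpha and off-diagonal beta, using the O(m)
    three-term continuant recursion on leading principal minors."""
    p_prev, p = 1.0, alpha[0] - x
    for k in range(1, len(alpha)):
        p_prev, p = p, (alpha[k] - x) * p - beta[k - 1] ** 2 * p_prev
    return p

alpha = np.array([2.0, 3.0, 4.0])   # diagonal of a small test T
beta = np.array([1.0, 1.0])         # off-diagonal
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
x = 1.5
assert np.isclose(charpoly_at(alpha, beta, x),
                  np.linalg.det(T - x * np.eye(3)))
```

Evaluating the polynomial cheaply at arbitrary points is what makes bisection-type eigenvalue searches on <math>T</math> practical.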
The combination of good performance for sparse matrices and the ability to compute several (but not all) eigenvalues are the main reasons for choosing to use the Lanczos algorithm.
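The compact-storage point can be made concrete: a symmetric tridiagonal <math>T</math> needs only two arrays (diagonal and off-diagonal), and a matrix–vector product costs <math>O(m)</math> with contiguous memory access. The following sketch (illustrative, not from the article) runs the power method on <math>T</math> in this representation:

```python
import numpy as np

def tridiag_matvec(alpha, beta, v):
    """Compute T @ v with T stored only as its diagonal (alpha) and
    off-diagonal (beta): O(m) time, cache-friendly contiguous access."""
    y = alpha * v
    y[:-1] += beta * v[1:]
    y[1:] += beta * v[:-1]
    return y

# Power iteration on T converges to its eigenvalue of largest magnitude.
alpha = np.array([2.0, 3.0, 4.0])
beta = np.array([1.0, 1.0])
v = np.ones(3) / np.sqrt(3)
for _ in range(200):
    w = tridiag_matvec(alpha, beta, v)
    v = w / np.linalg.norm(w)
theta = v @ tridiag_matvec(alpha, beta, v)   # Rayleigh quotient

T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
assert np.isclose(theta, np.max(np.abs(np.linalg.eigvalsh(T))))
```

Note that everything here is real arithmetic, even when the original <math>A</math> was complex Hermitian.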