Jacobi eigenvalue algorithm

 
== Description ==
Let ''S'' be a symmetric matrix, and ''G'' = ''G''(''i'',''j'',''θ'') be a [[Givens rotation|Givens rotation matrix]]. Then:
 
:<math>S'=G^\top S G \, </math>
 
is symmetric and [[similar matrix|similar]] to ''S''.
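This claim can be checked numerically. The following NumPy sketch (the helper name <code>givens</code> and the 3&times;3 test matrix are illustrative, not part of the algorithm) builds ''G'' and confirms that ''G''<sup>T</sup>''SG'' is symmetric with the same eigenvalues as ''S'':

```python
import numpy as np

def givens(n, i, j, theta):
    """Givens rotation G(i, j, theta): identity except in rows/columns i and j.

    Sign convention: G[i, i] = G[j, j] = cos(theta), G[i, j] = sin(theta),
    G[j, i] = -sin(theta), matching the entry formulas in this section.
    """
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = G[j, j] = c
    G[i, j], G[j, i] = s, -s
    return G

# A small symmetric test matrix (values chosen arbitrarily for illustration).
S = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.5],
              [2.0, 0.5, 1.0]])

G = givens(3, 0, 1, 0.3)
Sp = G.T @ S @ G  # S' = G^T S G

# S' is symmetric, and similarity preserves the eigenvalues.
assert np.allclose(Sp, Sp.T)
assert np.allclose(np.linalg.eigvalsh(Sp), np.linalg.eigvalsh(S))
```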
 
Furthermore, ''S&prime;'' has entries:
 
:<math>\begin{align}
S'_{ii} &= c^2\, S_{ii} - 2\, s c \,S_{ij} + s^2\, S_{jj} \\
S'_{jj} &= s^2 \,S_{ii} + 2 s c\, S_{ij} + c^2 \, S_{jj} \\
S'_{ij} &= S'_{ji} = (c^2 - s^2 ) \, S_{ij} + s c \, (S_{ii} - S_{jj} ) \\
S'_{ik} &= S'_{ki} = c \, S_{ik} - s \, S_{jk} & k \ne i,j \\
S'_{jk} &= S'_{kj} = s \, S_{ik} + c \, S_{jk} & k \ne i,j \\
S'_{kl} &= S_{kl} &k,l \ne i,j
\end{align}</math>
 
where ''s''&nbsp;=&nbsp;sin(''&theta;'') and ''c''&nbsp;=&nbsp;cos(''&theta;'').
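Because a rotation touches only rows and columns ''i'' and ''j'', these formulas allow ''S''&prime; to be computed with O(''n'') work instead of two full matrix products. A NumPy sketch of this in-place update (the function name <code>rotate_entries</code> is hypothetical), verified against the explicit product ''G''<sup>T</sup>''SG'':

```python
import numpy as np

def rotate_entries(S, i, j, theta):
    """Overwrite S with S' = G^T S G using the entry formulas (O(n) work)."""
    c, s = np.cos(theta), np.sin(theta)
    Sii, Sjj, Sij = S[i, i], S[j, j], S[i, j]
    for k in range(S.shape[0]):
        if k != i and k != j:
            Sik, Sjk = S[i, k], S[j, k]
            S[i, k] = S[k, i] = c * Sik - s * Sjk
            S[j, k] = S[k, j] = s * Sik + c * Sjk
    S[i, i] = c**2 * Sii - 2 * s * c * Sij + s**2 * Sjj
    S[j, j] = s**2 * Sii + 2 * s * c * Sij + c**2 * Sjj
    S[i, j] = S[j, i] = (c**2 - s**2) * Sij + s * c * (Sii - Sjj)
    return S

# Check against the explicit product G^T S G on a small example.
S = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.5],
              [2.0, 0.5, 1.0]])
theta = 0.3
c, s = np.cos(theta), np.sin(theta)
G = np.eye(3)
G[0, 0] = G[1, 1] = c
G[0, 1], G[1, 0] = s, -s
assert np.allclose(rotate_entries(S.copy(), 0, 1, theta), G.T @ S @ G)
```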
 
Since ''G'' is orthogonal, ''S'' and ''S''&prime; have the same [[Frobenius norm]] ||·||<sub>F</sub> (the square root of the sum of squares of all entries); however, we can choose ''&theta;'' such that ''S''&prime;<sub>''ij''</sub>&nbsp;=&nbsp;0, in which case ''S''&prime; has a larger sum of squares on the diagonal:
 
:<math> S'_{ij} = \cos(2\theta) S_{ij} + \tfrac{1}{2} \sin(2\theta) (S_{ii} - S_{jj}) </math>
 
Setting this equal to zero and rearranging gives:
 
:<math> \tan(2\theta) = \frac{2 S_{ij}}{S_{jj} - S_{ii}} </math>
 
If <math> S_{jj} = S_{ii} </math>, then
 
:<math> \theta = \frac{\pi} {4} </math>
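This choice of ''&theta;'' can be sketched in NumPy (the function name <code>rotation_angle</code> is hypothetical; <code>arctan2</code> is used so the division is safe near <math>S_{jj} = S_{ii}</math>):

```python
import numpy as np

def rotation_angle(S, i, j):
    """Angle theta that zeroes S'_{ij}: tan(2*theta) = 2 S_ij / (S_jj - S_ii)."""
    if S[i, i] == S[j, j]:
        return np.pi / 4  # special case from the text
    return 0.5 * np.arctan2(2 * S[i, j], S[j, j] - S[i, i])

# Demonstration on a 2x2 symmetric matrix: the rotation zeroes S'_{01}.
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])
theta = rotation_angle(S, 0, 1)
c, s = np.cos(theta), np.sin(theta)
G = np.array([[c, s],
              [-s, c]])
Sp = G.T @ S @ G
assert abs(Sp[0, 1]) < 1e-12
```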
 
In order to optimize this effect, ''S''<sub>''ij''</sub> should be the off-diagonal element with the largest magnitude, called the ''pivot''.
 
The Jacobi eigenvalue method repeatedly performs such rotations until the matrix becomes almost diagonal. The diagonal entries are then approximations of the (real) eigenvalues of ''S''.
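Putting the pieces together, the classical method can be sketched as follows (a minimal NumPy illustration, not a production implementation; it forms the full product ''G''<sup>T</sup>''SG'' each step rather than the O(''n'') entry updates):

```python
import numpy as np

def jacobi_eigenvalues(S, tol=1e-12, max_rotations=10000):
    """Classical Jacobi method: repeatedly zero the pivot (largest
    off-diagonal entry in magnitude) until S is almost diagonal."""
    S = S.astype(float).copy()
    n = S.shape[0]
    for _ in range(max_rotations):
        off = np.abs(S - np.diag(np.diag(S)))
        i, j = np.unravel_index(np.argmax(off), off.shape)  # the pivot
        if off[i, j] < tol:
            break  # almost diagonal
        if S[i, i] == S[j, j]:
            theta = np.pi / 4
        else:
            theta = 0.5 * np.arctan2(2 * S[i, j], S[j, j] - S[i, i])
        c, s = np.cos(theta), np.sin(theta)
        G = np.eye(n)
        G[i, i] = G[j, j] = c
        G[i, j], G[j, i] = s, -s
        S = G.T @ S @ G  # zeroes S[i, j] exactly (up to roundoff)
    return np.sort(np.diag(S))

# Sanity check against NumPy's symmetric eigensolver.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
M = (A + A.T) / 2
evals = jacobi_eigenvalues(M)
assert np.allclose(evals, np.linalg.eigvalsh(M))
```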
 
== Convergence ==