=== As inverse power iteration ===
All stages of the Jenkins–Traub complex algorithm may be represented as the linear algebra problem of determining the eigenvalues of a special matrix. This matrix is the coordinate representation of a linear map in the ''n''-dimensional space of polynomials of degree ''n-1'' or less. The principal idea of this map is to interpret the factorization
:<math>P(X)=(X-\alpha_1)\cdot P_1(X)</math>
with a root <math>\alpha_1\in\C</math> and <math>P_1(X)</math> the remaining factor of degree ''n-1'' as the eigenvector equation for the multiplication with the variable ''X'', followed by remainder computation with divisor ''P(X)'',
:<math>M_X(H)=(X\cdot H(X)) \bmod P(X)\,.</math>
This maps polynomials of degree at most ''n-1'' to polynomials of degree at most ''n-1''. The eigenvalues of this map are the roots of ''P(X)'', since the eigenvector equation reads
:<math>0=(M_X-\alpha\cdot \operatorname{id})(H)=((X-\alpha)\cdot H(X)) \bmod P(X)\,,</math>
which implies that <math>(X-\alpha)\cdot H(X)=C\cdot P(X)</math> for some constant <math>C</math>, that is, that <math>(X-\alpha)</math> is a linear factor of ''P(X)''. In the monomial basis the linear map <math>M_X</math> is represented by a [[companion matrix]] of the polynomial ''P'': as
:<math> M_X(H)=\sum_{m=1}^{n-1}(H_{m-1}-P_{m}H_{n-1})X^m-P_0H_{n-1}\,,</math>
:<math>A=\begin{pmatrix}
0 & 0 & \dots & 0 & -P_0 \\
1 & 0 & \dots & 0 & -P_1 \\
0 & 1 & \dots & 0 & -P_2 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & \dots & 1 & -P_{n-1} \\
\end{pmatrix}\,.</math>
To this matrix the [[inverse power iteration]] is applied in the three stages of the algorithm, in the three variants of no shift, constant shift and generalized Rayleigh shift. It is more efficient to perform the linear algebra operations in polynomial arithmetic rather than by matrix operations, but the properties of the inverse power iteration remain the same.
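The correspondence between the roots of ''P'' and the eigenvalues of ''A'' can be checked numerically. The following minimal sketch is only an illustration of this correspondence; the cubic <math>P(X)=(X-1)(X-2)(X-3)</math> and the use of NumPy are arbitrary choices, not part of the algorithm.
<syntaxhighlight lang="python">
import numpy as np

# Illustrative example: P(X) = X^3 - 6X^2 + 11X - 6 = (X-1)(X-2)(X-3).
# Coefficients P_0, ..., P_{n-1} of the monic polynomial:
P = np.array([-6.0, 11.0, -6.0])
n = len(P)

# Companion matrix A of P in the monomial basis, as displayed above:
# ones on the subdiagonal, -P_0 ... -P_{n-1} in the last column.
A = np.zeros((n, n))
A[1:, :-1] = np.eye(n - 1)
A[:, -1] = -P

print(np.sort(np.linalg.eigvals(A).real))  # ≈ [1. 2. 3.], the roots of P
</syntaxhighlight>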
''Stage 1:'' Here the inverse power iteration is applied without shift,
:<math>H^{k+1}=M_X{}^{-1}(H^k)</math> resp. <math>\bar H^{k+1}=\frac{M_X{}^{-1}(\bar H^k)}{\lVert M_X{}^{-1}(\bar H^k)\rVert}</math>
in the unnormalized resp. normalized version.
''Stage 2:'' Here the inverse power iteration is applied with the constant shift ''s'',
:<math>H^{k+1}=(M_X-s\cdot \operatorname{id})^{-1}(H^k)\,.</math>
The resulting estimate <math>q_k</math> of <math>\alpha_1-s</math> converges linearly,
:<math>\alpha_1-s-q_k=O\left(\left(\tfrac{|\alpha_1-s|}{|\alpha_2-s|}\right)^k\right)\,,</math>
the faster the smaller the quotient <math>\tfrac{|\alpha_1-s|}{|\alpha_2-s|}</math> is, where <math>|\alpha_2-s|=\min_{j=2,\dots,n}|\alpha_j-s|</math>.
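A minimal numerical sketch of this constant-shift stage in matrix form, reusing the illustrative cubic from above; the shift value 0.9 (close to the root 1) and the iteration count are arbitrary choices.
<syntaxhighlight lang="python">
import numpy as np

# Companion matrix of the illustrative cubic P(X) = (X-1)(X-2)(X-3)
P = np.array([-6.0, 11.0, -6.0])
n = len(P)
A = np.zeros((n, n)); A[1:, :-1] = np.eye(n - 1); A[:, -1] = -P

s = 0.9                  # constant shift, chosen close to the root alpha_1 = 1
H = np.ones(n)           # arbitrary start vector
for k in range(8):
    H = np.linalg.solve(A - s * np.eye(n), H)   # inverse power step with shift s
    H /= np.linalg.norm(H)
    q = H @ (A - s * np.eye(n)) @ H             # Rayleigh-type estimate of alpha_1 - s
    print(k, s + q)      # converges linearly to 1, roughly by a factor |1-s|/|2-s| per step
</syntaxhighlight>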
''Stage 3:'' Here the shift is updated every step using
:<math>H^{k+1}=(M_X-s_k\cdot \operatorname{id})^{-1}(H^k)\quad\text{and}\quad H_1^{k+1}=(M_X-s_k\cdot \operatorname{id})^{-1}(H^{k+1})</math>
to obtain an update to <math>s_k</math> from the growth factor between <math>H^{k+1}</math> and <math>H_1^{k+1}</math>,
:<math>s_{k+1}=s_k+\frac{L(H^{k+1})}{L(H_1^{k+1})}\,,</math>
with some linear functional ''L''. If <math>\delta_k</math> denotes the distance of the normalized iteration vector <math>\bar H^k</math> from the eigenspace of <math>\alpha_1</math>, then one gets for the convergence speed the recursions
:<math>\alpha_1-s_{k+1}=O(\delta_k\,|\alpha_1-s_k|^2)</math> and <math>\delta_{k+1}=O(\delta_k\cdot |\alpha_1-s_k|)</math>
resulting in an asymptotic convergence order of <math>\phi^2=1+\phi\approx 2.61</math>, where <math>\phi=\tfrac{1+\sqrt{5}}2</math> is the [[golden ratio]].
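The order can be made plausible from these recursions as follows: writing <math>a_k=-\log|\alpha_1-s_k|</math> and <math>b_k=-\log\delta_k</math> and ignoring the constants hidden in the ''O''-terms, the recursions turn into the linear system
:<math>\begin{pmatrix}a_{k+1}\\ b_{k+1}\end{pmatrix}\approx\begin{pmatrix}2 & 1\\ 1 & 1\end{pmatrix}\begin{pmatrix}a_k\\ b_k\end{pmatrix}\,,</math>
and the dominant eigenvalue of this matrix, <math>\tfrac{3+\sqrt5}2=1+\phi=\phi^2</math>, is the asymptotic factor by which the number of correct digits of <math>s_k</math> grows in each step.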
In every stage, one step of the inverse power iteration requires inverting the map in polynomial arithmetic, that is, solving
:<math>H^k(X)=(M_X-s_k\cdot \operatorname{id})(H^{k+1})=((X-s_k)\cdot H^{k+1}(X)) \bmod P(X)</math>
for <math>H^{k+1}</math> (with <math>s_k=0</math> in the first and <math>s_k=s</math> in the second stage). To solve the polynomial identity representing the inverse power iteration, one has to find a factor <math>Q(X)</math> such that
:<math>H^k(X)=(X-s_k)\cdot H^{k+1}(X)+Q(X)\cdot P(X)\,.</math>
By comparing degrees, <math>Q(X)</math> is a constant; evaluating at <math>X=s_k</math> gives <math>Q(s_k)=\tfrac{H^k(s_k)}{P(s_k)}</math> and therefore
:<math>H^{k+1}(X)
=\frac1{X-s_k}\,\bigl(H^k(X)-Q(s_k)\,P(X)\bigr)
=\frac1{X-s_k}\,\left(H^k(X)-\tfrac{H^k(s_k)}{P(s_k)}\,P(X)\right)\,,
</math>
which is the unnormalized iteration for the polynomial sequence <math>(H^k(X))_{k\in\N}</math>. Comparing the leading coefficients (LC) of this sequence one finds that
:<math>LC(H^{k+1})=-Q(s_k)
=-\frac{H^k(s_k)}{P(s_k)}
=-LC(H^k)\,\tfrac{\bar H^k(s_k)}{P(s_k)}\,,
</math>
such that the iteration for the normalized ''H'' polynomials reads as
:<math>\bar H^{k+1}(X)
=\frac1{X-s_k}\,(P(X)-H^k(X)/Q(s_k))
=\frac1{X-s_k}\,\left(P(X)-\tfrac{P(s_k)}{\bar H^k(s_k)}\,\bar H^k(X)\right)\,.
</math>
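This normalized step can be sketched directly in polynomial arithmetic. In the sketch below the helper name <code>h_step</code>, the test cubic, the fixed shift 0.9 and the use of NumPy's <code>Polynomial</code> class are illustrative choices, not taken from the published implementation.
<syntaxhighlight lang="python">
import numpy as np
from numpy.polynomial import Polynomial

def h_step(P, Hbar, s):
    """One normalized H-step:
    Hbar_{k+1}(X) = (P(X) - P(s)/Hbar_k(s) * Hbar_k(X)) / (X - s)."""
    numerator = P - (P(s) / Hbar(s)) * Hbar
    quotient, remainder = divmod(numerator, Polynomial([-s, 1.0]))
    # remainder is ~0 because the numerator vanishes at X = s by construction
    return quotient

P = Polynomial([-6.0, 11.0, -6.0, 1.0])   # (X-1)(X-2)(X-3)
Hbar = P.deriv() / 3.0                    # H^0 = P', scaled to be monic
s = 0.9                                   # fixed shift, as in the second stage
for k in range(5):
    Hbar = h_step(P, Hbar, s)
    print(k, s - P(s) / Hbar(s))          # this quantity reappears in the shift update below;
                                          # it approaches the root 1
</syntaxhighlight>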
In the additional evaluation of the third stage one only needs the fraction of the leading coefficients,
:<math>\frac{LC(H_1^{k+1})}{LC(H^{k+1})}
=-\frac{H^{k+1}(s_k)}{P(s_k)}\,\frac1{LC(H^{k+1})}
=-\frac{\bar H^{k+1}(s_k)}{P(s_k)}\,,
</math>
from which the update formula
:<math>s_{k+1}
=s_k+\tfrac{LC(H^{k+1}(X))}{LC(H_1^{k+1}(X))}
=s_k-\tfrac{P(s_k)}{\bar H^{k+1}(s_k)}
</math> results.
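Coupling this update with the variable-shift ''H'' step gives a minimal sketch of the third stage; the test cubic and the starting shift are again illustrative choices, not the original Fortran code.
<syntaxhighlight lang="python">
import numpy as np
from numpy.polynomial import Polynomial

P = Polynomial([-6.0, 11.0, -6.0, 1.0])   # (X-1)(X-2)(X-3)
Hbar = P.deriv()                          # H^0 = P'; the scale cancels in the formulas
s = 0.9                                   # starting shift, e.g. produced by stages 1 and 2
for k in range(6):
    # variable-shift H-step: Hbar_{k+1} = (P - P(s_k)/Hbar_k(s_k) * Hbar_k) / (X - s_k)
    Hbar = divmod(P - (P(s) / Hbar(s)) * Hbar, Polynomial([-s, 1.0]))[0]
    s = s - P(s) / Hbar(s)                # update s_{k+1} = s_k - P(s_k)/Hbar_{k+1}(s_k)
    print(k, s)                           # rapid convergence to the root 1 (order phi^2 ≈ 2.61)
</syntaxhighlight>
In this sketch the printed shifts settle to the root 1 within a few steps, illustrating the super-quadratic convergence of the third stage.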
==Real coefficients==