'''Sidi's generalized secant method''' is a [[root-finding algorithm]], that is, a [[numerical method]] for solving [[equations]] of the form <math>f(x)=0</math>. The method was published by [[Avram Sidi]].<ref>
Sidi, Avram, "Generalization Of The Secant Method For Nonlinear Equations", Applied Mathematics E-notes '''8''' (2008), 115–123, http://www.math.nthu.edu.tw/~amen/2008/070227-1.pdf
</ref><ref>
The home page of Avraham Sidi at the Israel Institute of Technology is at [http://www.cs.technion.ac.il/people/asidi/ Avraham Sidi]
</ref>
The method is a generalization of the [[secant method]]. Like the secant method, it is an [[iterative method]] which requires one evaluation of <math>f</math> in each iteration and no [[derivative]]s of <math>f</math>. The method can converge much faster, however, with an [[Rate of convergence|order]] which approaches 2 provided that <math>f</math> satisfies the regularity conditions described below.
== Algorithm ==
We call <math>\alpha</math> the root of <math>f</math>, that is, <math>f(\alpha)=0</math>. Sidi's method is an [[iterative method]] which generates a [[sequence]] <math>\{ x_i \}</math> of approximations of <math>\alpha</math>. Starting with ''k'' + 1 initial approximations <math>x_1 , \dots , x_{k+1}</math>, the approximation <math>x_{k+2}</math> is calculated in the first iteration, the approximation <math>x_{k+3}</math> is calculated in the second iteration, etc. Each iteration takes as input the last ''k'' + 1 approximations and the value of <math>f</math> at those approximations. Hence the ''n''th iteration takes as input the approximations <math>x_n , \dots , x_{n+k}</math> and the values <math>f(x_n) , \dots , f(x_{n+k})</math>.
The number ''k'' must be 1 or larger: ''k'' = 1, 2, 3, .... It remains fixed during the execution of the algorithm. In order to obtain the starting approximations <math>x_1 , \dots , x_{k+1}</math> one could carry out a few initializing iterations with a lower value of ''k''.
The approximation <math>x_{n+k+1}</math> is calculated as follows in the ''n''th iteration. An [[Polynomial interpolation|interpolating polynomial]] <math>p_{n,k} (x)</math> of [[Degree of a polynomial|degree]] ''k'' is fitted to the ''k'' + 1 points <math>(x_n, f(x_n)), \dots , (x_{n+k}, f(x_{n+k}))</math>. With this polynomial, the next approximation <math>x_{n+k+1}</math> of <math>\alpha</math> is calculated as
{{NumBlk|:|<math> x_{n+k+1} = x_{n+k} - \frac{f(x_{n+k})}{p_{n,k}'(x_{n+k})}</math>|{{EquationRef|1}}}}
with <math>p_{n,k}'(x_{n+k})</math> the derivative of <math>p_{n,k}</math> at <math>x_{n+k}</math>. Having calculated <math>x_{n+k+1}</math> one calculates <math>f(x_{n+k+1})</math> and the algorithm can continue with the (''n'' + 1)th iteration. Clearly, this method requires the function <math>f</math> to be evaluated only once per iteration; it requires [[Derivative-free optimization|no derivatives]] of <math>f</math>.
The iterative cycle is stopped if an appropriate stopping criterion is met. Typically the criterion is that the last calculated approximation is close enough to the sought-after root <math>\alpha</math>.
To execute the algorithm effectively, Sidi's method calculates the interpolating polynomial <math>p_{n,k} (x)</math> in its [[Newton polynomial|Newton form]].
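The iteration described above can be sketched in a few lines of Python. This is a minimal illustration, not Sidi's own implementation; the function names (<code>divided_differences</code>, <code>sidi</code>) and the stopping rule (a small residual) are choices made here. The divided-difference table yields the Newton-form coefficients of <math>p_{n,k}</math>, whose derivative is then evaluated at the latest node.

```python
def divided_differences(xs, ys):
    """Newton-form coefficients c[j] = f[x_0, ..., x_j] (in-place scheme)."""
    n = len(xs)
    c = list(ys)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            c[i] = (c[i] - c[i - 1]) / (xs[i] - xs[i - j])
    return c

def newton_poly_derivative(xs, c, t):
    """Derivative at t of p(x) = sum_j c[j] * prod_{i<j} (x - xs[i])."""
    deriv = 0.0
    for j in range(1, len(c)):
        s = 0.0
        for m in range(j):  # product rule over the j factors
            prod = 1.0
            for i in range(j):
                if i != m:
                    prod *= t - xs[i]
            s += prod
        deriv += c[j] * s
    return deriv

def sidi(f, xs, tol=1e-12, max_iter=50):
    """Sidi's generalized secant method; xs holds the k+1 starting approximations."""
    xs = list(xs)
    ys = [f(x) for x in xs]
    for _ in range(max_iter):
        c = divided_differences(xs, ys)
        t = xs[-1]
        x_new = t - ys[-1] / newton_poly_derivative(xs, c, t)  # equation (1)
        xs = xs[1:] + [x_new]   # keep only the last k+1 approximations
        ys = ys[1:] + [f(x_new)]
        if abs(ys[-1]) < tol:
            break
    return xs[-1]
```

For example, <code>sidi(lambda x: x**3 - 2.0, [1.0, 1.2, 1.4])</code> runs the ''k'' = 2 variant and converges to the cube root of 2.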
== Convergence ==
Sidi showed that if the function <math>f</math> is (''k'' + 1)-times [[Smooth function|continuously differentiable]] in an [[open interval]] <math>I</math> containing <math>\alpha</math> (that is, <math>f \in C^{k+1} (I)</math>), <math>\alpha</math> is a simple root of <math>f</math> (that is, <math>f'(\alpha) \neq 0</math>) and the initial approximations <math>x_1 , \dots , x_{k+1}</math> are chosen close enough to <math>\alpha</math>, then the sequence <math>\{ x_i \}</math> converges to <math>\alpha</math>, meaning that the following [[Limit of a sequence|limit]] holds: <math>\lim\limits_{n \to \infty} x_n = \alpha</math>.
Sidi furthermore showed that
:<math> \lim\limits_{n \to \infty} \frac{x_{n+1}-\alpha}{\prod^k_{i=0}(x_{n-i}-\alpha)} = L = \frac{(-1)^{k+1}}{(k+1)!}\frac{f^{(k+1)}(\alpha)}{f'(\alpha)}, </math>
and that the sequence [[Rate of convergence|converges]] to <math>\alpha</math> of order <math>\psi_k</math>, i.e.
:<math> \lim\limits_{n \to \infty} \frac{|x_{n+1}-\alpha|}{|x_n-\alpha|^{\psi_k}} = |L|^{(\psi_k-1)/k} </math>
The order of convergence <math>\psi_k</math> is the [[Descartes's rule of signs|only positive root]] of the polynomial
:<math> s^{k+1} - s^k - s^{k-1} - \dots - s - 1 </math>
We have e.g. <math>\psi_1 = (1+\sqrt{5})/2</math> ≈ 1.6180, <math>\psi_2</math> ≈ 1.8393 and <math>\psi_3</math> ≈ 1.9276. The order approaches 2 from below as ''k'' becomes large: <math> \lim\limits_{k \to \infty} \psi_k = 2</math>.
<ref name="mullertraub">
Traub, J.F., "Iterative Methods for the Solution of Equations", Prentice Hall, Englewood Cliffs, N.J. (1964)
</ref>
<ref name="muller">
Muller, David E., "A Method for Solving Algebraic Equations Using an Automatic Computer", Mathematical Tables and Other Aids to Computation '''10''' (1956), 208–215
</ref>
Sidi's method has also been discussed
<ref>
Soleymani F. and Hosseinabadi V., "New Third- and Sixth-Order Derivative-Free Techniques for Nonlinear Equations", Journal of Mathematics Research '''3''' (2011), 107–112
</ref>
in the context of an "efficiency index". This index weighs the convergence order of a method against the number of function evaluations per iterative step, the number being 1 in Sidi's case.
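The orders <math>\psi_k</math> quoted in the previous section can be checked numerically. The sketch below (the helper names <code>psi</code> and <code>p</code> are choices made here, not from the literature) finds the positive root of <math>s^{k+1} - s^k - \dots - s - 1</math> by bisection on [1, 2], where the polynomial is negative at 1 and equals 1 at 2, so it changes sign exactly once:

```python
def psi(k, tol=1e-12):
    """Positive root of s^(k+1) - s^k - ... - s - 1, by bisection on [1, 2]."""
    def p(s):
        return s ** (k + 1) - sum(s ** i for i in range(k + 1))
    lo, hi = 1.0, 2.0   # p(1) = -k < 0, p(2) = 1 > 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if p(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For instance, <code>psi(1)</code> returns the golden ratio (1+&radic;5)/2 ≈ 1.6180, and <code>psi(2)</code> and <code>psi(3)</code> reproduce 1.8393 and 1.9276.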
== Related algorithms ==
Sidi's method reduces to the [[secant method]] if we take ''k'' = 1. In this case the polynomial <math>p_{n,1} (x)</math> is the linear interpolant of <math>f</math> through the points <math>(x_n, f(x_n))</math> and <math>(x_{n+1}, f(x_{n+1}))</math>, which is used in the ''n''th iteration of the secant method.
We can expect that the larger we choose ''k'', the better <math>p_{n,k} (x)</math> approximates <math>f(x)</math> around <math>x=\alpha</math>, and hence the better <math>p_{n,k}' (x)</math> approximates <math>f'(x)</math> there. If we replace <math>p_{n,k}'</math> with <math>f'</math> in ({{EquationNote|1}}) we obtain that the next approximation in each iteration is calculated as
{{NumBlk|:|<math> x_{n+k+1} = x_{n+k} - \frac{f(x_{n+k})}{f'(x_{n+k})} </math>|{{EquationRef|2}}}}
This is the [[Newton's method|Newton–Raphson method]]. It starts off with a single approximation <math>x_1</math>, so we can take ''k'' = 0 in ({{EquationNote|2}}). It does not require an interpolating polynomial but instead one has to evaluate the derivative <math>f'</math> in each iteration. Depending on the nature of <math>f</math> this may not be possible or practical.
Once the interpolating polynomial <math>p_{n,k} (x)</math> has been calculated, one can also calculate the next approximation <math>x_{n+k+1}</math> as a solution of <math>p_{n,k} (x)=0</math> instead of using ({{EquationNote|1}}). For ''k'' = 1 these two methods are identical: it is the [[secant method]]. For ''k'' = 2 this method is known as [[Muller's method]].<ref name="muller"/> For ''k'' = 3 this approach involves finding the roots of a [[cubic function]], which is unattractively complicated. This problem becomes worse for even larger values of ''k''. An additional complication is that the equation <math>p_{n,k} (x)=0</math> will in general have [[Properties of polynomial roots|multiple solutions]] and a prescription has to be given which of these solutions is the next approximation <math>x_{n+k+1}</math>. Muller does this for the case ''k'' = 2 but no such prescriptions appear to exist for ''k'' > 2.
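For the ''k'' = 2 case, one iteration of the root-of-interpolant approach can be sketched as follows. This is an illustrative version, not Muller's original formulation: complex arithmetic handles a negative discriminant, and the larger-magnitude denominator in the quadratic formula selects the root nearest the latest approximation (which also avoids cancellation).

```python
import cmath

def muller_step(f, x0, x1, x2):
    """Fit a quadratic through (x0, f(x0)), (x1, f(x1)), (x2, f(x2));
    return its root nearest to x2."""
    h1, h2 = x1 - x0, x2 - x1
    d1 = (f(x1) - f(x0)) / h1
    d2 = (f(x2) - f(x1)) / h2
    a = (d2 - d1) / (h2 + h1)   # leading coefficient of the quadratic
    b = a * h2 + d2             # derivative of the quadratic at x2
    c = f(x2)                   # value of the quadratic at x2
    disc = cmath.sqrt(b * b - 4 * a * c)   # may be complex
    denom = b + disc if abs(b + disc) >= abs(b - disc) else b - disc
    return x2 - 2 * c / denom

def muller(f, x0, x1, x2, tol=1e-12, max_iter=50):
    """Iterate muller_step until the residual |f(x)| is below tol."""
    for _ in range(max_iter):
        x_new = muller_step(f, x0, x1, x2)
        if abs(f(x_new)) < tol:
            return x_new
        x0, x1, x2 = x1, x2, x_new
    return x_new
```

Called as <code>muller(lambda x: x*x - 2.0, 0.0, 1.0, 2.0)</code>, the quadratic interpolant coincides with <math>f</math> and the first step already lands on &radic;2; note that the result may carry a zero imaginary part because of the complex square root.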
== References ==
<references/>
{{root-finding algorithms}}
[[Category:Root-finding algorithms]]
{{AFC submission|||ts=20130316135145|u=Mjpnijmeijer|ns=2}}