Steffensen's method: Difference between revisions

m comment on convergence
Line 12:
:<math>g(x_n) = \frac{f(x_n + f(x_n)) - f(x_n)}{f(x_n)}</math>
 
The function <math>g</math> is the average slope of the function &fnof; between the last sequence point <math>(x,y)=( x_n,\ f(x_n) )</math> and the auxiliary point <math>(x,y)=( x_n + h,\ f(x_n + h) )</math>, with the step <math>h=f(x_n)\ </math>&nbsp;. It is only for the purpose of finding this auxiliary point that the value of the function <math>f</math> must be an adequate correction to get closer to its own solution. For all other parts of the calculation, Steffensen's method only requires the function <math>f</math> to be continuous and to actually have a nearby solution. Several modest modifications of the step <math>h</math> in the slope calculation <math>g</math> exist to accommodate functions <math>f</math> that do not quite meet this requirement.
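As an illustration (a sketch, not part of the article text), the slope <math>g</math> above can be computed directly from <math>f</math>, using the step <math>h=f(x_n)</math> to form the auxiliary point:

```python
def slope_g(f, x):
    """Average slope of f between x and the auxiliary point x + f(x)."""
    fx = f(x)  # the step h = f(x)
    # divided difference (f(x + h) - f(x)) / h with h = f(x)
    return (f(x + fx) - fx) / fx

# Example (illustrative): f(x) = x**2 - 2 at x = 1.5
# h = f(1.5) = 0.25, auxiliary point 1.75,
# slope = (f(1.75) - f(1.5)) / 0.25 = (1.0625 - 0.25) / 0.25 = 3.25
slope = slope_g(lambda x: x * x - 2, 1.5)
```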
 
The main advantage of Steffensen's method is that it can find the roots of an equation <math>f</math> just as "[[quadratic convergence|quickly]]" as [[Newton's method]], but the formula does not require a separate function for the derivative, so it can be programmed for any generic function. Here ''[[quadratic convergence|quickly]]'' means that the number of correct digits in the answer doubles with each step. The cost of the quick convergence is the double function evaluation: both <math>f(x_n)</math> and <math>f(x_n + f(x_n))</math> must be calculated, which might be time-consuming if <math>f</math> is a complicated function.
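The full iteration can be sketched as follows (a minimal illustration, not part of the article; the tolerance and iteration cap are arbitrary choices). Each step uses exactly the two function evaluations <math>f(x_n)</math> and <math>f(x_n + f(x_n))</math> mentioned above, and no derivative:

```python
def steffensen(f, x0, tol=1e-12, max_iter=50):
    """Find a root of f near x0 using Steffensen's method."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)                 # first function evaluation
        if abs(fx) < tol:
            break
        # g: average slope of f between x and the auxiliary point x + f(x)
        g = (f(x + fx) - fx) / fx  # second function evaluation
        x = x - fx / g             # Newton-like update with g in place of f'(x)
    return x

# Example (illustrative): root of f(x) = x**2 - 2 near x0 = 1.5
root = steffensen(lambda x: x * x - 2, 1.5)
```

Starting from 1.5, the iterates approach <math>\sqrt 2 \approx 1.41421356\ldots</math>, with the number of correct digits roughly doubling per step once the iterate is close to the root.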