:<math>g(x_n) = \frac{f(x_n + f(x_n)) - f(x_n)}{f(x_n)}</math>
The function <math>g</math> is the average slope of the function ƒ between the last sequence point <math>(x,y)=( x_n,\ f(x_n) )</math> and the auxiliary point <math>(x,y)=( x_n + h,\ f(x_n + h) )</math>, with the step <math>h=f(x_n)</math>. It is only for the purpose of finding this auxiliary point that the value of the function <math>f</math> must be an adequate correction: the step <math>h=f(x_n)</math> only shrinks toward zero, making <math>g</math> a good estimate of the slope, when <math>x_n</math> is already close to the root.
The main advantage of Steffensen's method is that it can find the roots of a function <math>f</math> just as "[[quadratic convergence|quickly]]" as [[Newton's method]], but the formula does not require a separate function for the derivative, so it can be programmed for any generic function. In this case ''[[quadratic convergence|quickly]]'' means that the number of correct digits in the answer doubles with each step. The price for the quick convergence is the double function evaluation: both <math>f(x_n)</math> and <math>f(x_n + f(x_n))</math> must be calculated, which might be time-consuming if <math>f</math> is a complicated function.
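The iteration described above can be sketched in a few lines of Python; the function and variable names here are illustrative, not part of any standard library:

```python
def steffensen(f, x0, tol=1e-12, max_iter=100):
    """Find a root of f by Steffensen's method, starting from x0.

    Uses g(x) = (f(x + f(x)) - f(x)) / f(x) in place of the
    derivative in a Newton-style update. Note the two function
    evaluations per step: f(x) and f(x + f(x)).
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        # Average slope between x and the auxiliary point x + f(x)
        g = (f(x + fx) - fx) / fx
        # Newton-like step with g standing in for f'(x)
        x = x - fx / g
    return x

# Example: the positive root of f(x) = x^2 - 2 is sqrt(2)
root = steffensen(lambda x: x * x - 2, 1.0)
```

As with Newton's method, the starting guess must be reasonably close to the root for the quadratic convergence to take hold; a poor <code>x0</code> can make the auxiliary step <code>f(x)</code> too large and send the iteration astray.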