{{Short description|Method of solving equations}}
In [[numerical analysis]], '''inverse quadratic interpolation''' is a [[root-finding algorithm]], meaning that it is an algorithm for solving equations of the form ''f''(''x'') = 0. The idea is to use [[polynomial interpolation|quadratic interpolation]] to approximate the [[inverse function|inverse]] of ''f''. This algorithm is rarely used on its own, but it is important because it forms part of the popular [[Brent's method]].
:<math> f^{-1}(y) = \frac{(y-f_{n-1})(y-f_n)}{(f_{n-2}-f_{n-1})(f_{n-2}-f_n)} x_{n-2} + \frac{(y-f_{n-2})(y-f_n)}{(f_{n-1}-f_{n-2})(f_{n-1}-f_n)} x_{n-1} + \frac{(y-f_{n-2})(y-f_{n-1})}{(f_n-f_{n-2})(f_n-f_{n-1})} x_n </math>
We are looking for a root of ''f'', so we substitute ''y'' = ''f''(''x'') = 0 in this equation, which yields the recursion formula above.
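The interpolation step can be turned directly into an iteration. The following is a minimal Python sketch (the function name, tolerance, and iteration cap are illustrative, and no safeguards beyond a degeneracy check are included):

```python
def inverse_quadratic_interpolation(f, x0, x1, x2, tol=1e-12, max_iter=50):
    """Find a root of f by inverse quadratic interpolation, starting
    from three initial guesses x0, x1, x2 (the iterates x_{n-2},
    x_{n-1}, x_n in the text).  Bare sketch without safeguards."""
    for _ in range(max_iter):
        f0, f1, f2 = f(x0), f(x1), f(x2)
        # The method breaks down if any two function values coincide:
        # the Lagrange weights below would divide by zero.
        if f0 == f1 or f0 == f2 or f1 == f2:
            raise ZeroDivisionError("two function values coincide")
        # Evaluate the interpolating polynomial for f^{-1} at y = 0.
        x3 = (f1 * f2 / ((f0 - f1) * (f0 - f2)) * x0
              + f0 * f2 / ((f1 - f0) * (f1 - f2)) * x1
              + f0 * f1 / ((f2 - f0) * (f2 - f1)) * x2)
        if abs(x3 - x2) < tol:
            return x3
        x0, x1, x2 = x1, x2, x3
    return x2
```

For example, applied to ''f''(''x'') = ''x''² − 2 with starting values 1, 1.5, 2, the iterates converge rapidly to √2.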
==Behaviour==
The asymptotic behaviour is very good: generally, the iterates ''x''<sub>''n''</sub> converge fast to the root once they get close. However, performance is often quite poor if the initial values are not close to the actual root. For instance, if two of the function values ''f''<sub>''n''−2</sub>, ''f''<sub>''n''−1</sub> and ''f''<sub>''n''</sub> coincide, the algorithm fails completely because the denominators in the interpolation formula vanish. Thus, inverse quadratic interpolation is seldom used as a stand-alone algorithm.
The order of this convergence is approximately 1.84.
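The degenerate case above is why practical solvers wrap the interpolation step in a safeguard. The following Python sketch illustrates the idea; the `safeguarded_step` helper and its midpoint fallback are hypothetical, and Brent's method itself uses a more careful combination with bisection and the secant method:

```python
def safeguarded_step(f, x0, x1, x2):
    """One iteration that attempts inverse quadratic interpolation and
    falls back to a simple midpoint step when two of the three function
    values coincide.  Illustrative sketch only, not Brent's method."""
    f0, f1, f2 = f(x0), f(x1), f(x2)
    if f0 != f1 and f0 != f2 and f1 != f2:
        # All denominators are nonzero: take the interpolation step.
        return (f1 * f2 / ((f0 - f1) * (f0 - f2)) * x0
                + f0 * f2 / ((f1 - f0) * (f1 - f2)) * x1
                + f0 * f1 / ((f2 - f0) * (f2 - f1)) * x2)
    # Degenerate case: the Lagrange weights would divide by zero,
    # so fall back to the midpoint of the two most recent iterates.
    return 0.5 * (x1 + x2)
```

With ''f''(''x'') = ''x''² − 1 and the symmetric guesses −2 and 2, the two outer function values are both 3, so the interpolation step is impossible and the fallback is taken.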
==Comparison with other root-finding methods==
{{root-finding algorithms}}
[[Category:Root-finding algorithms]]