Newton's method

 
The idea of the method is as follows: one starts with an initial estimate which is reasonably close to the true root, then replaces the function by its [[tangent]] (which can be computed using the tools of [[calculus]]) and computes the x-intercept of this tangent (which is easily done with elementary algebra). This x-intercept will typically be a better approximation to the function's root, and the method can be [[iterative method|iterated]].
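A minimal sketch of this procedure in code (the name <code>newtons_method</code>, its parameters, and the stopping tolerance are illustrative choices, not a standard routine) might look like the following:

<syntaxhighlight lang="python">
def newtons_method(f, f_prime, x0, tol=1e-12, max_iterations=50):
    """Approximate a root of f by repeatedly moving to the
    x-intercept of the tangent line at the current estimate."""
    x = x0
    for _ in range(max_iterations):
        slope = f_prime(x)
        if slope == 0:
            raise ZeroDivisionError("Derivative is zero; tangent has no x-intercept.")
        x_next = x - f(x) / slope      # x-intercept of the tangent at x
        if abs(x_next - x) < tol:      # estimate has stopped changing
            return x_next
        x = x_next
    return x
</syntaxhighlight>

For example, <code>newtons_method(lambda x: x**2 - 2, lambda x: 2*x, 1.0)</code> returns an approximation of <math>\sqrt{2}</math>.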
 
[[Image:newton_iteration.png|alt=Illustration of Newton's method|thumb|right|300px|An illustration of one iteration of Newton's method. We see that <math>x_{n+1}</math> is a better approximation than <math>x_n</math> for the zero <math>x</math> of the function <math>f</math>.]]
 
 
Suppose ''f'' : [''a'', ''b''] → '''R''' is a [[derivative|differentiable]] function defined on the [[interval (mathematics)|interval]] [''a'', ''b''] with values in the [[real number]]s '''R'''.
The formula for converging on the root can be easily derived. Suppose we have a current approximation ''x''<sub>''n''</sub>; we obtain a better approximation, ''x''<sub>''n''+1</sub>, by referring to the diagram on the right. From the definition of the [[derivative]], ''f''&nbsp;'(''x''<sub>''n''</sub>) is the slope of the tangent to ''f'' at ''x''<sub>''n''</sub>, so within a linear approximation the tangent is
:<math>t(x_{n}+\delta x) = f( x_{n} )+f'(x_{n})\,\delta x</math>.
The next approximation, <math>x_{n+1}=x_{n}+\delta x</math>, is taken to be the point where this tangent intersects the horizontal axis, that is, where <math>t=0</math>. Equivalently, reading the slope of the tangent as rise over run between the points <math>(x_n, f(x_n))</math> and <math>(x_{n+1}, 0)</math>,
:<math>f'(x_{n}) = \frac{ \mathrm{rise} }{ \mathrm{run} } = \frac{ f( x_{n} ) - 0 }{ x_{n} - x_{n+1} } = \frac{0 - f(x_{n})}{x_{n+1} - x_{n}}\,\!</math>.
Solving for ''x''<sub>''n''+1</sub> gives
:<math>x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}\,\!</math>.
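For example, applying this formula to <math>f(x) = x^2 - a</math> with <math>a > 0</math>, for which <math>f'(x) = 2x</math>, gives the iteration
:<math>x_{n+1} = x_n - \frac{x_n^2 - a}{2 x_n} = \frac{1}{2}\left(x_n + \frac{a}{x_n}\right),</math>
the classical Babylonian method for approximating <math>\sqrt{a}</math> (see [[Methods of computing square roots]]).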
We start the process with some arbitrary initial value ''x''<sub>0</sub> (the closer to the zero the better; in the absence of any intuition about where the zero might lie, a "guess and check" approach can narrow the possibilities to a reasonably small interval by appealing to the [[intermediate value theorem]]). The method will usually converge provided this initial guess is close enough to the unknown zero. Furthermore, for a zero of [[multiplicity]] 1, the [[rate of convergence|convergence is at least quadratic]] in a [[neighbourhood (mathematics)|neighbourhood]] of the zero, which intuitively means that the number of correct digits roughly doubles in every step. More details can be found in the [[#Analysis|"Analysis" section]].
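The quadratic convergence can be observed numerically. The sketch below (the choice <math>f(x) = x^2 - 2</math> and the starting value <math>x_0 = 1</math> are arbitrary examples) prints the error at each step; the error roughly squares from one iteration to the next, so the number of correct digits about doubles per step.

<syntaxhighlight lang="python">
import math

def f(x):
    return x**2 - 2        # root is sqrt(2)

def f_prime(x):
    return 2 * x

x = 1.0                    # initial guess
for n in range(6):
    error = abs(x - math.sqrt(2))
    print(f"n={n}  x={x:.16f}  error={error:.2e}")
    x = x - f(x) / f_prime(x)   # one Newton step
</syntaxhighlight>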
== See also ==
* [[Integer square root]]
* [[Methods of computing square roots]]
* [[Newton's method in optimization]]
[[Category:Optimization algorithms]]
[[Category:Root-finding algorithms]]