{{DISPLAYTITLE:''n''th root algorithm}}
 
The [[Principal branch|principal]] [[nth root|''n''th root]] <math>\sqrt[n]{A}</math> of a [[negative and positive numbers|positive]] [[real number]] ''A'' is the positive real solution of the equation
:<math>x^n = A</math>
 
(for a positive integer ''n'' there are ''n'' distinct [[complex number|complex]] solutions to this equation when <math>A > 0</math>, but only one of them is positive and real).
 
There is a very fast-[[Limit of a sequence|converging]] '''''n''th root algorithm''' for finding <math>\sqrt[n]{A}</math>:
#Make an initial guess <math>x_0</math>
#Set <math>x_{k+1} = \frac{1}{n} \left[{(n-1)x_k +\frac{A}{x_k^{n-1}}}\right]</math>. In practice, this is computed as <math>\Delta x_k = \frac{1}{n} \left[{\frac{A}{x_k^{n-1}}} - x_k\right];\quad x_{k+1} = x_{k} + \Delta x_k</math>.
#Repeat step 2 until the desired precision is reached, i.e. until <math>|\Delta x_k| < \epsilon</math> (a short implementation sketch is given below).
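
In [[Python (programming language)|Python]], the iteration can be sketched as follows; the function name <code>nth_root</code>, the tolerance <code>eps</code> and the iteration cap <code>max_iter</code> are illustrative choices rather than part of the algorithm itself.

<syntaxhighlight lang="python">
def nth_root(A, n, x0=None, eps=1e-12, max_iter=100):
    """Approximate the principal n-th root of a positive real A."""
    if A <= 0 or n < 1:
        raise ValueError("A must be positive and n a positive integer")
    x = float(A) if x0 is None else float(x0)   # step 1: initial guess
    for _ in range(max_iter):
        delta = (A / x ** (n - 1) - x) / n      # step 2: Delta x_k
        x += delta                              #         x_{k+1} = x_k + Delta x_k
        if abs(delta) < eps:                    # step 3: desired precision reached
            break
    return x
</syntaxhighlight>

For example, <code>nth_root(2.0, 2)</code> returns approximately 1.41421356 and <code>nth_root(27.0, 3)</code> returns approximately 3.0.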
 
A special case is the familiar [[Methods_of_computing_square_roots#Babylonian_method|square-root algorithm]]. Setting ''n'' = 2, the ''iteration rule'' in step 2 becomes the square-root iteration rule:
:<math>x_{k+1} = \frac{1}{2}\left(x_k + \frac{A}{x_k}\right)</math>
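
For example, computing <math>\sqrt{2}</math> from the initial guess <math>x_0 = 1</math> gives
:<math>x_1 = \tfrac{1}{2}\left(1 + \tfrac{2}{1}\right) = 1.5, \qquad x_2 = \tfrac{1}{2}\left(1.5 + \tfrac{2}{1.5}\right) \approx 1.41667, \qquad x_3 \approx 1.4142157,</math>
already within <math>3 \times 10^{-6}</math> of <math>\sqrt{2} \approx 1.4142136</math>.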
 
Several different derivations of this algorithm are possible. One derivation shows it is a special case of [[Newton's method]] (also called the Newton–Raphson method) for finding zeros of a function <math>f(x)</math>, beginning with an initial guess. Although Newton's method is iterative, meaning it approaches the solution through a series of increasingly accurate guesses, it converges very quickly. The rate of convergence is quadratic, meaning roughly that the number of bits of accuracy doubles on each iteration (so improving a guess from 1 bit to 64 bits of precision requires only 6 iterations). For this reason, the algorithm is widely used in software as a very fast method of calculating square roots.
 
For large ''n'', the ''n''th root algorithm is somewhat less efficient, since it requires the computation of <math>x_k^{n-1}</math> at each step, but this power can be computed efficiently with a good [[exponentiation]] algorithm such as [[exponentiation by squaring]].
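
A minimal Python sketch of exponentiation by squaring, assuming a non-negative integer exponent <code>m</code> (an illustrative sketch, not specific to this article):

<syntaxhighlight lang="python">
def power(x, m):
    """Compute x**m for a non-negative integer m using exponentiation
    by squaring, i.e. with O(log m) multiplications."""
    result = 1.0
    while m > 0:
        if m & 1:        # multiply in the current bit of the exponent
            result *= x
        x *= x           # square the base
        m >>= 1          # shift to the next bit of the exponent
    return result
</syntaxhighlight>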
 
== Derivation from Newton's method ==
 
[[Newton's method]] is an iterative method for finding a zero of a function <math>f(x)</math>. The general iteration scheme is:
 
#Make an initial guess <math>x_0</math>
#Set <math>x_{k+1} = x_k - \frac{f(x_k)}{f'(x_k)}</math>
#Repeat step 2 until the desired precision is reached (a minimal code sketch is given below).
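
A minimal Python sketch of this general scheme (the names <code>newton</code>, <code>f</code> and <code>fprime</code> are illustrative, not a standard library interface):

<syntaxhighlight lang="python">
def newton(f, fprime, x0, eps=1e-12, max_iter=100):
    """Find a zero of f by Newton's method, starting from the guess x0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)   # f(x_k) / f'(x_k)
        x -= step                 # x_{k+1} = x_k - f(x_k)/f'(x_k)
        if abs(step) < eps:       # stop once the desired precision is reached
            break
    return x
</syntaxhighlight>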
 
The ''n''th root problem can be viewed as searching for a zero of the function
 
:<math>f(x) = x^n - A</math>
 
So the derivative is
 
:<math>f^\prime(x) = n x^{n-1}</math>
 
and the iteration rule is
 
:<math>x_{k+1} = x_k - \frac{f(x_k)}{f'(x_k)}</math>
:<math> = x_k - \frac{x_k^n - A}{n x_k^{n-1}}</math>
:<math> = x_k - \frac{x_k}{n}+\frac{A}{n x_k^{n-1}}</math>
:<math> = \frac{1}{n} \left[{(n-1)x_k +\frac{A}{x_k^{n-1}}}\right]</math>
 
leading to the general ''n''th root algorithm.
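
Equivalently, with the generic <code>newton</code> sketch above, the same ''n''th root is obtained by passing in this particular <math>f</math> and its derivative (illustrative values):

<syntaxhighlight lang="python">
# Cube root of 17 via the generic newton() sketch above
A, n = 17.0, 3
root = newton(lambda x: x ** n - A,        # f(x)  = x^n - A
              lambda x: n * x ** (n - 1),  # f'(x) = n*x^(n-1)
              x0=A)
print(root)  # approximately 2.5712816
</syntaxhighlight>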
 
==References==
*{{Citation |first=Kendall E. |last=Atkinson |title=An introduction to numerical analysis |___location=New York |publisher=Wiley |year=1989 |edition=2nd |isbn=0-471-62489-6 }}.
 
[[Category:Root-finding algorithms]]