Wikipedia:Featured article candidates/Euclidean algorithm/archive1: Difference between revisions

**"Integer factorization is thought to be a difficult problem for large numbers." A bit weaselly, and it's not particularly difficult if you have a calculator handy. Perhaps "can be" instead of "is thought to be" ?
:::I clarified the sentence, although perhaps I should have been clearer about "large numbers". A pocket calculator might help in factoring numbers up to 20,000 (5 digits), but it won't be useful in factoring numbers with 500 digits, the rough size of numbers used in modern cryptography. Nevertheless, the Euclidean algorithm can quickly find the greatest common divisor of two 500-digit numbers. That was the point I was trying to convey. Should I spell that out in the article, do you think? [[User:Proteins|Proteins]] ([[User talk:Proteins|talk]]) 10:32, 4 May 2009 (UTC)
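The contrast above can be sketched in a few lines (a minimal illustration, not from the discussion): the Euclidean algorithm computes the GCD of two 500-digit numbers almost instantly, because the number of remainder steps grows only with the number of digits, whereas no comparably fast method is known for factoring numbers of that size.

```python
def gcd(a, b):
    """Greatest common divisor via repeated remainders (Euclid's algorithm)."""
    while b:
        a, b = b, a % b  # replace (a, b) with (b, a mod b) until remainder is 0
    return a

# Small check: gcd(252, 105) = 21
print(gcd(252, 105))  # → 21

# Two numbers of roughly 500 digits (powers of distinct primes, so their GCD is 1);
# the loop still finishes in at most a few thousand steps.
a = 7 ** 600   # about 507 digits
b = 3 ** 1000  # about 478 digits
print(gcd(a, b))  # → 1
```

Python's arbitrary-precision integers make the 500-digit case directly runnable; the point is that the step count, not the operand size, is what stays small.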
::::I corrected the statement the first time from asserting "is a difficult problem" to "is thought to be..." since it is "only" thought to be difficult, not proven. The new version which states "The computational difficulty of integer factorization grows exponentially with the size of the number being factored" is a step backwards in that regard. The computational difficulty of integer factorization is in fact unknown. --[[User:C S|C S]] ([[User talk:C S|talk]]) 11:36, 4 May 2009 (UTC)
 
::::You are right that I was inferring too much about the specific scaling of factorization. I've re-worded this sentence to "Factorization of large integers is believed to be such a difficult problem that many modern cryptography systems are based upon it.", which is supported by the Schroeder reference. [[User:Proteins|Proteins]] ([[User talk:Proteins|talk]]) 13:59, 4 May 2009 (UTC)
 
**"A more subtle definition of the GCD is helpful in advanced mathematics, particularly ring theory." This statement should probably be accompanied by a ref.