Galactic algorithm

{{Short description|Classification of algorithm}}
A '''galactic algorithm''' is one with world-beating theoretical (asymptotic) performance, but which is never used in practice. Typical reasons are that the performance gains only appear for problems that are so large they never occur, or the algorithm's complexity outweighs a relatively small gain in performance. Galactic algorithms were so named by [[Richard Lipton]] and Ken Regan,<ref name="seminal">{{cite book |last1=Lipton |first1=Richard J. |author-link1=Richard Lipton |first2=Kenneth W. |last2=Regan |chapter=David Johnson: Galactic Algorithms |title=People, Problems, and Proofs: Essays from Gödel's Lost Letter: 2010 |publisher=Springer Berlin |___location=Heidelberg |year=2013 |pages=109–112 |chapter-url=https://books.google.com/books?id=eLC9BAAAQBAJ&pg=PA109 |isbn=9783642414220}}</ref> because they will never be used on any data sets on Earth.
{{original research|date=November 2023}}
 
== Possible use cases ==
Even if they are never used in practice, galactic algorithms may still contribute to computer science:
* An algorithm, even if impractical, may show new techniques that may eventually be used to create practical algorithms.{{cn|date=November 2023}}
* Available computational power may catch up to the crossover point, so that a previously impractical algorithm becomes practical.{{cn|date=November 2023}}
* An impractical algorithm can still demonstrate that conjectured bounds can be achieved, or that proposed bounds are wrong, and hence advance the theory of algorithms. As Lipton states:<ref name="seminal"/>{{quote |This alone could be important and often is a great reason for finding such algorithms. For example, if tomorrow there were a discovery that showed there is a factoring algorithm with a huge but provably polynomial time bound, that would change our beliefs about factoring. The algorithm might never be used, but would certainly shape the future research into factoring.}} Similarly, a hypothetical large but polynomial <math>O\bigl(n^{2^{100}}\bigr)</math> algorithm for the [[Boolean satisfiability problem]], although unusable in practice, would settle the [[P versus NP problem]], considered the most important open problem in computer science and one of the [[Millennium Prize Problems]].<ref>{{cite journal |author=Fortnow, L. |year=2009 |title=The status of the P versus NP problem |journal=Communications of the ACM |volume=52 |issue=9 |pages=78–86 |url=http://people.cs.uchicago.edu/~fortnow/papers/pnp-cacm.pdf|doi=10.1145/1562164.1562186 |s2cid=5969255 }}</ref>
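The crossover idea in the second point can be illustrated numerically: an algorithm with a better asymptotic rate but a huge constant factor only wins beyond some input size. A minimal sketch, where the cost functions and constants are hypothetical and chosen purely for illustration:

```python
import math

def cost_a(n):
    # Hypothetical: better asymptotic rate, O(n log n), but a huge constant.
    return 1000 * n * math.log2(n)

def cost_b(n):
    # Hypothetical: asymptotically worse, O(n^2), but a tiny constant.
    return n * n

# Find the crossover point: the smallest n at which A finally beats B.
n = 2
while cost_a(n) >= cost_b(n):
    n += 1

print(n)  # below this input size, the "worse" algorithm B is actually faster
```

With these toy constants the crossover sits near n ≈ 14,000; a genuinely galactic algorithm is one whose crossover is so large that no real input ever reaches it.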
 
== Examples ==
=== Cryptographic breaks ===
For cryptographers, a cryptographic "break" is anything faster than a brute-force attack – i.e., performing one trial decryption for each possible key. In many cases, even though such a break is the best method known, it is still infeasible with current technology. One example is the best attack known against 128-bit [[Advanced Encryption Standard|AES]], which takes only <math>2^{126}</math> operations.<ref name=":0">{{cite book |author=Biaoshuai Tao |title = Information Security and Privacy|volume = 9144|pages = 39–56|author2=Hongjun Wu |name-list-style=amp |date=2015|doi=10.1007/978-3-319-19962-7_3 |series = Lecture Notes in Computer Science|isbn = 978-3-319-19961-0}}</ref> Despite being impractical, theoretical breaks can sometimes provide insight into vulnerability patterns.
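To see why even this best known attack is infeasible, a back-of-the-envelope estimate helps; the rate of 10<sup>12</sup> keys per second below is an assumed figure for illustration, not taken from the source:

```python
# Rough feasibility estimate for the ~2^126-operation attack on AES-128.
# ASSUMPTION: a hypothetical attacker testing 10**12 keys per second.
operations = 2 ** 126
rate = 10 ** 12                      # keys per second (assumed)
seconds_per_year = 365 * 24 * 3600

years = operations / rate / seconds_per_year
print(f"{years:.2e} years")          # on the order of 10**18 years
```

Even under this generous assumption, the attack would take billions of times the age of the universe, which is why a <math>2^{126}</math> "break" changes nothing in practice.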
 
=== Traveling salesman problem ===
For several decades, the best known approximation to the [[traveling salesman problem]] in a [[metric space]] was the very simple [[Christofides algorithm]] which produced a path at most 50% longer than the optimum. (Many other algorithms could ''usually'' do much better, but could not provably do so.) In 2020, a newer and much more complex algorithm was discovered that can beat this by <math>10^{-34}</math> percent.<ref>{{cite arXiv |eprint=2007.01409 |class=cs.DS |title=A (Slightly) Improved Approximation Algorithm for Metric TSP |author1=Anna R. Karlin |author2=Nathan Klein |author3=Shayan Oveis Gharan |date=September 1, 2020}}</ref> Although no one will ever switch to this algorithm for its very slight worst-case improvement, it is still considered important because "this minuscule improvement breaks through both a theoretical logjam and a psychological one".<ref>{{cite web |author=Klarreich |first=Erica |date=8 October 2020 |title=Computer Scientists Break Traveling Salesperson Record |url=https://www.quantamagazine.org/computer-scientists-break-traveling-salesperson-record-20201008/ |website=Quanta Magazine}}</ref>
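The scale of that improvement can be made concrete: an improvement of <math>10^{-34}</math> percent is roughly a factor of <math>10^{-36}</math> in the approximation ratio, which is far too small even to represent in double-precision arithmetic. A quick illustration:

```python
# Christofides guarantees a tour at most 1.5 times the optimum; the 2020
# algorithm improves the factor by roughly 1e-36 (10^-34 percent).
old_ratio = 1.5
new_ratio = 1.5 - 1e-36

# The improvement is far below double-precision resolution (~2.2e-16
# relative), so the two bounds are indistinguishable as 64-bit floats.
print(new_ratio == old_ratio)   # True
```

In other words, the new guarantee cannot even be distinguished from the old one in ordinary floating-point arithmetic, underscoring that the result's value is theoretical rather than practical.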
 
=== Hutter search ===