Average-case complexity

For [[deterministic algorithm]]s, the '''average-case complexity''' (or '''expected time complexity''') of an algorithm is its [[expected value|expected]] running time when inputs are drawn from a given input distribution.
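Concretely, this can be written as an expectation (the notation <math>t_A</math> and <math>\mu_n</math> below is illustrative): if <math>t_A(x)</math> denotes the running time of an algorithm <math>A</math> on input <math>x</math>, and <math>\mu_n</math> is the given distribution over inputs of length <math>n</math>, then the average-case running time on length-<math>n</math> inputs is
:<math>T_A(n) \;=\; \operatorname{E}_{x \sim \mu_n}\!\left[t_A(x)\right] \;=\; \sum_{|x| = n} \mu_n(x)\, t_A(x).</math>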
 
[[Leonid Levin]] presented the motivation for studying average-case complexity as follows:<ref>[http://www.cs.bu.edu/fac/lnd/research/hard.htm "Intractability Concepts for Concrete Problems"], [[Leonid Levin]]</ref>
:"Many combinatorial problems (called search or NP problems) have easy methods of checking solutions for correctness. Examples: finding factors of a long integer, or proofs of math theorems or short fast programs generating a given string. Such problems can be stated as a task to invert a given, easy to compute, function (multiplication or extraction of a theorem from its proof). In 1971 I noticed that many such problems can be proven to be as hard as the Tiling problem (which, I knew for a while, was universal, i.e. at least as hard as any search problem)...
:"A common misinterpretation of these results was that all NP-complete problems are hard, no chance for good algorithms. On this basis some such problems generated much hope in cryptography: the adversary would be helpless. Karp and others noticed that this was naive. While worst instances of NP-complete problems defeat our algorithms, such instances may be extremely rare. In fact, fast on average algorithms were found for a great many NP-complete problems. If all NP problems are easy on average, the P=?NP question becomes quite academic. Even if exponentially hard instances exist, those we could ever find might all be easy. Some problems (like factoring) seem hard for typical instances, but nothing is proven at all to support this (crucial, e.g., for cryptography) belief. These issues turned out to be subtle and it was not clear how a theory could distinguish intrinsically hard on average problems. [Levin 86], [Venkatesan, Levin STOC-88], [Impagliazzo, Levin, FOCS-90] proposed such a theory with first average case intractability results. Random (under uniform distribution) instances of some problems are now known to be as hard as random instances of any NP-problem under any samplable distribution."
 
==Literature==
 
The literature on average-case complexity includes the following works: