Randomized algorithm

A '''randomized algorithm''' is an [[algorithm]] that employs a degree of [[randomness]] as part of its logic. The algorithm typically uses [[Uniform distribution (discrete)|uniformly random]] bits as an auxiliary input to guide its behavior, in the hope of achieving good performance in the "average case" over all possible choices of random bits. Formally, the algorithm's performance is a [[random variable]] determined by the random bits; thus the running time, the output, or both are random variables.
 
One has to distinguish between algorithms that always terminate with the correct answer but whose running time is a random variable with finite expectation ([[Las Vegas algorithm]]s, for example [[Quicksort]]<ref>{{Cite journal|last=Hoare|first=C. A. R.|date=July 1961|title=Algorithm 64: Quicksort|journal=Commun. ACM|volume=4|issue=7|pages=321–|doi=10.1145/366622.366644|issn=0001-0782}}</ref>), and algorithms that have a chance of producing an incorrect result ([[Monte Carlo algorithm]]s, for example the Monte Carlo algorithm for the [[Minimum feedback arc set|MFAS]] problem<ref>{{Cite journal|last=Kudelić|first=Robert|date=2016-04-01|title=Monte-Carlo randomized algorithm for minimal feedback arc set problem|journal=Applied Soft Computing|volume=41|pages=235–246|doi=10.1016/j.asoc.2015.12.018}}</ref>) or of failing to produce a result at all, either by signaling a failure or by failing to terminate. In some cases, probabilistic algorithms are the only practical means of solving a problem.<ref>"In [[primality test|testing primality]] of very large numbers chosen at random, the chance of stumbling upon a value that fools the [[Fermat primality test|Fermat test]] is less than the chance that [[cosmic radiation]] will cause the computer to make an error in carrying out a 'correct' algorithm. Considering an algorithm to be inadequate for the first reason but not for the second illustrates the difference between mathematics and engineering." [[Hal Abelson]] and [[Gerald J. Sussman]] (1996). ''[[Structure and Interpretation of Computer Programs]]''. [[MIT Press]], [http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-11.html#footnote_Temp_80 section 1.2].</ref>
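
A minimal Python sketch can make the distinction concrete (the function names and the use of Python's <code>random</code> module are illustrative choices, not taken from the cited sources): a randomized quicksort always returns a correctly sorted list and only its running time is random, whereas a Fermat-style primality test always runs quickly but may, with some probability, return a wrong answer.

<syntaxhighlight lang="python">
import random

def randomized_quicksort(items):
    # Las Vegas sketch: the output is always correctly sorted; only the
    # number of comparisons depends on the random pivot choices.
    if len(items) <= 1:
        return list(items)
    pivot = random.choice(items)          # the only use of randomness
    less    = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

def fermat_is_probably_prime(n, trials=20):
    # Monte Carlo sketch: always fast, but composites that are Fermat
    # pseudoprimes (e.g. Carmichael numbers) can be reported as prime.
    if n < 4:
        return n in (2, 3)
    for _ in range(trials):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:
            return False                  # certainly composite
    return True                           # probably prime (one-sided error)
</syntaxhighlight>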
 
In common practice, randomized algorithms are approximated using a [[pseudorandom number generator]] in place of a true source of random bits; such an implementation may deviate from the expected theoretical behavior.
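
The substitution can be made explicit in code: in CPython, for instance, the module-level <code>random</code> functions used in the sketch above are backed by a Mersenne Twister pseudorandom generator, while <code>random.SystemRandom</code> draws from an operating-system entropy source (the seed value below is arbitrary and purely illustrative).

<syntaxhighlight lang="python">
import random

prng = random.Random(12345)        # pseudorandom generator, reproducible given the seed
osrng = random.SystemRandom()      # operating-system entropy source, not reproducible

print(prng.randrange(10))          # same value on every run with this seed
print(osrng.randrange(10))         # varies from run to run
</syntaxhighlight>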
* In [[communication complexity]], the equality of two strings can be verified with bounded error probability using <math>\log n</math> bits of communication with a randomized protocol. Any deterministic protocol requires <math>\Theta(n)</math> bits when defending against a strong opponent.<ref>{{citation|title=Communication Complexity|first1=Eyal|last1=Kushilevitz|first2=Noam|last2=Nisan|publisher=Cambridge University Press|year=2006|isbn=9780521029834}}. For the deterministic lower bound see p.&nbsp;11; for the logarithmic randomized upper bound see pp.&nbsp;31–32.</ref>
* The volume of a convex body can be estimated by a randomized algorithm to arbitrary precision in polynomial time.<ref>{{citation|last1=Dyer|first1=M.|last2=Frieze|first2=A.|last3=Kannan|first3=R.|title=A random polynomial-time algorithm for approximating the volume of convex bodies|journal=[[Journal of the ACM]]|volume=38|issue=1|year=1991|pages=1–17|doi=10.1145/102782.102783|url=http://www.math.cmu.edu/~af1p/Texfiles/oldvolume.pdf}}</ref> [[Imre Bárány|Bárány]] and [[Zoltán Füredi|Füredi]] showed that no deterministic algorithm can do the same.<ref>{{citation|last1=Füredi|first1=Z.|author1-link=Zoltán Füredi|last2=Bárány|first2=I.|year=1986|contribution=Computing the volume is difficult|title=Proc. 18th ACM Symposium on Theory of Computing (Berkeley, California, May 28–30, 1986)|publisher=ACM|___location=New York, NY|pages=442–447|doi=10.1145/12130.12176|citeseerx=10.1.1.726.9448|isbn=0-89791-193-8 |url=https://ecommons.cornell.edu/bitstream/1813/8572/1/TR000688.pdf}}</ref> This is true unconditionally, i.e. without relying on any complexity-theoretic assumptions, assuming the convex body can be queried only as a black box.
* A more complexity-theoretic example of a place where randomness appears to help is the class [[IP (complexity)|IP]]. IP consists of all languages that can be accepted (with high probability) by a polynomially long interaction between an all-powerful prover and a verifier that implements a BPP algorithm. IP = [[PSPACE]].<ref>{{citation|last=Shamir|first=A.|author-link=Adi Shamir|title=IP = PSPACE|journal=Journal of the ACM|volume=39|issue=4|year=1992|pages=869–877|doi=10.1145/146585.146609}}</ref> However, if it is required that the verifier be deterministic, then IP = [[NP (complexity)|NP]].
* In a [[chemical reaction network]] (a finite set of reactions like A+B → 2C + D operating on a finite number of molecules), the ability to ever reach a given target state from an initial state is decidable, while even approximating the probability of ever reaching a given target state (using the standard concentration-based probability for which reaction will occur next) is undecidable. More specifically, a limited Turing machine <!-- the Turing machine has infinite tape --> can be simulated with arbitrarily high probability of running correctly for all time, only if a random chemical reaction network is used. With a simple nondeterministic chemical reaction network (any possible reaction can happen next), the computational power is limited to [[Primitive recursive|primitive recursive functions]].<ref>{{citation
| last1 = Cook | first1 = Matthew | author1-link = Matthew Cook