Several lower bounds in computational complexity are proved by reducing the element distinctness problem to the problem in question, i.e., by demonstrating that the solution of the element uniqueness problem may be quickly found after solving the problem in question.
==Decision tree complexity==
The number of comparisons needed to solve the problem of size <math>n</math>, in a comparison-based model of computation such as a [[decision tree]] or [[algebraic decision tree]], is <math>\Theta(n\log n)</math>. Here, <math>\Theta</math> invokes [[Big O notation|big theta notation]], meaning that the problem can be solved in a number of comparisons proportional to <math>n\log n</math> (a [[linearithmic function]]) and that all solutions require this many comparisons.<ref>{{citation|first=Michael|last=Ben-Or|contribution=Lower bounds for algebraic computation trees|title=[[Symposium on Theory of Computing|Proc. 15th ACM Symposium on Theory of Computing]]|year=1983|pages=80–86|doi=10.1145/800061.808735|doi-access=free}}.</ref> In these models of computation, the input numbers may not be used to index the computer's memory (as in the hash table solution) but may only be accessed by computing and comparing simple algebraic functions of their values. For these models, an algorithm based on [[comparison sort]] solves the problem within a constant factor of the best possible number of comparisons. The same lower bound applies as well to the [[expected value|expected number]] of comparisons in the [[randomized complexity|randomized]] [[algebraic decision tree]] model.<ref>{{citation|doi=10.1007/BF01270387|title=A lower bound for randomized algebraic decision trees|year=1996|last1=Grigoriev|first1=Dima|authorlink=Dima Grigoriev|journal=Computational Complexity|volume=6|pages=357|last2=Karpinski|first2=Marek|author2-link=Marek Karpinski|last3=Heide|first3=Friedhelm Meyer|last4=Smolensky|first4=Roman|issue=4|s2cid=1462184 }}.</ref><ref>{{citation|doi=10.1007/s000370050002|title=Complexity lower bounds for randomized computation trees over zero characteristic fields|year=1999|last=Grigoriev|first=Dima|authorlink=Dima Grigoriev|journal=Computational Complexity|volume=8|pages=316–329|issue=4|s2cid=10641238 }}.</ref>
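A comparison-based solution matching this bound sorts the input and then scans adjacent pairs; equal elements become neighbors after sorting. A minimal sketch (the function name is illustrative, not from the sources cited here):

```python
def all_distinct(items):
    """Comparison-based element distinctness check.

    Sorting uses O(n log n) comparisons, matching the decision-tree
    lower bound; any two equal elements are adjacent after sorting.
    """
    ordered = sorted(items)
    return all(a != b for a, b in zip(ordered, ordered[1:]))
```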
==Real RAM complexity==
If the elements in the problem are [[real number|real numbers]], the decision-tree lower bound extends to the [[real RAM|real random-access machine]] model with an instruction set that includes addition, subtraction and multiplication of real numbers, as well as comparison and either division or remaindering ("floor").<ref>{{citation|doi=10.1137/S0097539797329397|title=Topological Lower Bounds on Algebraic Random Access Machines|year=2001|last1=Ben-Amram|first1=Amir M.|journal=SIAM Journal on Computing|volume=31|issue=3|pages=722–761|last2=Galil|first2=Zvi|author2-link=Zvi Galil}}.</ref> It follows that the problem's complexity in this model is also <math>\Theta(n\log n)</math>. This RAM model covers more algorithms than the algebraic decision-tree model, as it encompasses algorithms that use indexing into tables. However, in this model all program steps are counted, not just decisions.
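For contrast, the hash-table solution mentioned earlier relies on exactly this kind of table indexing and runs in expected linear time when the inputs are hashable; the lower bound above shows that for real-number inputs even this stronger model cannot beat <math>\Theta(n\log n)</math>. A sketch (the function name is illustrative):

```python
def all_distinct_hashed(items):
    """Element distinctness via a hash table: expected O(n) time.

    Indexes memory by element value, an operation allowed in a RAM
    model but not in the algebraic decision-tree model.
    """
    seen = set()
    for x in items:
        if x in seen:
            return False
        seen.add(x)
    return True
```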
==Turing machine complexity==
A single-tape deterministic [[Turing machine]] can solve the problem, for ''n'' elements of {{math|''m'' ≥ log ''n''}} bits each, in time {{math|''O''(''n''<sup>2</sup>''m''(''m''+2−log ''n''))}}, while on a nondeterministic machine the time complexity is {{math|''O''(''nm''(''n'' + log ''m''))}}.<ref>{{citation|doi=10.1007/s00236-003-0125-8|title=Element distinctness on one-tape Turing machines: a complete solution.|year=2003|last1=Ben-Amram|first1=Amir M.|journal=Acta Informatica|volume=40|issue=2|pages=81–94|last2=Berkman|first2=Omer|last3=Petersen|first3=Holger|s2cid=24821585 }}</ref>

==Quantum complexity==
It is also known that [[quantum algorithm]]s can solve this problem faster, in <math>\Theta\left(n^{2/3}\right)</math> queries. The optimal algorithm is by [[Andris Ambainis]].<ref>{{citation | last1=Ambainis | first1=Andris | author1-link=Andris Ambainis | title=Quantum walk algorithm for element distinctness | doi = 10.1137/S0097539705447311 | journal = [[SIAM Journal on Computing]] | volume = 37 | issue = 1 | year = 2007 | pages=210–239 | arxiv=quant-ph/0311001 }}</ref> [[Scott Aaronson]] and Yaoyun Shi first proved a tight lower bound for a large but restricted class of functions.<ref>
{{cite conference
| last1=Aaronson | first1=S.
| year=2004
| title=Quantum lower bounds for the collision and the element distinctness problems
| conference = Proceedings of the 43rd [[Symposium on Foundations of Computer Science]]
| arxiv = quant-ph/0112086
| doi=10.1109/SFCS.2002.1181975
}}</ref> Ambainis<ref>
{{cite journal
|doi=10.4086/toc.2005.v001a003
|doi-access=free
}}</ref> and Kutin<ref>
{{cite journal
|last1=Kutin |first1=S.
|doi=10.4086/toc.2005.v001a002
|doi-access=free
}}</ref> independently (and via different proofs) extended this lower bound to all functions.
==Generalization: Finding repeated elements==
{{main|Misra–Gries heavy hitters algorithm}}
Elements that occur more than {{math|''n''/''k''}} times in a multiset of size {{math|''n''}} may be found by a comparison-based algorithm, the [[Misra–Gries heavy hitters algorithm]], in time {{math|''O''(''n'' log ''k'')}}. The element distinctness problem is a special case of this problem where {{math|1=''k'' = ''n''}}.
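The one-pass core of the Misra–Gries algorithm can be sketched as follows (variable and function names are illustrative; a second counting pass is needed to confirm which candidates truly exceed the {{math|''n''/''k''}} threshold):

```python
def misra_gries_candidates(stream, k):
    """One pass of the Misra-Gries heavy hitters algorithm.

    Maintains at most k - 1 counters; every element occurring more
    than n/k times in `stream` is guaranteed to survive as a
    candidate.  A second pass over the input verifies true counts.
    """
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif len(counters) < k - 1:
            counters[x] = 1
        else:
            # Decrement all counters, discarding any that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return set(counters)
```

With {{math|1=''k'' = ''n''}} the threshold {{math|''n''/''k''}} is 1, so any candidate whose verified count exceeds 1 is a repeated element, recovering element distinctness as a special case.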
==References==
{{reflist}}
[[Category:Polynomial-time problems]]