== History ==
Disjoint-set forests were first described by [[Bernard A. Galler]] and [[Michael J. Fischer]] in 1964.<ref name="Galler1964">{{cite journal|first1=Bernard A.|last1=Galler|author1-link=Bernard A. Galler|first2=Michael J.|last2=Fischer|author2-link=Michael J. Fischer|title=An improved equivalence algorithm|journal=[[Communications of the ACM]]|volume=7|issue=5|date=May 1964|pages=301–303|doi=10.1145/364099.364331|s2cid=9034016|doi-access=free}}. The paper originating disjoint-set forests.</ref> In 1973, their time complexity was bounded to <math>O(\log^{*}(n))</math>, the [[iterated logarithm]] of <math>n</math>, by [[John Hopcroft|Hopcroft]] and [[Jeffrey Ullman|Ullman]].<ref name="Hopcroft1973">{{cite journal|last1=Hopcroft|first1=J. E.|author1-link=John Hopcroft|last2=Ullman|first2=J. D.|author2-link=Jeffrey Ullman|year=1973|title=Set Merging Algorithms|journal=SIAM Journal on Computing|volume=2|issue=4|pages=294–303|doi=10.1137/0202024}}</ref> In 1975, [[Robert Tarjan]] was the first to prove the <math>O(m\alpha(n))</math> ([[Ackermann function#Inverse|inverse Ackermann function]]) upper bound on the algorithm's time complexity.<ref name="Tarjan1984">{{cite journal|first1=Robert E.|last1=Tarjan|author1-link=Robert E. Tarjan|first2=Jan|last2=van Leeuwen|author2-link=Jan van Leeuwen|title=Worst-case analysis of set union algorithms|journal=Journal of the ACM|volume=31|issue=2|pages=245–281|year=1984|doi=10.1145/62.2160|s2cid=5363073|doi-access=free}}</ref> He also proved this bound to be tight. In 1979, he showed that it was a lower bound for a certain class of algorithms, which includes the Galler–Fischer structure.<ref name="Tarjan1979">{{cite journal|first=Robert Endre|last=Tarjan|author-link=Robert E. Tarjan|year=1979|title=A class of algorithms which require non-linear time to maintain disjoint sets|journal=Journal of Computer and System Sciences|volume=18|issue=2|pages=110–127|doi=10.1016/0022-0000(79)90042-4|doi-access=free}}</ref> In 1989, [[Michael Fredman|Fredman]] and [[Michael Saks (mathematician)|Saks]] showed that <math>\Omega(\alpha(n))</math> (amortized) words of <math>O(\log n)</math> bits must be accessed by ''any'' disjoint-set data structure per operation.<ref name="Fredman1989">{{cite
In 1991, Galil and Italiano published a survey of data structures for disjoint sets.<ref name="Galil1991">{{cite journal|first1=Z.|last1=Galil|first2=G.|last2=Italiano|title=Data structures and algorithms for disjoint set union problems
In 1991, Richard J. Anderson and Heather Woll described a parallelized version of Union–Find that never needs to block.<ref name="Anderson1994">{{cite conference|first1=Richard J.|last1=Anderson|first2=Heather|last2=Woll|title=Wait-free Parallel Algorithms for the Union-Find Problem|conference=23rd ACM Symposium on Theory of Computing|year=1991|pages=370–380}}</ref>
The worst-case time of the <code>Find</code> operation in trees with '''Union by rank''' or '''Union by weight''' is <math>\Theta(\log n)</math> (i.e., it is <math>O(\log n)</math> and this bound is tight).
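As an illustration (a hypothetical sketch, not code from any of the cited papers), a disjoint-set forest using union by rank alone keeps every tree's height logarithmic in its size, so <code>Find</code> follows at most <math>O(\log n)</math> parent links:

```python
# Hypothetical sketch: a disjoint-set forest with union by rank only
# (no path compression), so Find takes O(log n) time in the worst case.

class DisjointSet:
    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.rank = [0] * n           # rank = upper bound on tree height

    def find(self, x):
        # Walk up to the root; with union by rank the path has
        # at most O(log n) links.
        while self.parent[x] != x:
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        # Attach the shorter tree under the taller one, so the height
        # grows only when two equal-rank trees are merged.
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

ds = DisjointSet(8)
ds.union(0, 1); ds.union(2, 3); ds.union(0, 2)
assert ds.find(1) == ds.find(3)   # merged into one set
assert ds.find(4) != ds.find(0)   # 4 is still a singleton
```

Because two trees are merged only when their ranks are equal, a tree of rank <math>r</math> contains at least <math>2^r</math> elements, which is where the logarithmic height bound comes from.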
In 1985, N. Blum gave an implementation of the operations that does not use path compression but instead compresses trees during <code>Union</code>. His implementation runs in <math>O(\log n / \log\log n)</math> time per operation,<ref>{{cite journal |last1=Blum |first1=Norbert |title=On the Single-Operation Worst-Case Time Complexity of the Disjoint Set Union Problem |journal=2nd Symp. on Theoretical Aspects of Computer Science |date=1985 |pages=32–38}}</ref> and thus, in comparison with Galler and Fischer's structure, it has a better worst-case time per operation but inferior amortized time. In 1999, Alstrup et al. gave a structure that has optimal worst-case
time <math>O(\log n / \log\log n)</math> together with inverse-Ackermann amortized time.<ref>{{cite
===Deletion===
The regular implementation as disjoint-set forests does not react favorably to the deletion of elements,
in the sense that the time for <code>Find</code> does not improve as the number of elements decreases. However, there exist modern implementations that allow constant-time deletion and in which the time bound for <code>Find</code> depends on the ''current'' number of elements.<ref>{{cite journal |last1=Alstrup |first1=Stephen |last2=Thorup |first2=Mikkel |last3=Gørtz |first3=Inge Li |last4=Rauhe |first4=Theis |last5=Zwick |first5=Uri |title=Union-Find with Constant Time Deletions |journal= ACM
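To illustrate the problem (a hypothetical sketch, not the constant-time scheme from the cited paper), the naive approach of merely marking an element as deleted leaves its node in the forest, so <code>Find</code> still traverses it and gets no faster as elements disappear:

```python
# Hypothetical sketch: naive "tombstone" deletion in a disjoint-set
# forest. Deleted nodes remain as internal tree nodes, so the length
# of the paths that Find walks does not shrink.

class NaiveDeletableDSU:
    def __init__(self, n):
        self.parent = list(range(n))
        self.deleted = set()  # tombstones; the tree itself is untouched

    def find(self, x):
        while self.parent[x] != x:  # path length unaffected by deletions
            x = self.parent[x]
        return x

    def union(self, x, y):
        # Attach y's root under x's root (no balancing, for brevity).
        self.parent[self.find(y)] = self.find(x)

    def delete(self, x):
        # Only mark the element; its node keeps serving as a link on
        # other elements' paths to the root.
        self.deleted.add(x)

d = NaiveDeletableDSU(4)
d.union(1, 0)   # chain: 0 -> 1
d.union(2, 1)   # chain: 0 -> 1 -> 2
d.delete(1)
assert d.find(0) == d.find(2)  # find(0) still walks through deleted node 1
```

The structures with constant-time deletion mentioned above avoid exactly this: they rebuild or relink around deleted nodes so that <code>Find</code>'s cost tracks the current number of live elements.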
== Applications ==