Binary heap: Difference between revisions

#Add the element to the bottom level of the heap at the leftmost open space.
#Compare the added element with its parent; if they are in the correct order, stop.
#If not, swap the element with its parent and return to the previous step.
Steps 2 and 3, which restore the heap property by comparing and possibly swapping a node with its parent, are called the ''up-heap'' operation (also known as ''bubble-up'', ''percolate-up'', ''sift-up'', ''trickle-up'', ''swim-up'', ''heapify-up'', ''cascade-up'', or ''fix-up'').
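The insertion steps above can be sketched in Python for a 0-indexed, array-backed min-heap, where the parent of index ''i'' is at index (''i'' − 1) / 2 rounded down. The function names {{code|sift_up}} and {{code|heap_insert}} are illustrative, not part of any particular library:

```python
def sift_up(heap, i):
    # Restore the heap property by moving heap[i] up toward the root,
    # swapping it with its parent while they are out of order.
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] <= heap[i]:   # correct order: stop
            break
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent                    # continue from the parent's position

def heap_insert(heap, value):
    heap.append(value)                # step 1: add at the first free slot
    sift_up(heap, len(heap) - 1)     # steps 2-3: up-heap the new element

h = []
for v in [5, 3, 8, 1, 9, 2]:
    heap_insert(h, v)
print(h[0])  # → 1 (the minimum is at the root of a min-heap)
```

Because the array grows left to right along the bottom level, the shape property is maintained automatically; only the ordering property needs repair, which is exactly what the up-heap loop does.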
 
The number of operations required depends only on the number of levels the new element must rise to satisfy the heap property. Thus, the insertion operation has a worst-case time complexity of {{nowrap|O(log ''n'')}}. For a random heap, and for repeated insertions, the insertion operation has an average-case complexity of O(1).<ref>{{Cite journal|last1=Porter|first1=Thomas|last2=Simon|first2=Istvan|date=Sep 1975|title=Random insertion into a priority queue structure|journal=IEEE Transactions on Software Engineering|volume=SE-1|issue=3|pages=292–298|doi=10.1109/TSE.1975.6312854|s2cid=18907513|issn=1939-3520}}</ref><ref>{{Cite journal |last1=Mehlhorn|first1=Kurt|last2=Tsakalidis|first2=A.|date=Feb 1989| title=Data structures |website=Universität des Saarlandes |url=https://publikationen.sulb.uni-saarland.de/handle/20.500.11880/26179 |language=en|page=27|publisher=Universität des Saarlandes |doi=10.22028/D291-26123 |quote=Porter and Simon [171] analyzed the average cost of inserting a random element into a random heap in terms of exchanges. They proved that this average is bounded by the constant 1.61. Their proof does not generalize to sequences of insertions since random insertions into random heaps do not create random heaps. The repeated insertion problem was solved by Bollobas and Simon [27]; they show that the expected number of exchanges is bounded by 1.7645. The worst-case cost of inserts and deletemins was studied by Gonnet and Munro [84]; they give log log n + O(1) and log n + log n* + O(1) bounds for the number of comparisons respectively.}}</ref>
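The constant-bounded average can be observed empirically: repeatedly inserting random values and counting the exchanges performed by each up-heap pass yields a small per-insertion average, far below the log ''n'' worst case. This is an illustrative sketch, not a proof; the helper name {{code|sift_up_counting}} is hypothetical:

```python
import random

def sift_up_counting(heap, i):
    # Up-heap as before, but return the number of exchanges performed.
    swaps = 0
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] <= heap[i]:
            break
        heap[i], heap[parent] = heap[parent], heap[i]
        swaps += 1
        i = parent
    return swaps

random.seed(0)                        # fixed seed for reproducibility
heap, total, n = [], 0, 100_000
for _ in range(n):
    heap.append(random.random())      # insert a uniformly random key
    total += sift_up_counting(heap, len(heap) - 1)
print(total / n)  # average exchanges per insertion: a small constant
```

With uniformly random keys the observed average stays near the 1.7645 expected-exchange bound cited above, even though individual insertions may still climb all the way to the root.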