==Building a heap==
Building a heap from an array of {{mvar|n}} input elements can be done by starting with an empty heap, then successively inserting each element. This approach, called Williams' method after the inventor of binary heaps, is easily seen to run in {{math|''O''(''n'' log ''n'')}} time: it performs {{mvar|n}} insertions at {{math|''O''(log ''n'')}} cost each.
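Williams' method can be sketched as follows. This is an illustrative Python sketch (the function name is ours, not from the source): each new element is appended at the next free leaf position and then sifted up, so each of the {{mvar|n}} insertions costs {{math|''O''(log ''n'')}}.

```python
def build_heap_williams(items):
    """Build a max-heap by successive insertion (Williams' method).

    Each insertion sifts the new element up toward the root, costing
    O(log n), so the whole build takes O(n log n) time.
    """
    heap = []
    for x in items:
        heap.append(x)                    # place at the next free leaf
        i = len(heap) - 1
        while i > 0:
            parent = (i - 1) // 2
            if heap[parent] >= heap[i]:   # heap property already holds
                break
            heap[parent], heap[i] = heap[i], heap[parent]
            i = parent                    # continue sifting up
    return heap
```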
However, Williams' method is suboptimal. A faster method (due to [[Robert W. Floyd|Floyd]]{{r|heapbuildjalg}}) starts by arbitrarily putting the elements on a binary tree, respecting the shape property (the tree could be represented by an array, see below). Then, starting from the lowest level and moving upwards, sift the root of each subtree downward as in the deletion algorithm until the heap property is restored. More specifically, if all the subtrees rooted at some height <math>h</math> have already been "heapified" (the bottommost level corresponding to <math>h=0</math>), the trees at height <math>h+1</math> can be heapified by sending their root down along the path of maximum valued children when building a max-heap, or minimum valued children when building a min-heap. This process takes <math>O(h)</math> operations (swaps) per node. In this method, most of the heapification takes place in the lower levels, where subtrees are small. Since the height of the heap is <math> \lfloor \log n \rfloor</math>, the number of nodes at height <math>h</math> is <math>\le \frac{2^{\lfloor \log n \rfloor}}{2^h} \le \frac{n}{2^h}</math>. Therefore, the cost of heapifying all subtrees is:
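Floyd's method can be sketched in Python as an in-place sift-down from the last internal node back to the root (the function name is ours, not from the source):

```python
def build_heap_floyd(a):
    """Heapify an array into a max-heap in place (Floyd's method).

    Sift down the root of every subtree, starting from the last internal
    node and moving toward the overall root; the total cost is O(n).
    """
    n = len(a)
    # The last internal node of an n-element array heap is at index n//2 - 1.
    for i in range(n // 2 - 1, -1, -1):
        j = i
        # Send a[j] down along the path of maximum-valued children.
        while 2 * j + 1 < n:
            child = 2 * j + 1
            if child + 1 < n and a[child + 1] > a[child]:
                child += 1                # pick the larger of the two children
            if a[j] >= a[child]:          # heap property restored for this subtree
                break
            a[j], a[child] = a[child], a[j]
            j = child
    return a
```

Note that roughly half the nodes are leaves and need no sifting at all, which is why the loop starts at the last internal node.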
:<math>
\sum_{h=0}^{\lfloor \log n \rfloor} \frac{n}{2^h} O(h)
= O\!\left(n \sum_{h=0}^{\lfloor \log n \rfloor} \frac{h}{2^h}\right)
= O\!\left(n \sum_{h=0}^{\infty} \frac{h}{2^h}\right)
= O(n).
</math>
The exact value of the above (the worst-case number of comparisons during the heap construction) is known to be equal to: