Selection algorithm: Difference between revisions

source selection from a heap; COI edit, to make up for which I am dropping the link to my old lecture notes
When data is already organized into a [[data structure]], it may be possible to perform selection in an amount of time that is sublinear in the number of values. As a simple case of this, for data already sorted into an array, selecting the <math>k</math>th element may be performed by a single array lookup, in constant time.
 
For data organized as a [[binary heap]], or more generally as any kind of heap-ordered tree (a tree in which each node stores one value and has a bounded number of children, in which the parent of each non-root node stores a smaller value than its child), it is possible to perform selection in time <math>O(k)</math>, independent of the size <math>n</math> of the whole tree, and faster than the <math>O(k\log n)</math> time bound that would be obtained from [[best-first search]].{{r|frederickson}} This method of performing selection in a heap has been applied to problems of listing multiple solutions to combinatorial optimization problems, such as finding the [[k shortest path routing|{{mvar|k}} shortest paths]] in a weighted graph, by defining a [[State space (computer science)|state space]] of solutions in the form of an [[implicit graph|implicitly defined]] heap-ordered tree, and then applying the selection algorithm to this tree.{{r|kpaths}}
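The simpler best-first baseline mentioned above can be sketched as follows. This is a minimal illustration (function name hypothetical), not Frederickson's optimal <math>O(k)</math> algorithm: it explores the heap's implicit tree with a secondary priority queue, touching only <math>O(k)</math> nodes and running in <math>O(k\log k)</math> time regardless of the heap's total size.

```python
import heapq

def heap_select(heap, k):
    """Return the k-th smallest value (1-indexed) of a binary min-heap
    stored as an array, without disturbing the heap itself.

    Best-first search over the heap's implicit tree: repeatedly pop the
    smallest frontier node and push its children. Only O(k) nodes are
    examined, for O(k log k) total time; Frederickson's more intricate
    algorithm improves this to O(k)."""
    # frontier: secondary min-heap of (value, array index) pairs
    frontier = [(heap[0], 0)]
    for _ in range(k - 1):
        _, i = heapq.heappop(frontier)
        for child in (2 * i + 1, 2 * i + 2):  # children in array layout
            if child < len(heap):
                heapq.heappush(frontier, (heap[child], child))
    return frontier[0][0]
```

Note that the input heap is read-only here; repeatedly calling `heappop` on the heap itself would also find the answer, but would destroy the structure and cost <math>O(k\log n)</math>.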
 
For a collection of data values undergoing dynamic insertions and deletions, it is possible to augment a [[self-balancing binary search tree]] structure with a constant amount of additional information per tree node, allowing insertions, deletions, and selection queries that ask for the <math>k</math>th element in the current set to all be performed in <math>O(\log n)</math> time per operation.{{r|clrs}}
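The augmentation described above can be illustrated with a minimal sketch (names hypothetical). Each node stores the size of its subtree, which is enough to answer selection queries in time proportional to the tree's height; a real implementation would use a self-balancing tree such as a red-black tree, updating the sizes during rotations, to guarantee <math>O(\log n)</math> per operation, whereas this plain BST sketch can degrade to <math>O(n)</math>.

```python
class Node:
    __slots__ = ("key", "left", "right", "size")

    def __init__(self, key):
        self.key = key
        self.left = self.right = None
        self.size = 1  # number of nodes in the subtree rooted here

def size(node):
    return node.size if node else 0

def insert(node, key):
    # Plain BST insertion that maintains subtree sizes; a balanced
    # variant would also rebalance here (fixing sizes on rotation).
    if node is None:
        return Node(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)
    node.size = 1 + size(node.left) + size(node.right)
    return node

def select(node, k):
    """Return the k-th smallest key (1-indexed), in O(height) time."""
    rank = size(node.left) + 1  # rank of this node within its subtree
    if k == rank:
        return node.key
    if k < rank:
        return select(node.left, k)
    return select(node.right, k - rank)
```

The symmetric query, computing the rank of a given key, walks the same size fields in the opposite direction.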
 
<ref name=clrs>{{Introduction to Algorithms|edition=3|chapter=Chapter 9: Medians and order statistics|pages=213–227}}; "Section 14.1: Dynamic order statistics", pp. 339–345</ref>
 
<ref name=frederickson>{{cite journal
| last = Frederickson | first = Greg N.
| doi = 10.1006/inco.1993.1030
| issue = 2
| journal = [[Information and Computation]]
| mr = 1221889
| pages = 197–214
| title = An optimal algorithm for selection in a min-heap
| volume = 104
| year = 1993}}</ref>
 
<ref name=kpaths>{{cite journal
| last = Eppstein | first = David | author-link = David Eppstein
| doi = 10.1137/S0097539795290477
| issue = 2
| journal = [[SIAM Journal on Computing]]
| mr = 1634364
| pages = 652–673
| title = Finding the <math>k</math> shortest paths
| volume = 28
| year = 1999}}</ref>
 
<ref name=iso>Section 25.3.2 of ISO/IEC 14882:2003(E) and 14882:1998(E)</ref>
* [[Thomas H. Cormen]], [[Charles E. Leiserson]], [[Ronald L. Rivest]], and [[Clifford Stein]]. ''[[Introduction to Algorithms]]'', Second Edition. MIT Press and McGraw-Hill, 2001. {{ISBN|0-262-03293-7}}. Chapter 9: Medians and Order Statistics, pp.&nbsp;183&ndash;196. Section 14.1: Dynamic order statistics, pp.&nbsp;302&ndash;308.
{{refend}}
 
==External links==
* "[http://www.ics.uci.edu/~eppstein/161/960125.html Lecture notes for January 25, 1996: Selection and order statistics]", ''ICS 161: Design and Analysis of Algorithms,'' David Eppstein
 
{{DEFAULTSORT:Selection Algorithm}}