{{Short description|Algorithm that arranges lists in order}}
[[File:Merge sort animation.gif|thumb|right|[[Merge sort]]]]

In [[computer science]], a '''sorting algorithm''' is an [[algorithm]] that puts elements of a [[List (computing)|list]] into an [[Total order|order]]. The most frequently used orders are [[numerical order]] and [[lexicographical order]], and either ascending or descending. Efficient [[sorting]] is important for optimizing the [[Algorithmic efficiency|efficiency]] of other algorithms (such as [[search algorithm|search]] and [[merge algorithm|merge]] algorithms) that require input data to be in sorted lists. Sorting is also often useful for [[Canonicalization|canonicalizing]] data and for producing human-readable output.
Formally, the output of any sorting algorithm must satisfy two conditions:
# The output is in [[monotonic]] order (each element is no smaller/larger than the previous element, according to the required order).
# The output is a [[permutation]] (a reordering, yet retaining all of the original elements) of the input.
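
These two conditions can be checked mechanically. The following minimal Python sketch (an illustrative checker, not a standard routine; the function name is hypothetical) tests both conditions at once:

<syntaxhighlight lang="python">
from collections import Counter

def is_sorted_permutation(input_list, output):
    """Test the two formal conditions for a correct sort."""
    # Condition 1: monotonic (nondecreasing) order.
    monotonic = all(output[i] <= output[i + 1] for i in range(len(output) - 1))
    # Condition 2: a permutation of the input -- the same elements
    # with the same multiplicities.
    permutation = Counter(output) == Counter(input_list)
    return monotonic and permutation

assert is_sorted_permutation([3, 1, 2], [1, 2, 3])
assert not is_sorted_permutation([3, 1, 2], [1, 2, 2])  # not a permutation
</syntaxhighlight>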
 
Although some algorithms are designed for [[sequential access]], the highest-performing algorithms assume data is stored in a [[data structure]] which allows [[random access]].
 
== History and concepts ==
From the beginning of computing, the sorting problem has attracted a great deal of research, perhaps due to the complexity of solving it efficiently despite its simple, familiar statement. Among the authors of early sorting algorithms around 1951 was [[Betty Holberton]], who worked on [[ENIAC]] and [[UNIVAC]].<ref name="refrigerator">{{Cite web|url=http://mentalfloss.com/article/53160/meet-refrigerator-ladies-who-programmed-eniac|title=Meet the 'Refrigerator Ladies' Who Programmed the ENIAC|website=Mental Floss|access-date=2016-06-16|date=2013-10-13|archive-date=2018-10-08|archive-url=https://web.archive.org/web/20181008094658/http://mentalfloss.com/article/53160/meet-refrigerator-ladies-who-programmed-eniac|url-status=live}}</ref><ref name="NYTimes">{{cite news|last1=Lohr|first1=Steve|title=Frances E. Holberton, 84, Early Computer Programmer|url=https://www.nytimes.com/2001/12/17/business/frances-e-holberton-84-early-computer-programmer.html|access-date=16 December 2014|publisher=NYTimes|date=Dec 17, 2001|archive-date=16 December 2014|archive-url=https://web.archive.org/web/20141216015437/http://www.nytimes.com/2001/12/17/business/frances-e-holberton-84-early-computer-programmer.html|url-status=live}}</ref> [[Bubble sort]] was analyzed as early as 1956.<ref>{{cite thesis |last=Demuth |first=Howard B. |title=Electronic Data Sorting |type=PhD thesis |publisher=Stanford University |year=1956 |id={{ProQuest|301940891}} }}</ref> Asymptotically optimal algorithms have been known since the mid-20th century{{snd}} new algorithms are still being invented, with the widely used [[Timsort]] dating to 2002, and the [[library sort]] being first published in 2006.
 
Comparison sorting algorithms have a fundamental requirement of [[Big omega notation|Ω(''n'' log ''n'')]] comparisons (some input sequences require a number of comparisons proportional to ''n'' log ''n'', where ''n'' is the number of elements in the array to be sorted). Algorithms not based on comparisons, such as [[counting sort]], can have better performance.
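
This bound follows from a counting argument in the [[decision tree model]]: a comparison sort must be able to distinguish all <math>n!</math> possible orderings of its input, and a binary decision tree with <math>n!</math> leaves has height at least
<math display="block">\log_2 (n!) = \sum_{i=1}^{n} \log_2 i \;\ge\; \frac{n}{2}\log_2\frac{n}{2} = \Omega(n \log n).</math>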
 
Sorting algorithms are prevalent in introductory [[computer science]] classes, where the abundance of algorithms for the problem provides a gentle introduction to a variety of core algorithm concepts, such as [[big O notation]], [[divide-and-conquer algorithm]]s, [[data structure]]s such as [[heap (data structure)|heap]]s and [[binary tree]]s, [[randomized algorithm]]s, [[best, worst and average case]] analysis, [[time–space tradeoff]]s, and [[upper and lower bounds]].
 
Sorting small arrays optimally (in the fewest comparisons and swaps) or fast (i.e. taking into account machine-specific details) is still an open research problem, with solutions only known for very small arrays (<20 elements). Similarly, optimal sorting (by various definitions) on a parallel machine is an open research topic.
 
== Classification ==<!-- This section is linked from [[Merge sort]] -->
Sorting algorithms can be classified by:
* [[Computational complexity theory|Computational complexity]]
** [[Best, worst and average case]] behavior in terms of the size of the list. For typical serial sorting algorithms, good behavior is O(''n''&nbsp;log&nbsp;''n''), with parallel sort in O(log<sup>2</sup>&nbsp;''n''), and bad behavior is O(''n''<sup>2</sup>). Ideal behavior for a serial sort is O(''n''), but this is not possible in the average case. Optimal parallel sorting is O(log&nbsp;''n'').
** Swaps for "in-place" algorithms.
* [[Memory (computing)|Memory]] usage (and use of other computer resources). In particular, some sorting algorithms are "[[in-place]]". Strictly, an in-place sort needs only O(1) memory beyond the items being sorted; sometimes O(log&nbsp;''n'') additional memory is considered "in-place".
* Recursion: Some algorithms are either recursive or non-recursive, while others may be both (e.g., merge sort).
* Stability: [[#Stability|stable sorting algorithms]] maintain the relative order of records with equal keys (i.e., values).
* Whether or not they are a [[comparison sort]]. A comparison sort examines the data only by comparing two elements with a comparison operator.
* General method: insertion, exchange, selection, merging, ''etc.'' Exchange sorts include bubble sort and quicksort. Selection sorts include cycle sort and heapsort.
* Whether the algorithm is serial or parallel. The remainder of this discussion almost exclusively concentrates on serial algorithms and assumes serial operation.
* Adaptability: Whether or not the presortedness of the input affects the running time. Algorithms that take this into account are known to be [[Adaptive sort|adaptive]].
* Online: An algorithm such as insertion sort that is online can sort a constant stream of input.
 
=== Stability ===
[[File:Sorting stability playing cards.svg|thumb|An example of stable sort on playing cards. When the cards are sorted by rank with a stable sort, the two 5s must remain in the same order in the sorted output that they were originally in. When they are sorted with a non-stable sort, the 5s may end up in the opposite order in the sorted output.]]
 
Stable sort algorithms sort equal elements in the same order that they appear in the input. (For example, in the card sorting example to the right, the cards are being sorted by their rank, and their suit is being ignored.) This allows the possibility of multiple different correctly sorted versions of the original list. Stable sorting algorithms choose one of these, according to the following rule: if two items compare as equal (like the two 5 cards), then their relative order will be preserved, i.e. if one comes before the other in the input, it will come before the other in the output.
Stability is important to preserve order over multiple sorts on the same [[data set]]. For example, say that student records consisting of name and class section are sorted dynamically, first by name, then by class section. If a stable sorting algorithm is used in both cases, the sort-by-class-section operation will not change the name order; with an unstable sort, it could be that sorting by section shuffles the name order, resulting in a nonalphabetical list of students.

More formally, the data being sorted can be represented as a record or tuple of values, and the part of the data that is used for sorting is called the ''key''. In the card example, cards are represented as a record (rank, suit), and the key is the rank. A sorting algorithm is stable if whenever there are two records R and S with the same key, and R appears before S in the original list, then R will always appear before S in the sorted list.

When equal elements are indistinguishable, such as with integers, or more generally, any data where the entire element is the key, stability is not an issue. Stability is also not an issue if all keys are different.
 
Unstable sorting algorithms may change the relative order of records with equal keys, but stable sorting algorithms never do so. Unstable sorting algorithms can be specially implemented to be stable. One way of doing this is to artificially extend the key comparison, so that comparisons between two objects with otherwise equal keys are decided using the order of the entries in the original input list as a tie-breaker. Remembering this order, however, may require additional time and space.
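
The index-extension trick can be sketched in Python as follows (illustrative only; Python's built-in sort is in fact already stable, so the decoration is shown purely to demonstrate the technique):

<syntaxhighlight lang="python">
def make_stable(records, key):
    """Stable sort via key extension: ties on the real key are broken
    by original position, so equal-keyed records keep their input order."""
    decorated = [(key(r), i, r) for i, r in enumerate(records)]
    decorated.sort()  # the unique index means r itself is never compared
    return [r for _, _, r in decorated]

cards = [(5, 'hearts'), (2, 'spades'), (5, 'clubs')]
print(make_stable(cards, key=lambda card: card[0]))
# [(2, 'spades'), (5, 'hearts'), (5, 'clubs')] -- the two 5s keep their order
</syntaxhighlight>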
 
One application for stable sorting algorithms is sorting a list using a primary and secondary key. For example, suppose we wish to sort a hand of cards such that the suits are in the order clubs (♣), diamonds (<span style="color:#ff0000">♦</span>), hearts (<span style="color:#ff0000">♥</span>), spades (♠), and within each suit, the cards are sorted by rank. This can be done by first sorting the cards by rank (using any sort), and then doing a stable sort by suit:
 
[[File:Sorting playing cards using stable sort.svg|400px]]
 
Within each suit, the stable sort preserves the ordering by rank that was already done. This idea can be extended to any number of keys and is utilised by [[radix sort]]. The same effect can be achieved with an unstable sort by using a lexicographic key comparison, which, e.g., compares first by suit, and then compares by rank if the suits are the same.
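
Both approaches can be demonstrated in Python, whose built-in <code>sorted</code> is stable (an illustrative sketch using the suit order given above):

<syntaxhighlight lang="python">
# Each card is a (rank, suit) record.
suit_order = {'clubs': 0, 'diamonds': 1, 'hearts': 2, 'spades': 3}
cards = [(7, 'hearts'), (5, 'spades'), (7, 'clubs'), (2, 'hearts')]

# Two passes with a stable sort: first by rank, then by suit.
by_rank = sorted(cards, key=lambda c: c[0])
two_pass = sorted(by_rank, key=lambda c: suit_order[c[1]])

# One pass with a lexicographic (composite) key gives the same order.
one_pass = sorted(cards, key=lambda c: (suit_order[c[1]], c[0]))
assert two_pass == one_pass
</syntaxhighlight>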
 
== Comparison of algorithms ==
This analysis assumes that the length of each key is constant and that all comparisons, swaps and other operations can proceed in constant time.
 
Legend:
* {{mvar|n}} is the number of records to be sorted.
* The columns "Best", "Average" and "Worst" give the [[time complexity]] in each case.
* "Memory" denotes the amount of additional storage required by the algorithm.
* The run times and the memory requirements listed are inside [[big O notation]], hence the base of the logarithms does not matter.
* The notation {{math|log<sup>2</sup> ''n''}} means {{math|(log ''n'')<sup>2</sup>}}.
 
=== Comparison sorts ===
Below is a table of [[comparison sort]]s. [[Analysis of algorithms|Mathematical analysis]] demonstrates a comparison sort cannot perform better than {{math|''O''(''n'' log ''n'')}} on average.<ref>{{citation |last1=Cormen |first1=Thomas H. |author1-link=Thomas H. Cormen |last2=Leiserson |first2=Charles E. |author2-link=Charles E. Leiserson |last3=Rivest |first3=Ronald L. |author3-link=Ron Rivest |last4=Stein |first4=Clifford |author4-link=Clifford Stein|title=Introduction To Algorithms|url=https://books.google.com/books?id=NLngYyWFl_YC|edition=3rd |place=Cambridge, MA |publisher=The MIT Press |year=2009 |isbn=978-0-262-03293-3| page=167 |chapter=8}}</ref>
 
{|class="wikitable sortable"
|+ [[Comparison sort]]s
! Name !! Best !! Average !! Worst<br /> !! Memory !! Stable
!In-place!! Method<br /> !! width="350"|Other notes
<!-- Sorting Guide:
00 -= constant,
05 = log 05 - lg(n),
10 -= n^c, (0 < c < 1),
15 -= n,
20 -= n*lg(log n) or lg(log n!),
23 -= n*(lg(log n))^2 or n^c, (1 < c < 2),
25 -= n^2,
30 -= n^c, (c > 2),
40 -= c^n, (c > 1),
45 -= n!,
50 -= miscellaneousother -->
|- align="center"
| [[QuicksortHeapsort]]
|style="background:#ddffddffd"| {{Sort|20|<math>\mathcal{} n \log n</math>}}
|style="background:#ddffdddfd"| {{Sort|20|<math>\mathcal{} n \log n</math>}}
|style="background:#ffdddddfd"| {{Sort|2520|<math>n \mathcal{}log n^2</math>}}
|style="background:#ffffdddfd"| {{Sort|0500|<math>\mathcal{} \log n</math>1}}
|style="background:#ffffddfdd"| Depends No
|style="background:#dfd"| Yes
| Partitioning
| Selection
| align="left" | Quicksort is usually done in place with O(log(''n'')) stack space.{{citation needed|date=December 2010}} Most implementations are unstable, as stable in-place partitioning is more complex. [[Naïve algorithm|Naïve]] variants use an O(''n'') space array to store the partition.{{citation needed|date=December 2010}}<!-- see talk page discussion for December 2010 -->
|align="left"| An optimized version of selection sort. Performs selection sort by constructing and maintaining a max heap to find the maximum in <math>O(\log n)</math> time.
|- align="center"
| [[Merge sortIntrosort]]
|style="background:#ddffddffd"| {{Sort|20|<math>\mathcal{} {n \log n} </math>}}
|style="background:#ddffdddfd"| {{Sort|20|<math>\mathcal{} {n \log n} </math>}}
|style="background:#ddffdddfd"| {{Sort|20|<math>\mathcal{} {n \log n} </math>}}
|style="background:#ffddddffd"| {{Sort|1505|Depends; worst case is <math> \mathcal{}log n </math>}}
|style="background:#ddffddfdd"| YesNo
|style="background:#dfd"| Yes
| Merging
| Partitioning & Selection
| align="left" | [[Merge_sort#Parallel_processing|Highly parallelizable]] (up to O(log(''n''))) for processing large amounts of data.
|align="left"| Used in several [[Standard Template Library|STL]] implementations. Performs a combination of Quicksort, Heapsort, and Insertion sort.
|- align="center"
|nowrap|[[In-place]] [[Merge sort]]
|style="background:#ffd"| {{Sort|5020|<math>n \mathcal{} -log n</math>}}
|style="background:#dfd"| {{Sort|5020|<math>n \mathcal{} -log n</math>}}
|style="background:#ffffdddfd"| {{Sort|2320|<math> \mathcal{} {n \left( \log n \right)^2} </math>}}
|style="background:#ddffddfdd"| {{Sort|0015|<math> \mathcal{} {1mvar|n}} </math>}}
|style="background:#ddffdddfd"| Yes
|style="background:#fdd"| No
| Merging
|align="left"| [[Merge sort#Parallel merge sort|Highly parallelizable]] (up to {{math|''O''(log ''n'')}} using the Three Hungarians' Algorithm).<ref>{{Cite conference | doi = 10.1145/800061.808726| title = An {{math|O(n log n)}} sorting network| work = Proceedings of the fifteenth annual ACM symposium on Theory of computing | conference = [[Symposium on Theory of Computing|STOC]] '83| pages = 1–9| year = 1983| last1 = Ajtai | first1 = M. |author-link1 = Miklós Ajtai| last2 = Komlós | first2 = J. |author-link2 = János Komlós (mathematician)| last3 = Szemerédi | first3 = E. |author-link3 = Endre Szemerédi| isbn = 0-89791-099-0}}</ref>
| align="left" | Implemented in Standard Template Library (STL);<ref>http://www.sgi.com/tech/stl/stable_sort.html</ref> can be implemented as a stable sort based on stable in-place merging.<ref>http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.54.8381</ref>
 
|- align="center"
| [[Merge sort#In-place merge sort|In-Place Merge Sort]]
|[[Heapsort]]
| style="background:#ddffddffd" | {{Sort|20|<math>\mathcal{} {n \log^2 n} </math>}}
| style="background:#ddffddffd" | {{Sort|20|<math>\mathcal{} {n \log^2 n} </math>}}
| style="background:#ddffddffd" | {{Sort|20|<math>\mathcal{} {n \log^2 n} </math>}}
| style="background:#ddffddffd" | {{Sort|00|<math>\mathcal{} {1}log n</math>}}
| style="background:#ffdddddfd"| No| Yes
| style="background:#dfd" | Yes
| Selection
| Merging
| align="left" |
| align="left" | Variation of Mergesort which uses an <math>O(n \log n)</math> in-place stable merge algorithm, such as rotate merge or symmerge.
 
|- align="center"
| [[InsertionTournament sort]]
| style="background:#ddffddffd" | {{Sort|1520|<math>n \mathcal{}log n </math>}}
| style="background:#ffdddddfd" | {{Sort|2520|<math>n \mathcal{}log n^2 </math>}}
| style="background:#ffdddddfd" | {{Sort|2520|<math>n \mathcal{}log n^2 </math>}}
| style="background:#ddffddfdd" | {{Sort|0015|<math> \mathcal{} {1mvar|n}} </math>}}
| style="background:#ddffdddfd" | Yes
| style="background:#fdd" | No
| Insertion
| Selection
|align=left|O(''n'' + ''d''), where ''d'' is the number of [[Permutation_groups#Transpositions.2C_simple_transpositions.2C_inversions_and_sorting|inversions]]
| align="left" | An optimization of Selection Sort, which uses a tournament tree to select the min/max.
|- align="center"
| [[IntrosortTree sort]]
| style="background:#ddffddffd" | {{Sort|20|<math>\mathcal{} n \log n</math>}}
| style="background:#ddffdddfd" | {{Sort|20|<math>\mathcal{} n \log n</math>}}
| style="background:#ddffdddfd" | {{Sort|20|<math>\mathcal{} n \log n</math><wbr/>(balanced)}}
| style="background:#ffffddfdd" | {{Sort|0515|<math>\mathcal{} \log {mvar|n</math>}}}}
| style="background:#ffdddddfd"| No| Yes
| style="background:#fdd" | No
| Partitioning & Selection
| Insertion
|align="left"| Used in [[Silicon Graphics|SGI]] [[Standard Template Library|STL]] implementations
| align="left" | When using a [[self-balancing binary search tree]].
|- align="center"
| [[SelectionBlock sort]]
|style="background:#ffdddddfd"| {{Sort|2515|<math> \mathcal{} {mvar|n^2 </math>}}}}
|style="background:#ffdddddfd"| {{Sort|2520|<math>n \mathcal{}log n^2 </math>}}
|style="background:#ffdddddfd"| {{Sort|2520|<math>n \mathcal{}log n^2 </math>}}
|style="background:#ddffdddfd"| {{Sort|00|<math> \mathcal{} {1} </math>}}
|style="background:#ffffdddfd"| NoYes
|style="background:#dfd"| Yes
| Selection
| Insertion & Merging
|align=left| Stable with O(n) extra space, for example using lists.<ref>http://www.algolist.net/Algorithms/Sorting/Selection_sort</ref> Used to sort this table in Safari or other Webkit web browser.<ref>http://svn.webkit.org/repository/webkit/trunk/Source/JavaScriptCore/runtime/ArrayPrototype.cpp</ref>
|align=left| Combine a block-based {{tmath|O(n)}} in-place merge algorithm<ref>{{Cite conference | doi = 10.1007/978-3-540-79228-4_22| title = Ratio Based Stable In-Place Merging| work = Theory and Applications of Models of Computation| conference = [[International Conference on Theory and Applications of Models of Computation|TAMC]] 2008| volume = 4978| pages = 246–257| series = [[LNCS]]| year = 2008| last1 = Kim | first1 = P. S. | last2 = Kutzner | first2 = A. | isbn = 978-3-540-79227-7| citeseerx = 10.1.1.330.2641}}</ref> with a [[Merge sort#Bottom-up implementation|bottom-up merge sort]].
|- align="center"
| [[Smoothsort]]
| style="background:#dfd" | {{Sort|15|{{mvar|n}}}}
| style="background:#dfd" | {{Sort|20|<math>n \log n</math>}}
| style="background:#dfd" | {{Sort|20|<math>n \log n</math>}}
| style="background:#dfd" | {{Sort|00|1}}
| style="background:#fdd" | No
| style="background:#dfd" | Yes
| Selection
| align="left" | [[adaptive sort|Adaptive]] variant of heapsort based on the [[Leonardo number|Leonardo sequence]] instead of a [[binary heap]].
|- align="center"
| [[Timsort]]
| style="background:#ddffdddfd" | {{Sort|15|<math> \mathcal{} {mvar|n} </math>}}}
| style="background:#ddffdddfd" | {{Sort|20|<math> \mathcal{} {n \log n} </math>}}
| style="background:#ddffdddfd" | {{Sort|20|<math> \mathcal{} {n \log n} </math>}}
| style="background:#ffddddfdd" | {{Sort|15|<math> \mathcal{} {mvar|n </math>}}}}
| style="background:#ddffdddfd" | Yes
| style="background:#fdd" | No
| Insertion & Merging
| align="left" | <math>\mathcal{}Makes {''n} </math>-1'' comparisons when the data is already sorted or reverse sorted.
|- align="center"
| [[ShellPatience sortsorting]]
| style="background:#ddffdddfd" | {{Sort|15|<math>\mathcal{} {mvar|n</math>}}}}
| style="background:#ffffdddfd" | {{Sort|2320|<math>\mathcal{} n (\log n)^2</math><br /><br />or<br /><br /><math>\mathcal{} n^{3/2}</math>}}
| style="background:#ffffdddfd" | {{Sort|2320|Depends on gap sequence; best known is <math>\mathcal{} n (\log n)^2</math>}}
| style="background:#ddffddfdd" | {{Sort|0015|<math>\mathcal{{mvar|n}} 1</math>}}
| style="background:#ffddddfdd" | No
| style="background:#fdd" | No
| Insertion & Selection
| align="left" | Finds all the [[longest increasing subsequence]]s in {{math|''O''(''n'' log ''n'')}}.
 
|- align="center"
| [[Cubesort]]
| style="background:#dfd" | {{Sort|15|{{mvar|n}}}}
| style="background:#dfd" | {{Sort|20|<math>n \log n</math>}}
| style="background:#dfd" | {{Sort|20|<math>n \log n</math>}}
| style="background:#fdd" | {{Sort|15|{{mvar|n}}}}
| style="background:#dfd" | Yes
| style="background:#fdd" | No
| Insertion
| align="left" | Makes <math>n-1</math> comparisons when the data is already sorted or reverse sorted.
|align=left|
 
|- align="center"
| [[Bubble sortQuicksort]]
|style="background:#ddffddffd"| {{Sort|1520|<math>n \mathcal{}log n</math>}}
|style="background:#ffdddddfd"| {{Sort|2520|<math>n \mathcal{}log n^2</math>}}
|style="background:#ffddddfdd"| {{Sort|25|<math>\mathcal{} n^2</math>}}
|style="background:#ddffddffd"| {{Sort|0005|<math>\mathcal{}log {1}n</math>}}
|style="background:#ddffddfdd"| Yes <!-- Dispute earlier No. Equal values are never swapped, so they never get out of order -->
|style="background:#dfd"| Yes
| Exchanging
| Partitioning
|align=left| Tiny code size
|align="left"| Quicksort can be done in-place with {{math|''O''(log ''n'')}} stack space.<ref>{{cite book|last=Sedgewick|first=Robert|author-link=Robert Sedgewick (computer scientist)|title=Algorithms In C: Fundamentals, Data Structures, Sorting, Searching, Parts 1-4|url=https://books.google.com/books?id=ylAETlep0CwC|access-date=27 November 2012|edition=3|date=1 September 1998|publisher=Pearson Education|isbn=978-81-317-1291-7}}</ref><ref name=sedgewickQsortPaper>{{Cite journal | last1 = Sedgewick | first1 = R. | author-link1 = Robert Sedgewick (computer scientist)| title = Implementing Quicksort programs | doi = 10.1145/359619.359631 | journal = [[Comm. ACM]] | volume = 21 | issue = 10 | pages = 847–857 | year = 1978 | s2cid = 10020756 }}</ref>
|- align="center"
| [[Introsort#fluxsort|Fluxsort]]
|[[Binary tree sort]]
| style="background:#ddffdddfd" | {{Sort|15|<math>\mathcal{} {mvar|n</math>}}}}
| style="background:#ddffdddfd" | {{Sort|20|<math>\mathcal{} {n \log n} </math>}}
| style="background:#ddffdddfd" | {{Sort|20|<math>\mathcal{} {n \log n} </math>}}
| style="background:#ffddddfdd" | {{Sort|15|<math>\mathcal{} {mvar|n </math>}}}}
| style="background:#ddffdddfd" | Yes
| style="background:#fdd" | No
| Insertion
| Partitioning & Merging
|align="left"| When using a [[self-balancing binary search tree]]
| align="left" | An adaptive branchless stable introsort.
|- align="center"
| [[Introsort#fluxsort|Crumsort]]
|[[Cycle sort]]
| style="background:#dfd" | {{Sort|5015|{{mvar|&mdash;n}}}}
| style="background:#ffdddddfd" | {{Sort|2520|<math>n \mathcal{}log n^2 </math>}}
| style="background:#ffdddddfd" | {{Sort|2520|<math>n \mathcal{}log n^2 </math>}}
|style="background:#ddffddffd"| {{Sort|0005|<math>\mathcal{} {1}log n</math>}}
| style="background:#ffddddfdd" | No
| style="background:#dfd" | Yes
| Partitioning & Merging
| align="left" | An in-place, but unstable variant of Fluxsort.
|- align="center"
| [[Library sort]]
| style="background:#ffd" | {{Sort|20|<math>n \log n</math>}}
| style="background:#dfd" | {{Sort|20|<math>n \log n</math>}}
| style="background:#fdd" | {{Sort|25|<math>n^2</math>}}
| style="background:#fdd" |{{Sort|15|{{mvar|n}}}}
| style="background:#fdd" | No
| style="background:#fdd" | No
| Insertion
| align="left|" In-place with|Similar theoreticallyto optimala numbergapped ofinsertion writessort.
 
|- align="center"
| [[Library sortShellsort]]
| style="background:#ffd" | {{Sort|20|<math>n \log n</math> (Ciura)}}
|{{Sort|50|&mdash;}}
| style="background:#ddffdddfd" | {{Sort|20|<math> \mathcal{} {n \log n} </math> (Ciura)}}
| style="background:#ffddddffd" | {{Sort|2523|<math> \mathcaln^{4/3}</math> (Ciura)<br /><math>n \log^2 n</math> (Pratt)}}
| style="background:#ffdddddfd" | {{Sort|1500|<math> \mathcal{} n </math>1}}
| style="background:#ddffddfdd"| Yes| No
| style="background:#dfd" | Yes
| Insertion
| Insertion
|align=left|
| align="left" | Small code size. Complexity may vary depending on gap sequence. Pratt's sequence has a worst-case of <math>O(n \log^2 n)</math>. The (Extended) Ciura sequence averages <math>O(n \log n)</math> empirically.
 
|- align="center"
| [[PatienceComb sortingsort]]
| style="background:#ffd" | {{Sort|20|<math>n \log n</math>}}
|{{Sort|50|&mdash;}}
| style="background:#fdd" | {{Sort|25|<math>n^2</math>}}
|{{Sort|50|&mdash;}}
| style="background:#ddffddfdd" | {{Sort|2025|<math>\mathcal{} n \log n^2</math>}}
| style="background:#ffdddddfd" | {{Sort|1500|<math>\mathcal{} n</math>1}}
| style="background:#ffddddfdd" | No
| style="background:#dfd" | Yes
| Insertion & Selection
| Exchanging
| align="left" | Finds all the [[longest increasing subsequence]]s within O(''n'' log ''n'')
| align="left" | Faster than bubble sort on average.
 
|- align="center"
| [[SmoothsortInsertion sort]]
| style="background:#ddffdddfd" | {{Sort|15|<math>\mathcal{} {mvar|n} </math>}}}
| style="background:#ddffddfdd" | {{Sort|2025|<math>\mathcal{} {n \log n} ^2</math>}}
| style="background:#ddffddfdd" | {{Sort|2025|<math>\mathcal{} {n \log n} ^2</math>}}
| style="background:#ddffdddfd" | {{Sort|00|<math>\mathcal{} {1} </math>}}
| style="background:#ffdddddfd"| No| Yes
| style="background:#dfd" | Yes
| Selection
| Insertion
| align="left" |An [[adaptive sort]] - <math>\mathcal{} {n} </math> comparisons when the data is already sorted, and 0 swaps.
| align="left" | {{math|''O''(''n'' + ''d'')}}, in the worst case over sequences that have ''d'' [[Inversion (discrete mathematics)|inversions]].
 
|- align="center"
| [[StrandBubble sort]]
| style="background:#ddffdddfd" | {{Sort|15|<math>\mathcal{} {mvar|n </math>}}}}
| style="background:#ffddddfdd" | {{Sort|25|<math>\mathcal{} n^2</math>}}
| style="background:#ffddddfdd" | {{Sort|25|<math>\mathcal{} n^2</math>}}
| style="background:#ffdddddfd" | {{Sort|1500|<math>\mathcal{} n</math>1}}
| style="background:#ddffdddfd" | Yes<!-- Disputed: No. Equal values are never swapped, so they never get out of order -->
| style="background:#dfd" | Yes
| Selection
| Exchanging
| align="left" |
| align="left" | Tiny code size.
 
|- align="center"
| [[TournamentCocktail shaker sort]]
| style="background:#dfd" | {{Sort|5015|{{mvar|&mdash;n}}}}
| style="background:#ddffddfdd" | {{Sort|2025|<math>\mathcal{} n \log n^2</math>}}
| style="background:#ddffddfdd" | {{Sort|2025|<math>\mathcal{} n \log n^2</math>}}
| style="background:#dfd" | {{Sort|00|1}}
|
| style="background:#dfd" | Yes
|
| style="background:#dfd" | Yes
| Selection
| Exchanging
| align="left" |
| align="left" |A bi-directional variant of Bubblesort.
 
|- align="center"
| [[CocktailGnome sort]]
| style="background:#ddffdddfd" | {{Sort|15|<math>\mathcal{} {mvar|n</math>}}}}
| style="background:#ffddddfdd" | {{Sort|25|<math>\mathcal{} n^2</math>}}
| style="background:#ffddddfdd" | {{Sort|25|<math> \mathcal{} n^2 </math>}}
| style="background:#ddffdddfd" | {{Sort|00|<math>\mathcal{} {1} </math>}}
| style="background:#ddffdddfd" | Yes
| style="background:#dfd" | Yes
| Exchanging
| Exchanging
|align=left|
| align="left" | Tiny code size.
 
|- align="center"
| [[CombOdd–even sort]]
| style="background:#ddffdddfd" | {{Sort|15|<math>\mathcal{} {mvar|n</math>}}}}
| style="background:#ddffddfdd" | {{Sort|1525|<math>\mathcal{} n \log n^2</math>}}
| style="background:#ffddddfdd" | {{Sort|25|<math> \mathcal{} n^2 </math>}}
| style="background:#ddffdddfd" | {{Sort|00|<math> \mathcal{} {1} </math>}}
| style="background:#ffdddddfd"| No| Yes
| style="background:#dfd" | Yes
| Exchanging
| Exchanging
|align="left"|Small code size
| align="left" | Can be run on parallel processors easily.
 
|- align="center"
| [[GnomeStrand sort]]
| style="background:#ddffdddfd" | {{Sort|15|<math> \mathcal{} {mvar|n </math>}}}}
| style="background:#ffddddfdd" | {{Sort|25|<math> \mathcal{} n^2 </math>}}
| style="background:#ffddddfdd" | {{Sort|25|<math> \mathcal{} n^2 </math>}}
| style="background:#ddffddfdd" | {{Sort|0015|<math> \mathcal{} {1mvar|n}} </math>}}
| style="background:#ddffdddfd" | Yes
| style="background:#fdd" | No
| Exchanging
| Selection
|align=left| Tiny code size
| align="left" |
 
|- align="center"
| [[BogosortSelection sort]]
| style="background:#ddffddfdd" | {{Sort|1525|<math> \mathcal{} n ^2</math>}}
| style="background:#ffddddfdd" | {{Sort|4525|<math> \mathcal{} n \cdot n! ^2</math>}}
| style="background:#ffddddfdd" | {{Sort|4525|<math> \mathcal{} {n \cdot n! \to \infty} ^2</math>}}
| style="background:#ddffdddfd" | {{Sort|00|<math> \mathcal{} {1} </math>}}
| style="background:#ffddddfdd" | No
| style="background:#dfd" | Yes
| Luck
| Selection
| align="left" | Randomly permute the array and check if sorted.
| align="left" | Tiny code size. Noted for its simplicity and small number of element moves. Makes exactly <math>n</math> swaps.
 
|- align="center"
| [[Exchange sort]]
| style="background:#fdd" | {{Sort|25|<math>n^2</math>}}
| style="background:#fdd" | {{Sort|25|<math>n^2</math>}}
| style="background:#fdd" | {{Sort|25|<math>n^2</math>}}
| style="background:#dfd" | {{Sort|00|1}}
| style="background:#fdd" | No
| style="background:#dfd" | Yes
| Exchanging
| align="left" | Tiny code size.
 
|- align="center"
| [[Cycle sort]]
| style="background:#fdd" | {{Sort|25|<math>n^2</math>}}
| style="background:#fdd" | {{Sort|25|<math>n^2</math>}}
| style="background:#fdd" | {{Sort|25|<math>n^2</math>}}
| style="background:#dfd" | {{Sort|00|1}}
| style="background:#fdd" | No
| style="background:#dfd" | Yes
| Selection
| align="left" | In-place with theoretically optimal number of writes.
|}
 
=== Non-comparison sorts ===
The following table describes [[integer sorting]] algorithms and other sorting algorithms that are not [[comparison sort]]s. These algorithms are not limited by the [[Big O notation|Ω(''n'' log ''n'')]] lower bound, although in the unit-cost [[random-access machine]] model described below their running times can still be effectively bounded by it.<ref>{{citation |last1=Cormen |first1=Thomas H. |author1-link=Thomas H. Cormen |last2=Leiserson |first2=Charles E. |author2-link=Charles E. Leiserson |last3=Rivest |first3=Ronald L. |author3-link=Ron Rivest |last4=Stein |first4=Clifford |author4-link=Clifford Stein|title=Introduction To Algorithms|url=https://books.google.com/books?id=NLngYyWFl_YC|edition=2nd |place=Cambridge, MA |publisher=The MIT Press |year=2001 |isbn=0-262-03293-7| page=165 |chapter=8}}</ref>
 
* Complexities below assume {{mvar|n}} items to be sorted, with keys of size {{mvar|k}}, digit size {{mvar|d}}, and {{mvar|r}} the range of numbers to be sorted.
* Many of them are based on the assumption that the key size is large enough that all entries have unique key values, and hence that {{math|''n'' ≪ 2<sup>''k''</sup>}}, where ≪ means "much less than".
* In the unit-cost [[random-access machine]] model, algorithms with running time of <math>n \cdot \frac{k}{d}</math>, such as radix sort, still take time proportional to <small>{{math|Θ(''n'' log ''n'')}}</small>, because {{mvar|n}} is limited to be not more than <math>2^\frac{k}{d}</math>, and a larger number of elements to sort would require a bigger {{mvar|k}} in order to store them in the memory.<ref>{{cite journal |first=Stefan |last=Nilsson |title=The Fastest Sorting Algorithm? |journal=[[Dr. Dobb's]] |year=2000 |url=http://www.drdobbs.com/architecture-and-design/the-fastest-sorting-algorithm/184404062 |access-date=2015-11-23 |archive-date=2019-06-08 |archive-url=https://web.archive.org/web/20190608084350/http://www.drdobbs.com/architecture-and-design/the-fastest-sorting-algorithm/184404062 |url-status=live }}</ref>
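
As a concrete illustration of these parameters, the following Python sketch of a stable [[counting sort]] (illustrative, not a library routine) uses time and memory proportional to {{math|''n'' + ''r''}}:

<syntaxhighlight lang="python">
def counting_sort(keys, r):
    """Stable counting sort of integers in range(r): O(n + r) time and space."""
    count = [0] * r
    for k in keys:              # O(n): histogram of key values
        count[k] += 1
    total = 0
    for v in range(r):          # O(r): prefix sums give each value's start index
        count[v], total = total, total + count[v]
    output = [None] * len(keys)
    for k in keys:              # O(n): place keys, preserving input order
        output[count[k]] = k
        count[k] += 1
    return output

print(counting_sort([4, 1, 3, 1, 0], r=5))  # [0, 1, 1, 3, 4]
</syntaxhighlight>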
 
{|class="wikitable sortable"
|+ Non-comparison sorts
! Name !! Best !! Average !! Worst<br /> !! Memory<br /> !! Stable !! {{math|''n'' << 2<sup>''k''</sup>}} !! Notes<br />
|- align="center"
| [[Pigeonhole sort]]
| —
|{{Sort|03|&mdash;}}
|style="background:#ddffdddfd"| <math>\;n + 2^k</math>
|style="background:#ddffdddfd"| <math>\;n + 2^k</math>
| <math>\;2^k</math>
| {{Yes}}
|style="background:#ddffdd"| Yes
| {{Yes}}
|align="left"| Cannot sort non-integers.
|
|- align="center"
| [[Bucket sort]] (uniform keys)
| —
|{{Sort|03|&mdash;}}
|style="background:#ddffdddfd"| <math>\;n+k</math>
|style="background:#ffddddfdd"| <math>\;n^2 \cdot k</math>
| <math>\;n \cdot k</math>
| {{Yes}}
|style="background:#ddffdd"| Yes
| {{No}}
|align="left"| Assumes uniform distribution of elements from the ___domain in the array.<ref name="clrs">{{Introduction to Algorithms|edition=2}}</ref>
Also cannot sort non-integers.
|- align="center"
| [[Bucket sort]] (integer keys)
| —
|{{Sort|03|&mdash;}}
|style="background:#ddffdddfd"| <math>\;n+r</math>
|style="background:#ddffdddfd"| <math>\;n+r</math>
| <math>\;n+r</math>
| {{Yes}}
|style="background:#ddffdd"| Yes
| {{Yes}}
|r is the range of numbers to be sorted.align="left"| If ''r'' =is <math>\mathcal{{tmath|O}\left( {n)}} \right)</math>, then Avgaverage RTtime =complexity is <math>\mathcal{{tmath|O}\left( {n} \right)</math>}}.<ref name="gt">{{cite book
| last1 = Goodrich | first1 = Michael T. | author1-link = Michael T. Goodrich
| last2 = Tamassia | first2 = Roberto | author2-link = Roberto Tamassia
Line 294 ⟶ 403:
| publisher = John Wiley & Sons
| title = Algorithm Design: Foundations, Analysis, and Internet Examples
| year = 2002}}</ref>
| isbn = 978-0-471-38365-9}}</ref>
|- align="center"
| [[Counting sort]]
| —
|{{Sort|03|&mdash;}}
|style="background:#ddffdddfd"| <math>\;n+r</math>
|style="background:#ddffdddfd"| <math>\;n+r</math>
| <math>\;n+r</math>
| {{Yes}}
|style="background:#ddffdd"| Yes
| {{Yes}}
|r is the range of numbers to be sorted.align="left"| If ''r'' =is <math>\mathcal{{tmath|O}\left( {n)} \right)</math>}, then Avgaverage RTtime =complexity <math>\mathcalis {{tmath|O}\left( {n)} \right)</math>}.<ref name="clrs" />
|- align="center"
| [[Radix sort#Least significant digit radix sorts|LSD Radix Sort]]
|style="background:#dfd"| <math>n</math>
|{{Sort|03|&mdash;}}
|style="background:#ddffdddfd"| <math>\;n \cdot \frac{k}{d}</math>
|style="background:#ddffdddfd"| <math>\;n \cdot \frac{k}{d}</math>
| <math>\mathcal{} n + 2^d</math>
| {{Yes}}
|style="background:#ddffdd"| Yes
| {{No}}
|align="left"|<math>\frac{k}{d}</math> recursion levels, 2<sup>''d''</sup> for count array.<ref name="clrs" /><ref name="gt" />
Unlike most distribution sorts, this can sort non-integers.
|- align="center"
| [[Radix sort#Most significant digit radix sorts|MSD Radix Sort]]
| —
|{{Sort|03|&mdash;}}
|style="background:#ddffdddfd"| <math>\;n \cdot \frac{k}{d}</math>
|style="background:#ddffdddfd"| <math>\;n \cdot \frac{k}{d}</math>
| <math>\mathcal{} n + \frac{k}{d} \cdot 2^d </math>
| {{Yes}}
|style="background:#ddffdd"| Yes
| {{No}}
|align="left"| Stable version uses an external array of size {{mvar|n}} to hold all of the bins.
Same as the LSD variant, it can sort non-integers.
|- align="center"
| [[Radix sort#Most significant digit radix sorts|MSD Radix Sort]] (in-place)
| —
|{{Sort|03|&mdash;}}
|style="background:#ddffdddfd"| <math>\;n \cdot \frac{k}{d1}</math>
|style="background:#ddffdddfd"| <math>\;n \cdot \frac{k}{d1}</math>
| <math>\frac{k}{d} \cdot 2^d1</math>
| {{No}}
|style="background:#ffdddd"| No
| {{No}}
| In-Place. k /align="left"| d=1 recursionfor levelsin-place, 2<supmath>dk/1</supmath> forrecursion levels, no count array.
|- align="center"
| [[Spreadsort]]
|style="background:#dfd"| {{mvar|n}}
|{{Sort|03|&mdash;}}
|style="background:#ddffdddfd"| <math>\;n \cdot \frac{k}{d}</math>
|style="background:#ddffdddfd"| <math>\;n \cdot \left( {\frac{k}{s} + d} \right)</math>
| <math>\;\frac{k}{d} \cdot 2^d</math>
| {{No}}
|style="background:#ffdddd"| No
| {{No}}
|align="left"| AsymptoticsAsymptotic are based on the assumption that {{math|''n'' << 2<sup>''k''</sup>}}, but the algorithm does not require this.
|- align="center"
| [[Burstsort]]
| —
|style="background:#dfd"| <math>n \cdot \frac{k}{d}</math>
|style="background:#dfd"| <math>n \cdot \frac{k}{d}</math>
| <math>n \cdot \frac{k}{d}</math>
| {{No}}
| {{No}}
|align="left"| Has better constant factor than radix sort for sorting strings. Though relies somewhat on specifics of commonly encountered strings.
|- align="center"
| [[Flashsort]]
|style="background:#dfd"| {{mvar|n}}
|style="background:#dfd"| <math>n+r</math>
|style="background:#fdd"| <math>n^2</math>
| {{mvar|n}}
| {{No}}
| {{No}}
|align="left"| Requires uniform distribution of elements from the ___domain in the array to run in linear time. If distribution is extremely skewed then it can go quadratic if underlying sort is quadratic (it is usually an insertion sort). In-place version is not stable.
|- align="center"
| [[Postman sort]]
| —
|style="background:#dfd"| <math>n \cdot \frac{k}{d}</math>
|style="background:#dfd"| <math>n \cdot \frac{k}{d}</math>
| <math>n+2^d</math>
| —
| {{No}}
|align="left"| A variation of bucket sort, which works very similarly to MSD Radix Sort. Specific to post service needs.
|- align="center"
| [[Recombinant sort]]
| style="background:#dfd" | {{Sort|25|<math>n+r</math>}}
| style="background:#dfd" | {{Sort|25|<math>n+r</math>}}
| style="background:#dfd" | {{Sort|25|<math>n+r</math>}}
| style="background:#fdd" | {{Sort|10|<math>n k</math>}}
| {{No}}
| {{No}}
| Hashing, Counting, Dynamic Programming, Multidimensional data
|}
 
[[Samplesort]] can be used to parallelize any of the non-comparison sorts, by efficiently distributing data into several buckets and then passing down sorting to several processors, with no need to merge as buckets are already sorted between each other.
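
A minimal serial Python sketch of this idea (illustrative; the independent per-bucket sorts stand in for work that would be distributed across processors):

<syntaxhighlight lang="python">
import bisect
import random

def samplesort(data, buckets=4):
    """Distribute into ordered buckets via sampled splitters, sort each
    bucket independently, and concatenate -- no merge step is needed."""
    sample = sorted(random.sample(data, min(len(data), 4 * buckets)))
    step = max(1, len(sample) // buckets)
    splitters = sample[::step][1:buckets]
    bins = [[] for _ in range(len(splitters) + 1)]
    for x in data:
        bins[bisect.bisect_left(splitters, x)].append(x)
    # Each sorted(b) below is independent and could run on its own processor.
    return [x for b in bins for x in sorted(b)]

data = [random.randrange(1000) for _ in range(100)]
assert samplesort(data) == sorted(data)
</syntaxhighlight>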
 
=== Others ===
Some algorithms are slow compared to those discussed above, such as the [[bogosort]] with unbounded run time and the [[stooge sort]] which has ''O''(''n''<sup>2.7</sup>) run time. These sorts are usually described for educational purposes to demonstrate how the run time of algorithms is estimated. The following table describes some sorting algorithms that are impractical for real-life use in traditional software contexts due to extremely poor performance or specialized hardware requirements.
 
{|class="wikitable sortable"
! Name !! Best !! Average !! Worst !! Memory !! Stable !! Comparison !! Other notes
|- align="center"
| [[Bead sort]]
|style="background:#dfd"| {{Sort|0315|{{mvar|&mdash;n}}}}
|style="background:#ffd"| {{Sort|23|{{mvar|S}}}}
| N/A
|style="background:#ffd"| {{Sort|23|{{mvar|S}}}}
| N/A
|style="background:#fdd"| {{Sort|25|<math>n^2</math>}}
|&mdash;
| {{N/A }}
| {{No}}
| align="left"| Works only with positive integers. Requires specialized hardware for it to run in guaranteed {{tmath|O(n)}} time. There is a possibility for software implementation, but running time will be {{tmath|O(S)}}, where {{mvar|S}} is the sum of all integers to be sorted; in the case of small integers, it can be considered to be linear.
|nowrap align="left"| Requires specialized hardware
|- align="center"
| [[Merge-insertion sort]]
| {{Sort|21|<math>n \log n</math><br />comparisons}}
| {{Sort|21|<math>n \log n</math><br />comparisons}}
| {{Sort|21|<math>n \log n</math><br />comparisons}}
| {{Sort|21|Varies}}
| {{No}}
| {{Yes}}
|align="left"| Makes very few comparisons worst case compared to other sorting algorithms. Mostly of theoretical interest due to implementational complexity and suboptimal data moves.
|- align="center"
| "I Can't Believe It Can Sort"<ref>{{cite arXiv |last1=Fung |first1=Stanley P. Y. |title=Is this the simplest (and most surprising) sorting algorithm ever? |eprint=2110.01111 |date=3 October 2021|class=cs.DS }}</ref>
|style="background:#fdd"| {{Sort|25|<math>n^2</math>}}
|style="background:#fdd"| {{Sort|25|<math>n^2</math>}}
|style="background:#fdd"| {{Sort|25|<math>n^2</math>}}
|style="background:#dfd"| {{Sort|00|1}}
| {{No}}
| {{Yes}}
|align="left"| Notable primarily for appearing to be an erroneous implementation of either [[Insertion Sort]] or [[#Exchange sort|Exchange Sort]].
|- align="center"
| [[Spaghetti sort|Spaghetti (Poll) sort]]
|style="background:#dfd"| {{Sort|15|{{mvar|n}}}}
|style="background:#dfd"| {{Sort|15|{{mvar|n}}}}
|style="background:#dfd"| {{Sort|15|{{mvar|n}}}}
|style="background:#fdd"| {{Sort|25|<math>n^2</math>}}<!-- space should reflect amount of spaghetti needed; one rod must be at least n units long; n rods are needed. -->
| {{Yes}}
| Polling
| align="left" | This Ais a linear-time, analog algorithm for sorting a sequence of items, requiring ''O''(''n'') stack space, and the sort is stable. This requires <math>''n</math>'' parallel processors. [[SpaghettiSee {{section link|spaghetti sort#Analysis]]}}.<!-- see talk page discussion for June 2011 -->
|- align="center"
| [[Sorting network]]
| {{Sort|06|Varies}}
| {{Sort|06|Varies}}
| {{Sort|06|Varies}}
| {{Sort|21|Varies}}
| {{Varies}} (stable sorting networks require more comparisons)
| {{Yes}}
|align="left"| Order of comparisons are set in advance based on a fixed network size.
|- align="center"
| [[Bitonic sorter]]
| {{Sort|06|<math>\log^2 n</math> parallel}}
| {{Sort|06|<math>\log^2 n</math> parallel}}
| {{Sort|06|<math>n \log^2 n</math> non-parallel}}
|style="background:#dfd"| {{Sort|00|1}}
| {{No}}
| {{Yes}}
|align="left"| An effective variation of Sorting networks. {{disputed inline|reason=I thought I heard that Batcher made odd-even merge sort to supersede bitonic.|date=June 2021}}
|- align="center"
| [[Bogosort]]
|style="background:#dfd"| {{Sort|15|{{mvar|n}}}}
|style="background:#fdd"| {{Sort|99|<math>(n\times n!)</math>}}
|style="background:#fdd"| {{Sort|99|Unbounded}}
|style="background:#dfd"| {{Sort|00|1}}
| {{No}}
| {{Yes}}
|align=left| Random shuffling. Used for example purposes only, as even the expected best-case runtime is awful.<ref name="Fun07">{{citation
| last1 = Gruber
| first1 = H.
| last2 = Holzer
| first2 = M.
| last3 = Ruepp
| first3 = O.
| contribution = Sorting the slow way: an analysis of perversely awful randomized sorting algorithms
| doi = 10.1007/978-3-540-72914-3_17
| pages = 183–197
| publisher = Springer-Verlag
| series = Lecture Notes in Computer Science
| title = 4th International Conference on Fun with Algorithms, Castiglioncello, Italy, 2007
| date = 2007
| url = http://www.hermann-gruber.com/pdf/fun07-final.pdf
| volume = 4475
| isbn = 978-3-540-72913-6
| access-date = 2020-06-27
| archive-date = 2020-09-29
| archive-url = https://web.archive.org/web/20200929161057/http://www.hermann-gruber.com/pdf/fun07-final.pdf
| url-status = live
}}.</ref>
Worst case is unbounded when using randomization, but a deterministic version guarantees <math>O(n\times n!)</math> worst case.
|- align="center"
| [[Stooge sort]]
|style="background:#fdd"| {{Sort|30|<math>n^{\log 3/\log 1.5}</math>}}
|style="background:#fdd"| {{Sort|30|<math>n^{\log 3/\log 1.5}</math>}}
|style="background:#fdd"| {{Sort|30|<math>n^{\log 3/\log 1.5}</math>}}
|style="background:#fdd"| {{Sort|15|{{mvar|n}}}}
| {{No}}
| {{Yes}}
|align="left"| Slower than most of the sorting algorithms (even naive ones) with a time complexity of {{math|1=''O''(''n''<sup>log 3 / log 1.5 </sup>) = ''O''(''n''<sup>2.7095...</sup>)}} Can be made stable, and is also a [[sorting network]].
|- align="center"
| [[Slowsort]]
|style="background:#fdd"| {{Sort|30|<math>n^{\Omega(\log n)}</math>}}
|style="background:#fdd"| {{Sort|30|<math>n^{\Omega(\log n)}</math>}}
|style="background:#fdd"| {{Sort|30|<math>n^{\Omega(\log n)}</math>}}
|style="background:#fdd"| {{Sort|15|{{mvar|n}}}}
| {{No}}
| {{Yes}}
|align="left"| A multiply and surrender algorithm, antonymous with [[divide-and-conquer algorithm]].
|- align="center"
| Franceschini's method<ref>{{Cite journal | doi = 10.1007/s00224-006-1311-1| title = Sorting Stably, in Place, with O(n log n) Comparisons and O(n) Moves| journal = Theory of Computing Systems| volume = 40| issue = 4| pages = 327–353| date = June 2007| last1 = Franceschini | first1 = G. }}</ref>
|{{Sort|03|&mdash;}}
|style="background:#dfd"| {{Sort|20|<math>n \log n</math>}}
|style="background:#dfd"| {{Sort|20|<math>n \log n</math>}}
|style="background:#dfd"| {{Sort|00|1}}
| {{Yes}} <!-- Franceschini's 2007 sort IS STABLE. Not to be confused with Franceschini's 2003 sort which is NOT STABLE -->
| {{Yes}}
|align="left"| Makes {{math|''O''(''n'')}} data moves in the worst case. Possesses ideal comparison sort asymptotic bounds but is only of theoretical interest.
|}
Theoretical computer scientists have invented other sorting algorithms that provide better than ''O''(''n'' log ''n'') time complexity assuming certain constraints, including:
* Thorup's algorithm<ref name=":0" />, a randomized [[integer sorting]] algorithm, taking {{math|''O''(''n'' log log ''n'')}} time and ''O''(''n'') space.<ref name=":0">{{Cite journal |doi=10.1006/jagm.2002.1211 |title=Randomized Sorting in O(n log log n) Time and Linear Space Using Addition, Shift, and Bit-wise Boolean Operations |journal=Journal of Algorithms |volume=42 |issue=2 |pages=205–230 |date=February 2002 |last1=Thorup |first1=M. |s2cid=9700543 |author1-link = Mikkel Thorup}}</ref>
* AHNR algorithm,<ref>{{Cite conference
 | last1 = Andersson | first1 = Arne
 | last2 = Hagerup | first2 = Torben
 | last3 = Nilsson | first3 = Stefan
 | last4 = Raman | first4 = Rajeev
 | title = Sorting in linear time?
 | book-title = Proceedings of the twenty-seventh annual ACM symposium on Theory of computing
 | publisher = ACM
 | pages = 427–436
 | year = 1995
 }}</ref> an [[integer sorting]] algorithm which runs in <math>O(n\log\log n)</math> time deterministically, and also has a randomized version which runs in linear time when words are large enough, specifically <math>w\ge (\log n)^{2+\varepsilon}</math> (where ''w'' is the word size).
* A randomized [[integer sorting]] algorithm taking <math>O\left(n \sqrt{\log \log n}\right)</math> expected time and ''O''(''n'') space.<ref>{{Cite conference |doi=10.1109/SFCS.2002.1181890 |title=Integer sorting in O(n√(log log n)) expected time and linear space |conference=The 43rd Annual IEEE [[Symposium on Foundations of Computer Science]] |pages=135–144 |year=2002 |first1=Yijie |last1=Han |last2=Thorup |first2=M. |author2-link = Mikkel Thorup |isbn=0-7695-1822-2}}</ref>
 
== Popular sorting algorithms ==
While there are a large number of sorting algorithms, in practical implementations a few algorithms predominate. Insertion sort is widely used for small data sets, while for large data sets an asymptotically efficient sort is used, primarily heapsort, merge sort, or quicksort. Efficient implementations generally use a [[hybrid algorithm]], combining an asymptotically efficient algorithm for the overall sort with insertion sort for small lists at the bottom of a recursion. Highly tuned implementations use more sophisticated variants, such as [[Timsort]] (merge sort, insertion sort, and additional logic), used in [[Android (operating system)|Android]], [[Java (programming language)|Java]], and [[Python (programming language)|Python]], and [[introsort]] (quicksort and heapsort), used (in variant forms) in some [[sort (C++)|C++ sort]] implementations and in [[.NET]].
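
The general shape of such a hybrid can be sketched as follows (an illustrative simplification, not the actual Timsort or introsort logic; the cutoff value is arbitrary):

<syntaxhighlight lang="python">
CUTOFF = 16  # illustrative threshold; real libraries tune this empirically

def hybrid_sort(a, lo=0, hi=None):
    """Quicksort on large ranges, insertion sort on small ones (in place)."""
    if hi is None:
        hi = len(a) - 1
    if hi - lo + 1 <= CUTOFF:
        for i in range(lo + 1, hi + 1):     # insertion sort of a[lo..hi]
            x, j = a[i], i - 1
            while j >= lo and a[j] > x:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = x
        return
    p = a[(lo + hi) // 2]                   # middle element as pivot
    i, j = lo, hi
    while i <= j:                           # Hoare-style partition
        while a[i] < p:
            i += 1
        while a[j] > p:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i, j = i + 1, j - 1
    hybrid_sort(a, lo, j)
    hybrid_sort(a, i, hi)
</syntaxhighlight>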
 
For more restricted data, such as numbers in a fixed interval, [[#Distribution sorts|distribution sorts]] such as counting sort or radix sort are widely used. Bubble sort and variants are rarely used in practice, but are commonly found in teaching and theoretical discussions.
 
When physically sorting objects (such as alphabetizing papers, tests or books) people intuitively generally use insertion sorts for small sets. For larger sets, people often first bucket, such as by initial letter, and multiple bucketing allows practical sorting of very large sets. Often space is relatively cheap, such as by spreading objects out on the floor or over a large area, but operations are expensive, particularly moving an object a large distance – [[locality of reference]] is important. Merge sorts are also practical for physical objects, particularly as two hands can be used, one for each list to merge, while other algorithms, such as heapsort or quicksort, are poorly suited for human use. Other algorithms, such as [[library sort]], a variant of insertion sort that leaves spaces, are also practical for physical use.
 
=== Simple sorts ===
Two of the simplest sorts are insertion sort and selection sort, both of which are efficient on small data due to low overhead, but not efficient on large data. Insertion sort is generally faster than selection sort in practice, due to fewer comparisons and good performance on almost-sorted data, and thus is preferred; however, selection sort uses fewer writes, and thus is used when write performance is a limiting factor.
==== Insertion sort ====
{{Main|Insertion sort}}
''[[Insertion sort]]'' is a simple sorting algorithm that is relatively efficient for small lists and mostly sorted lists, and is often used as part of more sophisticated algorithms. It works by taking elements from the list one by one and inserting them in their correct position into a new sorted list, similar to how one puts money in their wallet.<ref>{{cite book |last=Wirth |first=Niklaus |author-link=Niklaus Wirth |title=Algorithms & Data Structures |place=Upper Saddle River, NJ |publisher=Prentice-Hall |year=1986 |isbn=978-0130220059 |pages=76–77}}</ref> In arrays, the new list and the remaining elements can share the array's space, but insertion is expensive, requiring shifting all following elements over by one. [[Shellsort]] is a variant of insertion sort that is more efficient for larger lists.
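
The following is a minimal Python sketch of the array variant described above (function and variable names are illustrative, not taken from the cited sources):

<syntaxhighlight lang="python">
def insertion_sort(a):
    """Sort the list a in place by growing a sorted prefix."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements of the sorted prefix one slot to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key  # insert the element into its correct position
    return a
</syntaxhighlight>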
 
==== Selection sort ====
{{Main|Selection sort}}
''Selection sort'' is an [[in-place algorithm|in-place]] [[comparison sort]]. It has [[Big O notation|O]](''n''<sup>2</sup>) complexity, making it inefficient on large lists, and generally performs worse than the similar [[insertion sort]]. Selection sort is noted for its simplicity, and also has performance advantages over more complicated algorithms in certain situations.

The algorithm finds the minimum value, swaps it with the value in the first position, and repeats these steps for the remainder of the list.<ref>{{harvnb|Wirth|1986|pp=79–80}}</ref> It does no more than ''n'' swaps, and thus is useful where swapping is very expensive.
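
A minimal Python sketch of this procedure (illustrative only), showing that at most one swap is performed per position:

<syntaxhighlight lang="python">
def selection_sort(a):
    """Sort the list a in place using at most len(a) - 1 swaps."""
    for i in range(len(a) - 1):
        m = i
        # Find the index of the minimum of the unsorted suffix a[i:].
        for j in range(i + 1, len(a)):
            if a[j] < a[m]:
                m = j
        if m != i:
            a[i], a[m] = a[m], a[i]  # at most one swap per position
    return a
</syntaxhighlight>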
 
=== Efficient sorts ===
Practical general sorting algorithms are almost always based on an algorithm with average time complexity (and generally worst-case complexity) O(''n'' log ''n''), of which the most common are heapsort, merge sort, and quicksort. Each has advantages and drawbacks, with the most significant being that simple implementation of merge sort uses O(''n'') additional space, and simple implementation of quicksort has O(''n''<sup>2</sup>) worst-case complexity. These problems can be solved or ameliorated at the cost of a more complex algorithm.
 
While these algorithms are asymptotically efficient on random data, for practical efficiency on real-world data various modifications are used. First, the overhead of these algorithms becomes significant on smaller data, so often a hybrid algorithm is used, commonly switching to insertion sort once the data is small enough. Second, the algorithms often perform poorly on already sorted data or almost sorted data – these are common in real-world data and can be sorted in O(''n'') time by appropriate algorithms. Finally, they may also be [[unstable sort|unstable]], and stability is often a desirable property in a sort. Thus more sophisticated algorithms are often employed, such as [[Timsort]] (based on merge sort) or [[introsort]] (based on quicksort, falling back to heapsort).
 
==== Merge sort ====
{{Main|Merge sort}}
''Merge sort'' takes advantage of the ease of merging already sorted lists into a new sorted list. It starts by comparing every two elements (i.e., 1 with 2, then 3 with 4...) and swapping them if the first should come after the second. It then merges each of the resulting lists of two into lists of four, then merges those lists of four, and so on; until at last two lists are merged into the final sorted list.<ref>{{harvnb|Wirth|1986|pp=101–102}}</ref> Of the algorithms described here, this is the first that scales well to very large lists, because its worst-case running time is O(''n'' log ''n''). It is also easily applied to lists, not only arrays, as it only requires sequential access, not random access. However, it has additional O(''n'') space complexity and involves a large number of copies in simple implementations.
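
A bottom-up Python sketch of the scheme just described, merging runs of length 1, 2, 4, and so on (illustrative; names are arbitrary):

<syntaxhighlight lang="python">
def merge(left, right):
    """Merge two sorted lists into one sorted list (ties favour left, keeping the sort stable)."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def merge_sort(a):
    """Bottom-up merge sort: repeatedly merge pairs of adjacent runs."""
    runs = [[x] for x in a]
    while len(runs) > 1:
        # An odd run at the end is carried over to the next round unchanged.
        runs = [merge(runs[i], runs[i + 1]) if i + 1 < len(runs) else runs[i]
                for i in range(0, len(runs), 2)]
    return runs[0] if runs else []
</syntaxhighlight>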
 
Merge sort has seen a relatively recent surge in popularity for practical implementations, due to its use in the sophisticated algorithm [[Timsort]], which is used for the standard sort routine in the programming languages [[Python (programming language)|Python]]<ref>{{cite web|url=http://svn.python.org/projects/python/trunk/Objects/listsort.txt|title=Tim Peters's original description of timsort|website=python.org|access-date=14 April 2018|archive-date=22 January 2018|archive-url=https://web.archive.org/web/20180122024335/http://svn.python.org/projects/python/trunk/Objects/listsort.txt|url-status=live}}</ref> and [[Java (programming language)|Java]] (as of [[JDK7]]<ref>{{cite web|url=http://cr.openjdk.java.net/~martin/webrevs/openjdk7/timsort/raw_files/new/src/share/classes/java/util/TimSort.java|title=OpenJDK's TimSort.java|website=java.net|access-date=14 April 2018|archive-date=14 August 2011|archive-url=https://web.archive.org/web/20110814013719/http://cr.openjdk.java.net/~martin/webrevs/openjdk7/timsort/raw_files/new/src/share/classes/java/util/TimSort.java|url-status=dead}}</ref>). Merge sort itself is the standard routine in [[Perl]],<ref>{{cite web|url=http://perldoc.perl.org/functions/sort.html|title=sort – perldoc.perl.org|website=perldoc.perl.org|access-date=14 April 2018|archive-date=14 April 2018|archive-url=https://web.archive.org/web/20180414233802/http://perldoc.perl.org/functions/sort.html|url-status=live}}</ref> among others, and has been used in Java at least since 2000 in [[Java version history#J2SE 1.3|JDK1.3]].<ref name="mergesort_in_jdk13">[http://java.sun.com/j2se/1.3/docs/api/java/util/Arrays.html#sort(java.lang.Object%5B%5D) Merge sort in Java 1.3], Sun. {{Webarchive|url=https://web.archive.org/web/20090304021927/http://java.sun.com/j2se/1.3/docs/api/java/util/Arrays.html#sort(java.lang.Object%5B%5D) |date=2009-03-04 }}</ref>

==== Heapsort ====
{{Main|Heapsort}}
''Heapsort'' is a much more efficient version of [[selection sort]]. It also works by determining the largest (or smallest) element of the list, placing that at the end (or beginning) of the list, then continuing with the rest of the list, but accomplishes this task efficiently by using a data structure called a [[heap (data structure)|heap]], a special type of [[binary tree]].<ref>{{harvnb|Wirth|1986|pp=87–89}}</ref> Once the data list has been made into a heap, the root node is guaranteed to be the largest (or smallest) element. When it is removed and placed at the end of the list, the heap is rearranged so the largest element remaining moves to the root. Using the heap, finding the next largest element takes O(log ''n'') time, instead of O(''n'') for a linear scan as in simple selection sort. This allows Heapsort to run in O(''n'' log ''n'') time, and this is also the worst-case complexity.
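
The heap-based selection can be sketched in Python as follows (an illustrative max-heap version with arbitrary names):

<syntaxhighlight lang="python">
def heapsort(a):
    """Sort the list a in place: build a max-heap, then repeatedly move the root to the end."""
    n = len(a)

    def sift_down(root, end):
        # Restore the max-heap property for the subtree rooted at `root`.
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1  # pick the larger child
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    for start in range(n // 2 - 1, -1, -1):  # heapify, O(n)
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):          # extract the maximum n - 1 times
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)
    return a
</syntaxhighlight>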
 
==== Recombinant sort ====
Recombinant sort is a non-comparison-based sorting algorithm developed by Peeyush Kumar et al. in 2020. The algorithm combines bucket sort, counting sort, radix sort, hashing, and dynamic programming techniques. It employs an n-dimensional Cartesian space mapping approach consisting of two primary phases: a hashing cycle that maps elements to a multidimensional array using a special hash function, and an extraction cycle that retrieves elements in sorted order. Recombinant sort achieves O(n) time complexity for best, average, and worst cases, and can process both numerical and string data types, including mixed decimal and non-decimal numbers.<ref>{{citation |url=https://www.iieta.org/journals/isi/paper/10.18280/isi.250513 |title=Recombinant Sort: N-Dimensional Cartesian Spaced Algorithm Designed from Synergetic Combination of Hashing, Bucket, Counting and Radix Sort|date=2020 |doi=10.18280/isi.250513 |last1=Kumar |first1=Peeyush |last2=Gangal |first2=Ayushe |last3=Kumari |first3=Sunita |journal=Ingénierie des Systèmes D Information |volume=25 |issue=5 |pages=655–668 |arxiv=2107.01391 }}</ref>
 
==== Quicksort ====
{{Main|Quicksort}}
''Quicksort'' is a [[divide-and-conquer algorithm]] which relies on a ''partition'' operation: to partition an array, an element called a ''pivot'' is selected.<ref>{{harvnb|Wirth|1986|p=93}}</ref><ref>{{citation |last1=Cormen |first1=Thomas H. |author1-link=Thomas H. Cormen |last2=Leiserson |first2=Charles E. |author2-link=Charles E. Leiserson |last3=Rivest |first3=Ronald L. |author3-link=Ron Rivest |last4=Stein |first4=Clifford |author4-link=Clifford Stein |title=Introduction to Algorithms |edition=3rd |place=Cambridge, MA |publisher=The MIT Press |year=2009 |isbn=978-0262033848 |pages=171–172}}</ref> All elements smaller than the pivot are moved before it and all greater elements are moved after it. This can be done efficiently in linear time and [[in-place]]. The lesser and greater sublists are then recursively sorted. This yields an average time complexity of O(''n'' log ''n''), with low overhead, and thus this is a popular algorithm. Efficient implementations of quicksort (with in-place partitioning) are typically unstable sorts and somewhat complex but are among the fastest sorting algorithms in practice. Together with its modest O(log ''n'') space usage, quicksort is one of the most popular sorting algorithms and is available in many standard programming libraries.
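
A Python sketch of the partition-and-recurse structure (illustrative; this version uses the Lomuto partition scheme and a random pivot, one of several common pivot choices discussed below):

<syntaxhighlight lang="python">
import random

def quicksort(a, lo=0, hi=None):
    """Sort a[lo..hi] in place by partitioning around a random pivot."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    p = random.randint(lo, hi)  # a random pivot makes worst-case input unlikely
    a[p], a[hi] = a[hi], a[p]
    pivot, store = a[hi], lo
    for i in range(lo, hi):     # move elements smaller than the pivot to the left
        if a[i] < pivot:
            a[i], a[store] = a[store], a[i]
            store += 1
    a[store], a[hi] = a[hi], a[store]  # the pivot lands in its final position
    quicksort(a, lo, store - 1)
    quicksort(a, store + 1, hi)
    return a
</syntaxhighlight>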
 
The important caveat about quicksort is that its worst-case performance is O(''n''<sup>2</sup>); while this is rare, in naive implementations (choosing the first or last element as pivot) this occurs for sorted data, which is a common case. The most complex issue in quicksort is thus choosing a good pivot element, as consistently poor choices of pivots can result in drastically slower O(''n''<sup>2</sup>) performance, but good choice of pivots yields O(''n'' log ''n'') performance, which is asymptotically optimal. For example, if at each step the [[median]] is chosen as the pivot then the algorithm works in O(''n''&nbsp;log&nbsp;''n''). Finding the median, such as by the [[median of medians]] [[selection algorithm]], is however an O(''n'') operation on unsorted lists and therefore exacts significant overhead with sorting. In practice choosing a random pivot almost certainly yields O(''n''&nbsp;log&nbsp;''n'') performance.
 
If a guarantee of O(''n'' log ''n'') performance is important, there is a simple modification to achieve that. The idea, due to Musser, is to set a limit on the maximum depth of recursion.<ref>{{citation |last1=Musser |first1=David R. |title=Introspective Sorting and Selection Algorithms |journal=Software: Practice and Experience |year=1997 |volume=27 |issue=8 |pages=983–993|doi=10.1002/(SICI)1097-024X(199708)27:8<983::AID-SPE117>3.0.CO;2-# }}</ref> If that limit is exceeded, then sorting is continued using the heapsort algorithm. Musser proposed that the limit should be <math> 1 + 2 \lfloor \log_2(n) \rfloor</math>, which is approximately twice the maximum recursion depth one would expect on average with a randomly [[ordered array]].
 
==== Shellsort ====
[[File:Shell_sorting_algorithm_color_bars.svg|right|thumb|A Shellsort, different from bubble sort in that it moves elements to numerous [[Swap (computer science)|swapping positions]].]]
{{Main|Shellsort}}
''Shellsort'' was invented by [[Donald Shell]] in 1959.<ref name="Shell">{{Cite journal
|url=http://penguin.ewu.edu/cscd300/Topic/AdvSorting/p30-shell.pdf
|last=Shell
|first=D. L.
|title=A High-Speed Sorting Procedure
|journal=Communications of the ACM
|volume=2
|issue=7
|year=1959
|pages=30–32
|doi=10.1145/368370.368387
|s2cid=28572656
|access-date=2020-03-23
|archive-date=2017-08-30
|archive-url=https://web.archive.org/web/20170830020037/http://penguin.ewu.edu/cscd300/Topic/AdvSorting/p30-shell.pdf
|url-status=dead
}}</ref> It improves upon insertion sort by moving out-of-order elements more than one position at a time. The concept behind Shellsort is that insertion sort performs in {{tmath|O(kn)}} time, where ''k'' is the greatest distance between two out-of-place elements. This means that it generally performs in ''O''(''n''<sup>2</sup>), but on data that is mostly sorted, with only a few elements out of place, it performs faster. So, by first sorting elements far away, and progressively shrinking the gap between the elements to sort, the final sort computes much faster. One implementation can be described as arranging the data sequence in a two-dimensional array and then sorting the columns of the array using insertion sort.
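
A Python sketch using Shell's original gap sequence ''n''/2, ''n''/4, ..., 1 (illustrative; better gap sequences exist, as noted below):

<syntaxhighlight lang="python">
def shellsort(a):
    """Gap-insertion sort with a shrinking gap; the final pass (gap = 1) is plain insertion sort."""
    gap = len(a) // 2
    while gap > 0:
        for i in range(gap, len(a)):
            key, j = a[i], i
            # Insertion sort over the elements that are `gap` apart.
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
        gap //= 2
    return a
</syntaxhighlight>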
 
The worst-case time complexity of Shellsort is an [[open problem]] and depends on the gap sequence used, with known complexities ranging from ''O''(''n''<sup>2</sup>) to ''O''(''n''<sup>4/3</sup>) and Θ(''n'' log<sup>2</sup> ''n''). This, combined with the fact that Shellsort is [[in-place]], only needs a relatively small amount of code, and does not require use of the [[call stack]], makes it useful in situations where memory is at a premium, such as in [[embedded system]]s and [[operating system kernel]]s.
 
=== Bubble sort and variants ===
Bubble sort, and variants such as [[comb sort]] and [[cocktail sort]], are simple, highly inefficient sorting algorithms. They are frequently seen in introductory texts due to ease of analysis, but they are rarely used in practice.
 
==== Bubble sort ====
[[File:Bubblesort-edited-color.svg|thumb|right|A bubble sort, a sorting algorithm that continuously steps through a list, [[Swap (computer science)|swapping]] items until they appear in the correct order.]]
{{Main|Bubble sort}}
 
''Bubble sort'' is a simple sorting algorithm. The algorithm starts at the beginning of the data set. It compares the first two elements, and if the first is greater than the second, it swaps them. It continues doing this for each pair of adjacent elements to the end of the data set. It then starts again with the first two elements, repeating until no swaps have occurred on the last pass.<ref>{{harvnb|Wirth|1986|pp=81–82}}</ref> This algorithm's average time and worst-case performance is O(''n''<sup>2</sup>), so it is rarely used to sort large, unordered data sets. Bubble sort can be used to sort a small number of items (where its asymptotic inefficiency is not a high penalty). Bubble sort can also be used efficiently on a list of any length that is nearly sorted (that is, the elements are not significantly out of place). For example, if any number of elements are out of place by only one position (e.g. 0123546789 and 1032547698), bubble sort's exchange will get them in order on the first pass, the second pass will find all elements in order, so the sort will take only 2''n'' time.
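
A minimal Python sketch of the pass-until-no-swaps behaviour described above (names are illustrative):

<syntaxhighlight lang="python">
def bubble_sort(a):
    """Sort the list a in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(a)
    swapped = True
    while swapped:
        swapped = False
        for i in range(1, n):
            if a[i - 1] > a[i]:
                a[i - 1], a[i] = a[i], a[i - 1]
                swapped = True
        n -= 1  # the largest unsorted element has bubbled into place
    return a
</syntaxhighlight>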
 
==== Comb sort ====
{{Main|Comb sort}}
''Comb sort'' is a relatively simple sorting algorithm based on [[bubble sort]] and originally designed by Włodzimierz Dobosiewicz in 1980.<ref name=BB>{{Cite journal | doi = 10.1016/S0020-0190(00)00223-4| title = Analyzing variants of Shellsort| journal = [[Inf. Process. Lett.]]| volume = 79| issue = 5| pages = 223–227
| date = 15 September 2001| last1 = Brejová | first1 = B. }}</ref> It was later rediscovered and popularized by Stephen Lacey and Richard Box with a [[Byte Magazine|''Byte'' Magazine]] article published in April 1991. The basic idea is to eliminate ''turtles'', or small values near the end of the list, since in a bubble sort these slow the sorting down tremendously. (''Rabbits'', large values around the beginning of the list, do not pose a problem in bubble sort.) It accomplishes this by initially swapping elements that are a certain distance from one another in the array, rather than only swapping elements if they are adjacent to one another, and then shrinking the chosen distance until it is operating as a normal bubble sort. Thus, if Shellsort can be thought of as a generalized version of insertion sort that swaps elements spaced a certain distance away from one another, comb sort can be thought of as the same generalization applied to bubble sort.
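
A Python sketch of the shrinking-gap idea (illustrative; the shrink factor 1.3 is the value commonly suggested in descriptions of the algorithm):

<syntaxhighlight lang="python">
def comb_sort(a):
    """Bubble sort with a gap that shrinks by a factor of 1.3 each pass."""
    gap, swapped = len(a), True
    while gap > 1 or swapped:
        gap = max(1, int(gap / 1.3))  # shrink the comparison distance
        swapped = False
        for i in range(len(a) - gap):
            if a[i] > a[i + gap]:
                a[i], a[i + gap] = a[i + gap], a[i]
                swapped = True
    return a
</syntaxhighlight>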
 
==== Exchange sort ====
''Exchange sort'' is sometimes confused with bubble sort, although the algorithms are in fact distinct.<ref>{{Cite web|url=https://www.codingunit.com/exchange-sort-algorithm|title=Exchange Sort Algorithm|website=CodingUnit Programming Tutorials|access-date=2021-07-10|archive-date=2021-07-10|archive-url=https://web.archive.org/web/20210710111758/https://www.codingunit.com/exchange-sort-algorithm|url-status=live}}</ref><ref>{{Cite web|url=https://mathbits.com/MathBits/Java/arrays/Exchange.htm|title=Exchange Sort|website=JavaBitsNotebook.com|access-date=2021-07-10|archive-date=2021-07-10|archive-url=https://web.archive.org/web/20210710111757/https://mathbits.com/MathBits/Java/arrays/Exchange.htm|url-status=live}}</ref> Exchange sort works by comparing the first element with all elements above it, swapping where needed, thereby guaranteeing that the first element is correct for the final sort order; it then proceeds to do the same for the second element, and so on. It lacks the advantage that bubble sort has of detecting in one pass if the list is already sorted, but it can be faster than bubble sort by a constant factor (one less pass over the data to be sorted; half as many total comparisons) in worst-case situations. Like any simple O(''n''<sup>2</sup>) sort it can be reasonably fast over very small data sets, though in general [[insertion sort]] will be faster.
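
An illustrative Python sketch contrasting exchange sort with bubble sort: each outer iteration fixes position ''i'' directly rather than bubbling values along the list:

<syntaxhighlight lang="python">
def exchange_sort(a):
    """Sort the list a in place; a[i] is final after the i-th outer pass."""
    for i in range(len(a) - 1):
        for j in range(i + 1, len(a)):
            if a[j] < a[i]:
                a[i], a[j] = a[j], a[i]  # swap so a[i] holds the smallest seen so far
    return a
</syntaxhighlight>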
 
=== Distribution sorts ===
{{see also|External sorting}}
''Distribution sort'' refers to any sorting algorithm where data is distributed from their input to multiple intermediate structures which are then gathered and placed on the output. For example, both [[bucket sort]] and [[flashsort]] are distribution-based sorting algorithms. Distribution sorting algorithms can be used on a single processor, or they can be a [[distributed algorithm]], where individual subsets are separately sorted on different processors, then combined. This allows [[external sorting]] of data too large to fit into a single computer's memory.
 
==== Counting sort ====
{{Main|Counting sort}}
Counting sort is applicable when each input is known to belong to a particular set, ''S'', of possibilities. The algorithm runs in O(|''S''| + ''n'') time and O(|''S''|) memory where ''n'' is the length of the input. It works by creating an integer array of size |''S''| and using the ''i''th bin to count the occurrences of the ''i''th member of ''S'' in the input. Each input is then counted by incrementing the value of its corresponding bin. Afterward, the counting array is looped through to arrange all of the inputs in order. This sorting algorithm often cannot be used because ''S'' needs to be reasonably small for the algorithm to be efficient, but it is extremely fast and demonstrates great asymptotic behavior as ''n'' increases. It also can be modified to provide stable behavior.
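
A Python sketch for the common special case where ''S'' is the set of integers 0 to ''size'' − 1 (illustrative; parameter names are arbitrary):

<syntaxhighlight lang="python">
def counting_sort(a, size):
    """Sort non-negative integers drawn from range(size) in O(size + n) time."""
    counts = [0] * size
    for x in a:
        counts[x] += 1  # count occurrences of each possible key
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)  # emit each key as many times as it was seen
    return out
</syntaxhighlight>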
 
==== Bucket sort ====
{{Main|Bucket sort}}
Bucket sort is a [[divide-and-conquer algorithm|divide-and-conquer]] sorting algorithm that generalizes [[counting sort]] by partitioning an array into a finite number of buckets. Each bucket is then sorted individually, either using a different sorting algorithm or by recursively applying the bucket sorting algorithm. A variation of this method called the single buffered count sort is faster than quicksort.{{Citation needed|date=October 2010}}
 
A bucket sort works best when the elements of the data set are evenly distributed across all buckets. Because bucket sort must use a limited number of buckets, it is best suited to data sets of limited scope; it would be unsuitable for data with a great deal of variation, such as social security numbers.
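
A Python sketch for inputs assumed to be uniformly distributed in [0, 1) (illustrative; the per-bucket sort could itself be a recursive bucket sort):

<syntaxhighlight lang="python">
def bucket_sort(a, n_buckets=10):
    """Distribute values in [0, 1) into buckets, sort each, then concatenate."""
    buckets = [[] for _ in range(n_buckets)]
    for x in a:
        buckets[int(x * n_buckets)].append(x)  # assumes 0 <= x < 1
    out = []
    for b in buckets:
        out.extend(sorted(b))  # any sorting algorithm may be used per bucket
    return out
</syntaxhighlight>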
 
==== Radix sort ====
{{Main|Radix sort}}
''Radix sort'' is an algorithm that sorts numbers by processing individual digits. ''n'' numbers consisting of ''k'' digits each are sorted in O(''n'' · ''k'') time. Radix sort can process digits of each number either starting from the [[least significant digit]] (LSD) or starting from the [[most significant digit]] (MSD). The LSD algorithm first sorts the list by the least significant digit while preserving their relative order using a stable sort. Then it sorts them by the next digit, and so on from the least significant to the most significant, ending up with a sorted list. While the LSD radix sort requires the use of a stable sort, the MSD radix sort algorithm does not (unless stable sorting is desired). In-place MSD radix sort is not stable. It is common for the [[counting sort]] algorithm to be used internally by the radix sort. A [[hybrid algorithm|hybrid]] sorting approach, such as using [[insertion sort]] for small bins, improves the performance of radix sort significantly.
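
An LSD radix sort sketch in Python for non-negative integers (illustrative; here per-digit bucketing plays the role of the stable counting sort):

<syntaxhighlight lang="python">
def radix_sort_lsd(a, base=10):
    """Sort non-negative integers digit by digit, least significant first."""
    if not a:
        return a
    digits = len(str(max(a)))
    for d in range(digits):
        buckets = [[] for _ in range(base)]
        for x in a:
            buckets[(x // base ** d) % base].append(x)  # stable within each digit
        a = [x for bucket in buckets for x in bucket]
    return a
</syntaxhighlight>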
 
=== Timsort ===
{{Main|Timsort}}
''Timsort'' finds runs in the data, creates runs with insertion sort if necessary, and then uses merge sort to create the final sorted list. It has the same complexity (O(''n'' log ''n'')) in the average and worst cases, but with pre-sorted data it goes down to O(''n'').
 
== Memory usage patterns and index sorting ==
When the size of the array to be sorted approaches or exceeds the available primary memory, so that (much slower) disk or swap space must be employed, the memory usage pattern of a sorting algorithm becomes important, and an algorithm that might have been fairly efficient when the array fit easily in RAM may become impractical. In this scenario, the total number of comparisons becomes (relatively) less important, and the number of times sections of memory must be copied or swapped to and from the disk can dominate the performance characteristics of an algorithm. Thus, the number of passes and the localization of comparisons can be more important than the raw number of comparisons, since comparisons of nearby elements to one another happen at [[computer bus|system bus]] speed (or, with caching, even at [[Central Processing Unit|CPU]] speed), which, compared to disk speed, is virtually instantaneous.
 
For example, the popular recursive [[quicksort]] algorithm provides quite reasonable performance with adequate RAM, but due to the recursive way that it copies portions of the array it becomes much less practical when the array does not fit in RAM, because it may cause a number of slow copy or move operations to and from disk. In that scenario, another algorithm may be preferable even if it requires more total comparisons.
 
One way to work around this problem, which works well when complex records (such as in a [[relational database]]) are being sorted by a relatively small key field, is to create an index into the array and then sort the index, rather than the entire array. (A sorted version of the entire array can then be produced with one pass, reading from the index, but often even that is unnecessary, as having the sorted index is adequate.) Because the index is much smaller than the entire array, it may fit easily in memory where the entire array would not, effectively eliminating the disk-swapping problem. This procedure is sometimes called "tag sort".<ref>{{cite web|url=https://www.pcmag.com/encyclopedia_term/0%2C2542%2Ct%3Dtag+sort%26i%3D52532%2C00.asp|title=tag sort Definition from PC Magazine Encyclopedia|website=Pcmag.com|access-date=14 April 2018|archive-date=6 October 2012|archive-url=https://web.archive.org/web/20121006015634/http://www.pcmag.com/encyclopedia_term/0%2C2542%2Ct%3Dtag+sort%26i%3D52532%2C00.asp|url-status=live}}</ref>
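
The idea can be sketched in a few lines of Python (illustrative; the record layout and field names are hypothetical):

<syntaxhighlight lang="python">
# Hypothetical records: a small key field attached to a large payload.
records = [{"key": 42, "payload": "..."}, {"key": 7, "payload": "..."}]

# Sort an index into the records rather than the records themselves;
# only the small index must fit in fast memory.
index = sorted(range(len(records)), key=lambda i: records[i]["key"])

# One sequential pass over the index yields the records in sorted order.
ordered = [records[i] for i in index]
</syntaxhighlight>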
 
Another technique for overcoming the memory-size problem is using [[external sorting]]; for example, one way is to combine two algorithms such that each one's strength improves overall performance. For instance, the array might be subdivided into chunks of a size that will fit easily in RAM, the contents of each chunk sorted using an efficient algorithm (such as [[quicksort]] or [[heapsort]]), and the results merged using a ''k''-way merge similar to that used in [[merge sort]]. This is faster than performing either merge sort or quicksort over the entire list.<ref>[[Donald Knuth]], ''[[The Art of Computer Programming]], Volume 3: Sorting and Searching'', Second Edition. Addison-Wesley, 1998, {{ISBN|0-201-89685-0}}, Section 5.4: External Sorting, pp. 248–379.</ref><ref>[[Ellis Horowitz]] and [[Sartaj Sahni]], ''Fundamentals of Data Structures'', H. Freeman & Co., {{ISBN|0-7167-8042-9}}.</ref>
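
A simplified in-memory Python sketch of this chunk-then-merge scheme (illustrative; a real external sort would write each sorted chunk to disk and stream it back during the merge):

<syntaxhighlight lang="python">
import heapq
from itertools import islice

def external_sort(items, chunk_size=100_000):
    """Sort RAM-sized chunks independently, then combine them with a k-way merge."""
    it = iter(items)
    chunks = []
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            break
        chunks.append(sorted(chunk))  # stand-in for an in-RAM quicksort or heapsort
    return heapq.merge(*chunks)       # lazy k-way merge of the sorted runs
</syntaxhighlight>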
 
Techniques can also be combined. For sorting very large sets of data that vastly exceed system memory, even the index may need to be sorted using an algorithm or combination of algorithms designed to perform reasonably with [[virtual memory]], i.e., to reduce the amount of swapping required.
 
== Related algorithms ==
Related problems include [[K-sorted sequence#Algorithms|approximate sorting]] (sorting a sequence to within a certain [[rank correlation|amount]] of the correct order), [[partial sorting]] (sorting only the ''k'' smallest elements of a list, or finding the ''k'' smallest elements, but unordered) and [[selection algorithm|selection]] (computing the ''k''th smallest element). These can be solved inefficiently by a total sort, but more efficient algorithms exist, often derived by generalizing a sorting algorithm. The most notable example is [[quickselect]], which is related to [[quicksort]]. Conversely, some sorting algorithms can be derived by repeated application of a selection algorithm; quicksort and quickselect can be seen as the same pivoting move, differing only in whether one recurses on both sides (quicksort, [[divide-and-conquer algorithm|divide-and-conquer]]) or one side (quickselect, [[decrease-and-conquer]]).
 
A kind of opposite of a sorting algorithm is a [[shuffling algorithm]]. These are fundamentally different because they require a source of random numbers. Shuffling can also be implemented by a sorting algorithm, namely by a random sort: assigning a random number to each element of the list and then sorting based on the random numbers. This is generally not done in practice, however, and there is a well-known simple and efficient algorithm for shuffling: the [[Fisher–Yates shuffle]].
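
The Fisher–Yates shuffle can be sketched in Python as follows (illustrative), running in O(''n'') time, whereas a "random sort" pays for a full O(''n'' log ''n'') sort:

<syntaxhighlight lang="python">
import random

def fisher_yates_shuffle(a):
    """Uniformly shuffle the list a in place."""
    for i in range(len(a) - 1, 0, -1):
        j = random.randint(0, i)  # 0 <= j <= i, inclusive
        a[i], a[j] = a[j], a[i]
    return a
</syntaxhighlight>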
 
Sorting algorithms are ineffective for finding an order in many situations, usually when elements have no reliable comparison function (crowdsourced preferences like [[voting systems]]), when comparisons are very costly ([[group tournament ranking system|sports]]), or when it would be impossible to pairwise compare all elements for all criteria ([[ranking (information retrieval)|search engines]]). In these cases, the problem is usually referred to as ''ranking'' and the goal is to find the "best" result for some criteria according to probabilities inferred from comparisons or rankings. A common example is chess, where players are ranked with the [[Elo rating system]] and rankings are determined by a [[:Category:Tournament systems|tournament system]] instead of a sorting algorithm.
 
There are sorting algorithms for a "noisy" (potentially incorrect) comparator and sorting algorithms for a pair of "fast and dirty" (i.e. "noisy") and "clean" comparators. This can be useful when the full comparison function is costly.<ref>{{Cite conference|first1=Xingjian|last1=Bai|first2=Christian|last2=Coester|conference=NeurIPS|year=2023|title=Sorting with Predictions|page=5|url=https://proceedings.neurips.cc/paper_files/paper/2023/hash/544696ef4847c903376ed6ec58f3a703-Abstract-Conference.html}}</ref>
 
== See also ==
* {{annotated link|Collation}}
* {{annotated link|K-sorted sequence}}
* {{annotated link|Schwartzian transform}}
* {{annotated link|Search algorithm}}
* {{annotated link|Quantum sort}}
 
== References ==
{{Reflist|30em}}
 
== Further reading ==
* {{citation |last=Knuth |first=Donald E. |author-link=Donald Knuth |series=The Art of Computer Programming |volume=3 |title=Sorting and Searching |edition=2nd |place=Boston |publisher=Addison-Wesley |year=1998 |isbn=0-201-89685-0 |ref=none}}
* {{citation |last=Sedgewick |first=Robert |author-link=Robert Sedgewick (computer scientist) |chapter=Efficient Sorting by Computer: An Introduction |title=Computational Probability |___location=New York |publisher=Academic Press |year=1980 |isbn=0-12-394680-8 |pages=[https://archive.org/details/computationalpro00actu/page/101 101–130] |chapter-url=https://archive.org/details/computationalpro00actu/page/101 |ref = none}}
 
== External links ==
{{wikibooks|Algorithm implementation|Sorting|Sorting algorithms}}
{{wikibooks|A-level Mathematics|OCR/D1/Algorithms#Sorting Algorithms|Sorting algorithms}}
{{Commons category|Sort algorithms|Sorting algorithms}}
* {{webarchive |url=https://web.archive.org/web/20150303022622/http://www.sorting-algorithms.com/ |date=3 March 2015 |title=Sorting Algorithm Animations}} – Graphical illustration of how different algorithms handle different kinds of data sets.
* [https://www.iti.fh-flensburg.de/lang/algorithmen/sortieren/algoen.htm Sequential and parallel sorting algorithms] – Explanations and analyses of many sorting algorithms.
* [https://www.nist.gov/dads/ Dictionary of Algorithms, Data Structures, and Problems] – Dictionary of algorithms, techniques, common functions, and problems.
* [https://www.softpanorama.org/Algorithms/sorting.shtml Slightly Skeptical View on Sorting Algorithms] – Discusses several classic algorithms and promotes alternatives to the [[quicksort]] algorithm.
* [https://www.youtube.com/watch?v=kPRA0W1kECg&t=0 15 Sorting Algorithms in 6 Minutes (Youtube)] – Visualization and "audibilization" of 15 Sorting Algorithms in 6 Minutes.
* [https://oeis.org/A036604 A036604 sequence in OEIS database titled "Sorting numbers: minimal number of comparisons needed to sort n elements"] – Performed by [[Ford–Johnson algorithm]].
* [https://arxiv.org/pdf/2505.11927 XiSort – External merge sort with symbolic key transformation] – A variant of merge sort applied to large datasets using symbolic techniques.
* [https://github.com/farukalpay/XiSort XiSort reference implementation] – Reference C/C++ implementation of the XiSort algorithm.
* [https://www.youtube.com/watch?v=d2d0r1bArUQ&t=0 Sorting Algorithms Used on Famous Paintings (Youtube)] – Visualization of Sorting Algorithms on Many Famous Paintings.
* [https://coderslegacy.com/comparison-of-sorting-algorithms/ A Comparison of Sorting Algorithms] – Runs a series of tests of 9 of the main sorting algorithms using Python timeit and [[Google Colab]].
{{Algorithmic paradigms}}
{{sorting}}
 
{{DEFAULTSORT:Sorting Algorithm}}
[[Category:Sorting algorithms| ]]
[[Category:Data processing]]
 
[[ar:خوارزمية ترتيب]]
[[az:Sıralama alqoritmi]]
[[bg:Алгоритъм за сортиране]]
[[ca:Algorisme d'ordenació]]
[[cs:Řadicí algoritmus]]
[[da:Sorteringsalgoritme]]
[[de:Sortierverfahren]]
[[et:Sortimisalgoritm]]
[[es:Algoritmo de ordenamiento]]
[[fa:الگوریتم مرتب‌سازی]]
[[fr:Algorithme de tri]]
[[ko:정렬 알고리즘]]
[[hy:Տեսակավորման ալգորիթմ]]
[[is:Röðunarreiknirit]]
[[it:Algoritmo di ordinamento]]
[[he:מיון (מדעי המחשב)]]
[[kk:Мәліметтерді сұрыптау]]
[[ku:Algorîtmayê rêzkerdişî]]
[[lv:Datu šķirošanas algoritmi]]
[[lb:Zortéieralgorithmus]]
[[lt:Rikiavimo algoritmas]]
[[hu:Rendezés (programozás)]]
[[nl:Sorteeralgoritme]]
[[ja:ソート]]
[[no:Sorteringsalgoritme]]
[[mhr:Ойыркалымаш]]
[[pl:Sortowanie]]
[[pt:Algoritmo de ordenação]]
[[ru:Алгоритм сортировки]]
[[sk:Triediaci algoritmus]]
[[sl:Algoritmi za urejanje podatkov]]
[[sr:Алгоритми сортирања]]
[[fi:Lajittelualgoritmi]]
[[sv:Sorteringsalgoritm]]
[[ta:வரிசையாக்கப் படிமுறை]]
[[th:ขั้นตอนวิธีการเรียงลำดับ]]
[[tr:Sıralama algoritması]]
[[uk:Алгоритм сортування]]
[[vi:Thuật toán sắp xếp]]
[[zh:排序算法]]