== Usage of big O notation. "At least O(...)": clarification ==
: Hello {{Ping|Glrx}}
: I've again corrected O to Ω. Reading what has been written on this talk page, I think some clarification is needed.
: First, I hope we all agree that "A has a fundamental requirement of B" has the same meaning as "A requires B" (unless you want to do some nasty knowledge obfuscation).
: Second, the sentences "Comparison sorting algorithms have a fundamental requirement of Ω(n log n) comparisons" and "Comparison sorting algorithms have a fundamental requirement of O(n log n) comparisons" are ambiguous, since they do not say which complexity measure is meant. Let X denote a symbol in {O, Ω}. Unambiguous sentences would be "Comparison sorting algorithms require X(n log n) comparisons for worst case complexity.", "Comparison sorting algorithms require X(n log n) comparisons for best case complexity.", "Comparison sorting algorithms require X(n log n) comparisons for average complexity.", etc. (Average complexity is just the expected complexity for the uniform distribution over all instances of size n; you can have "expected complexity" for other probability distributions.) However, in complexity theory, when the complexity measure is not given explicitly, it is most of the time assumed to be the worst case complexity.
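: To make the three measures concrete, here is one standard way to write them down (the notation <math>C(A,x)</math> for the number of comparisons algorithm <math>A</math> performs on input <math>x</math> is mine, introduced only for this illustration):
: <math>W_A(n) = \max_{|x|=n} C(A,x), \qquad B_A(n) = \min_{|x|=n} C(A,x), \qquad \bar{C}_A(n) = \frac{1}{|I_n|} \sum_{|x|=n} C(A,x),</math>
: where <math>I_n</math> is the set of instances of size <math>n</math> and the average is taken under the uniform distribution, as said above.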
: Third, the choice of the complexity measure (worst case, best case, average, etc.) is independent of the choice of the asymptotic notation; you can have theorems with O and worst case complexity, with Ω and worst case complexity, with O and best case complexity, with Ω and best case complexity, with O and average complexity, with Ω and average complexity, etc. Please don't assume that worst case complexity is some kind of negation and that Ω is also some kind of negation, as when you say "dont worst-worst the notation"; it isn't a double negation that cancels out or anything similar.
: Fourth, if you want to upper bound the worst case complexity of an algorithm, you have to prove that all instances are below the bound. If you want to lower bound its worst case complexity, you only have to prove that some instance is above the bound. It's pure logic.
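: In quantifier form (my own phrasing, using the notation introduced above), for a fixed algorithm <math>A</math> with worst case complexity <math>W_A(n)</math>:
: <math>W_A(n) = O(f(n)) \iff \exists c>0\ \exists n_0\ \forall n \ge n_0\ \forall x, |x|=n:\ C(A,x) \le c f(n),</math>
: <math>W_A(n) = \Omega(f(n)) \iff \exists c>0\ \exists n_0\ \forall n \ge n_0\ \exists x, |x|=n:\ C(A,x) \ge c f(n).</math>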
: Fifth, [[User:Trimutius|Trimutius]] said "I'm not really sure what the talk is about? It is pretty obvious that in all referenced literature they use Big O Notation for a particular reason. Because Ω just isn't really important.". It is true that O is encountered much more frequently than Ω, but it isn't "Because Ω just isn't really important.". The fundamental reason is that most scientific articles present a result on the complexity of a problem; to give an upper bound on the complexity of a problem, it is sufficient to find SOME algorithm for the problem and upper bound the complexity of this algorithm. But if you want to show that the complexity of the problem is lower bounded, you have to prove that ALL possible algorithms for this problem obey this lower bound. This difference between "there exists one algorithm" and "for all algorithms" makes it much harder in general to prove lower bounds on the complexity of problems, as sketched below. The effort to lower bound the complexity of an algorithm is usually not made, since nothing can be concluded from it for the problem, which is the ultimate goal, and also since most of the time the only easy-to-prove lower bound an algorithm has is the trivial lower bound of Ω(n) (the algorithm must read the input). Most computer scientists expect that P does not equal NP, but that is extremely hard to prove because it implies proving a superpolynomial lower bound for some NP problem and thus having to deal with all possible algorithms for this problem. The lower bound for comparison sorting algorithms is really an exception, because it can be proved simply using decision trees. No non-trivial lower bound is known for ALL sorting algorithms (i.e. without the restriction to comparisons). In general, O results are much easier to prove than Ω results.
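: The same asymmetry, written for a problem <math>P</math> rather than for a single algorithm (again my own notation): an upper bound needs only one witness algorithm, while a lower bound must hold for every algorithm:
: <math>\text{complexity of } P \text{ is } O(f(n)) \iff \exists A \text{ solving } P:\ W_A(n) = O(f(n)),</math>
: <math>\text{complexity of } P \text{ is } \Omega(f(n)) \iff \forall A \text{ solving } P:\ W_A(n) = \Omega(f(n)).</math>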
: Sixth, so now consider the truth value of the four sentences: S1 = "Comparison sorting algorithms require O(n log n) comparisons for worst case complexity.", S2 = "Comparison sorting algorithms require O(n log n) comparisons for best case complexity.", S3 = "Comparison sorting algorithms require Ω(n log n) comparisons for worst case complexity." and S4 = "Comparison sorting algorithms require Ω(n log n) comparisons for best case complexity.". We can translate the mathematical asymptotic notation as follows: S1 = "ALL comparison sorting algorithms require, asymptotically, up to a constant factor, at most n log n comparisons for worst case complexity.", S2 = "ALL comparison sorting algorithms require, asymptotically, up to a constant factor, at most n log n comparisons for best case complexity.", S3 = "ALL comparison sorting algorithms require, asymptotically, up to a constant factor, at least n log n comparisons for worst case complexity." and S4 = "ALL comparison sorting algorithms require, asymptotically, up to a constant factor, at least n log n comparisons for best case complexity.". S1 and S2 are false, since you can design some very bad comparison sorting algorithm that repeats the comparison between the first and second element an arbitrarily high number of times, say n^n, before proceeding to do, for example, a merge sort and ultimately solving the sorting problem. With this poor algorithm the best and worst cases are not too far apart compared to the order of magnitude of their values, and both are way above n log n. On the other hand, S3 is true: that's the classical result using decision trees. S4 is false, since some comparison sorting algorithm may use only n-1 comparisons on already sorted input, which would be its best case. Hence the only correct sentence is "Comparison sorting algorithms require Ω(n log n) comparisons for worst case complexity.". I'll let you check by yourself that "Comparison sorting algorithms require O(n log n) comparisons." is also false for whatever complexity measure (you can always make a poor algorithm).
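: To make the counterexamples above concrete, here is a small self-contained sketch (my own code, not taken from any reference; I use an insertion sort instead of a merge sort after the wasted comparisons, just to keep it short, and the argument is unchanged). A comparison-counting insertion sort uses exactly n-1 comparisons on already sorted input, so S4 fails for it, while the deliberately wasteful variant performs more than n^n comparisons even in its best case, so S1 and S2 fail in general.
<syntaxhighlight lang="python">
import math

def insertion_sort(a, count):
    """Comparison-counting insertion sort; on already sorted input it
    performs exactly len(a) - 1 comparisons (its best case)."""
    a = list(a)
    for i in range(1, len(a)):
        j = i
        while j > 0:
            count[0] += 1                    # one comparison of adjacent elements
            if a[j - 1] <= a[j]:
                break
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return a

def wasteful_sort(a, count):
    """Deliberately bad comparison sort: burn n**n redundant comparisons of
    the first two elements, then sort normally, so even its best case is far
    above n log n."""
    a = list(a)
    n = len(a)
    if n >= 2:
        for _ in range(n ** n):
            count[0] += 1                    # compare a[0] and a[1], ignore the result
            _ = a[0] <= a[1]
    return insertion_sort(a, count)

if __name__ == "__main__":
    n = 6
    already_sorted = list(range(n))
    c_good, c_bad = [0], [0]
    insertion_sort(already_sorted, c_good)
    wasteful_sort(already_sorted, c_bad)
    print("n log2 n           ~", round(n * math.log2(n)))   # ~ 16
    print("insertion sort used ", c_good[0], "comparisons")  # n - 1 = 5
    print("wasteful sort used  ", c_bad[0], "comparisons")   # > n**n = 46656
</syntaxhighlight>
: Running it with n = 6 prints roughly 16 for n log2 n, 5 comparisons for the insertion sort on sorted input, and more than 46656 comparisons for the wasteful variant, on sorted input as well.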
: I hope I helped at least someone on this topic. Ask questions if it wasn't clear but be aware that I'm not always connected to Wikipedia so the answer may take some time. Best regards, [[User:SectionFinale|SectionFinale]] ([[User talk:SectionFinale|talk]]) 15:56, 17 December 2017 (UTC)
== NEW - release of hsort (triangulation sort) and why it matters ==