4

What does it mean to say that

heap sort and merge sort are asymptotically optimal comparison sorts.

I know what the Big O, Big Omega ($\Omega$), and Theta ($\Theta$) notations are, and I also know why these two sorts are called comparison sorts. But I am not able to understand why they are called asymptotically optimal. Why doesn't this asymptotically optimal property also apply to quick sort?

P.S.: I had a look at this question, but I am still not clear about the definition of asymptotically optimal.

  • 0
Because the Landau symbols only give you asymptotic statements, that is, statements in the limit $n \to \infty$. (2012-08-12)

3 Answers

7

Computing a simple function that exactly characterizes an algorithm's running time on every input is difficult, so upper and lower bounds are used instead, in addition to probabilistic (average-case) methods.

The term 'asymptotic' is used because the bound is typically only valid for large $n$, as an accurate formula for small $n$ is more complicated, and not really interesting from a performance perspective.

To answer your question: one can prove that any correct comparison sort has a worst-case running time of $\Omega(n \lg n)$. Consequently, if you can show that a sorting algorithm (such as heapsort or merge sort) has an upper bound of $O(n \lg n)$ on its worst-case running time, then its running time is in fact $\Theta(n \lg n)$, which is, by definition, an asymptotically tight bound for the running time (see Cormen, Leiserson & Rivest's "Introduction to Algorithms", Chapter 2, Theorem 2.1, and the section on sorting).
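For concreteness, here is a sketch of the standard decision-tree argument behind that lower bound (not spelled out in the answer itself): any comparison sort on $n$ distinct keys can be modelled as a binary decision tree with at least $n!$ leaves, one per output permutation, and a binary tree of height $h$ has at most $2^h$ leaves, so

$$
2^{h} \;\ge\; n! \quad\Longrightarrow\quad h \;\ge\; \log_2(n!) \;\ge\; \log_2\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right) \;=\; \tfrac{n}{2}\log_2\tfrac{n}{2} \;=\; \Omega(n \lg n),
$$

where $h$ is the number of comparisons made in the worst case (floors are ignored for readability).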

Quicksort has a worst-case running time of $\Theta(n^2)$, which is larger than the above bound, so it is not asymptotically optimal.

Note, however, that quicksort has an average running time of $\Theta(n \lg n)$. (This, of course, assumes that all inputs of length $n$ are equally likely.)
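To make that contrast concrete, here is a minimal, hypothetical sketch (not code from the answer; the helper names `quicksort` and `mergesort` are my own) that counts comparisons for a first-element-pivot quicksort and a top-down merge sort on sorted versus shuffled input:

```python
import random
import sys

sys.setrecursionlimit(5000)  # naive quicksort recurses about n deep on sorted input


def quicksort(a, counter):
    """Quicksort with a deterministic first-element pivot (chosen to expose the worst case)."""
    if len(a) <= 1:
        return a
    pivot, left, right = a[0], [], []
    for x in a[1:]:
        counter[0] += 1          # one comparison of x against the pivot
        (left if x < pivot else right).append(x)
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)


def mergesort(a, counter):
    """Top-down merge sort; counts the comparisons performed while merging."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = mergesort(a[:mid], counter), mergesort(a[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1          # one comparison per merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]


n = 1000
for label, data in [("sorted", list(range(n))), ("random", random.sample(range(n), n))]:
    for sort in (quicksort, mergesort):
        c = [0]
        assert sort(list(data), c) == sorted(data)
        print(f"{sort.__name__:9s} on {label:6s} input: {c[0]:7d} comparisons")
```

On a run of this sketch, quicksort performs $n(n-1)/2 = 499{,}500$ comparisons on the sorted input, while the other three counts stay on the order of $10^4$, matching the $\Theta(n^2)$ worst case and the $\Theta(n \lg n)$ bounds above.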

  • 0
Thanks for the second paragraph, which is clear enough! (2014-11-13)
2

It means that they can't be improved upon. It's fairly easy to prove that any comparison sort must make $\Omega(n \log n)$ comparisons in the worst case, so any comparison sort which achieves $O(n \log n)$ (and hence $\Theta(n \log n)$) is asymptotically optimal.

Quicksort's worst-case running time is $O(n^2)$, and that bound can't be tightened: it is in fact $\Theta(n^2)$ in the worst case.
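To see why that bound is tight, consider (as an illustration, assuming a deterministic first-element pivot) quicksort run on an already-sorted array: every partition is maximally unbalanced, so the total number of comparisons is

$$
(n-1) + (n-2) + \cdots + 1 \;=\; \frac{n(n-1)}{2} \;=\; \Theta(n^2).
$$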

  • 0
@Geek, depends on what you average over. And worst case is what's relevant unless otherwise specified. (2012-08-11)
1

It means the following: let $A$ be a comparison-based sorting algorithm, and denote by $T_{A}(n)$ the worst-case time complexity of $A$. Then $T_{A}(n)=\Omega(n\log n)$.

That is, there exist a constant $c>0$ and an $n_{0}$ such that for every $n>n_{0}$ it holds that $T_{A}(n)\geq c\cdot n\log n$.
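As a concrete (though informal) instantiation of that definition, combine it with the decision-tree bound $T_{A}(n) \ge \log_2(n!)$: since $n! \ge (n/2)^{n/2}$ (ignoring floors), one may take, for example, $c = \tfrac{1}{4}$ and $n_{0} = 4$, because for every $n > 4$

$$
T_{A}(n) \;\ge\; \log_2(n!) \;\ge\; \frac{n}{2}\log_2\frac{n}{2} \;=\; \frac{n}{2}\bigl(\log_2 n - 1\bigr) \;\ge\; \frac{n}{4}\log_2 n ,
$$

where the last step uses $\log_2 n \ge 2$ for $n \ge 4$.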