On Thu, Jul 26, 2001 at 08:45:17PM -0400, Bernie Cosell wrote:
>
> A tricky design decision is whether to go with a sort with better
> average performance in exchange for worse worst-case performance
> [e.g., quick sort and quickersort, that are O(N^2) worst case, if I
> remember long-ago comp science classes right, but are *usually*
> better than nlogn]. Another situation like this is a list-merge sort
> which is *fantastic* if the data is nearly sorted...

No, they are not usually better than N log N. You can prove that in
general you cannot sort faster than N log N using comparisons. Period.
Quicksort is expected (don't use the term 'average' here, that's too
vague) O (N log N), but Theta (N^2) worst case.
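To make that worst case concrete, here is a minimal quicksort sketch in Python (my illustration, not from the original post); with a naive first-element pivot, already-sorted input degenerates to Theta (N^2) because every partition is maximally unbalanced:

```python
import random

def quicksort(xs):
    # Naive quicksort: expected O(N log N) on random data, but
    # Theta(N^2) when the pivot is always an extreme element --
    # e.g. a first-element pivot on already-sorted input, where
    # each partition peels off only one element.
    if len(xs) <= 1:
        return xs
    pivot = xs[0]
    smaller = [x for x in xs[1:] if x < pivot]
    larger = [x for x in xs[1:] if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

data = [random.randint(0, 999) for _ in range(200)]
assert quicksort(data) == sorted(data)
```

Randomizing the pivot (or shuffling the input first) makes the Theta (N^2) case vanishingly unlikely, which is why "expected O (N log N)" is the honest way to state it.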
Only when you have restricted domains may you be able to sort faster.
For instance, sorting N integers in the range 1 .. U can be done in
O (N + U) time and N strings from an alphabet of U letters and of
length k can be done in O (N * k + U). (Note that sorting N strings
of length k using traditional techniques takes O (N * log N * k); the
reason you usually don't see the 'k' is that 'k' is often taken to be
a constant. Of course, if we take 'k' to be a constant, I can sort
ASCII strings in O (N) time....)
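The O (N + U) integer case is just counting sort; a short sketch (my example, not Abigail's):

```python
def counting_sort(xs, U):
    # Sort N integers in the range 1..U in O(N + U) time:
    # tally each value, then emit the values in order.
    counts = [0] * (U + 1)
    for x in xs:
        counts[x] += 1          # O(N) counting pass
    out = []
    for v in range(1, U + 1):   # O(U) output pass
        out.extend([v] * counts[v])
    return out

assert counting_sort([3, 1, 4, 1, 5], 5) == [1, 1, 3, 4, 5]
```

No comparisons between elements ever happen, which is how it sidesteps the N log N comparison lower bound; the price is that it only works when the domain 1..U is known and small enough to allocate.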
Abigail