Although the worst-case complexity of QuickSort is O(N²), QuickSort is usually chosen in practical applications. Why is that?
Because on average it's the fastest comparison sort (in terms of elapsed time).
Because, in the general case, it's one of the fastest sorting algorithms.
The average asymptotic order of QuickSort is O(n log n), and it's usually more efficient than heapsort due to smaller constants (tighter loops). In fact, there is a theoretical linear-time median selection algorithm that you could use to always pick the best pivot, resulting in a worst case of O(n log n). However, normal QuickSort is usually faster than this theoretical variant.
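To make the discussion concrete, here is a minimal sketch of an in-place QuickSort using the Lomuto partition scheme with the last element as pivot (function names are illustrative, not from any particular library):

```c
#include <stddef.h>

/* Lomuto partition: moves the pivot (last element) into its final
   sorted position and returns that index. */
static size_t partition(int a[], size_t lo, size_t hi) {
    int pivot = a[hi];
    size_t i = lo;
    for (size_t j = lo; j < hi; j++) {
        if (a[j] < pivot) {
            int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            i++;
        }
    }
    int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;
    return i;
}

/* Sorts a[lo..hi], inclusive bounds. Call as quicksort(a, 0, n - 1). */
void quicksort(int a[], size_t lo, size_t hi) {
    if (lo >= hi) return;
    size_t p = partition(a, lo, hi);
    if (p > 0) quicksort(a, lo, p - 1);  /* guard against size_t underflow */
    quicksort(a, p + 1, hi);
}
```

The "tighter loop" the answer mentions is visible in `partition`: it's a single linear scan with a compare and a swap, which is easy for hardware to predict and keeps the data cache-friendly.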
To put this in perspective, consider the probability that QuickSort will finish in O(n²): it's just 1/n!, which means it will almost never encounter that bad case.
You shouldn't focus only on the worst case, and only on time complexity. It's more about average than worst case, and it's about time and space.
Quicksort:
- has average time complexity of Θ(n log n);
- can be implemented with space complexity of Θ(log n);
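The Θ(log n) space bound comes from a standard trick: recurse only into the smaller partition and loop over the larger one, so the recursion depth never exceeds log₂ n frames even on adversarial input. A sketch under that assumption (the partition helper is the usual Lomuto scheme; names are illustrative):

```c
#include <stddef.h>

static size_t partition_lomuto(int a[], size_t lo, size_t hi) {
    int pivot = a[hi];
    size_t i = lo;
    for (size_t j = lo; j < hi; j++) {
        if (a[j] < pivot) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; }
    }
    int t = a[i]; a[i] = a[hi]; a[hi] = t;
    return i;
}

/* Recurse into the smaller side, iterate on the larger side:
   each recursive call handles at most half the range, so the
   stack depth is bounded by log2(n). */
void quicksort_logspace(int a[], size_t lo, size_t hi) {
    while (lo < hi) {
        size_t p = partition_lomuto(a, lo, hi);
        if (p - lo < hi - p) {                    /* left side smaller */
            if (p > 0) quicksort_logspace(a, lo, p - 1);
            lo = p + 1;                           /* loop on right side */
        } else {                                  /* right side smaller */
            quicksort_logspace(a, p + 1, hi);
            if (p == 0) break;                    /* avoid size_t underflow */
            hi = p - 1;                           /* loop on left side */
        }
    }
}
```

Without this trick, a naive implementation can hit Θ(n) stack depth on already-sorted input.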
Also keep in mind that big-O notation doesn't take constants into account, but in practice it does make a difference if an algorithm is a few times faster. Θ(n log n) means that the algorithm executes in K·n·log(n) time, where K is a constant. Quicksort is the comparison-sort algorithm with the lowest K.
It might be worth pointing out that C does have the library function qsort(), but there's no requirement that it be implemented using an actual QuickSort; that is up to the compiler vendor.
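For reference, qsort() from `<stdlib.h>` takes a comparator returning negative, zero, or positive; a common pitfall is writing `return a - b;`, which can overflow for large ints, so a comparison-based form is safer:

```c
#include <stdlib.h>

/* Comparator for qsort(): returns <0, 0, or >0. The (a > b) - (a < b)
   idiom avoids the integer-overflow pitfall of returning a - b. */
static int cmp_int(const void *pa, const void *pb) {
    int a = *(const int *)pa;
    int b = *(const int *)pb;
    return (a > b) - (a < b);
}

/* usage: qsort(arr, n, sizeof arr[0], cmp_int); */
```

Note also that qsort() is not guaranteed to be stable, regardless of which algorithm the vendor chose.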
Whenever I hear about sorting times, I always think of the following web site, which illustrates the why perfectly: