Lower bound for comparison sorting
With the decision-tree model, we can show that the lower bound for the worst case of comparison sorting is $\Omega(n\log n)$.
The decision tree for insertion sort operating on 3 elements is shown as follows.
Suppose there are $n$ elements, the height of the decision tree is $h$, and the number of leaf nodes is $l$.
Every permutation of the input elements must appear as a leaf of the decision tree, and a binary tree of height $h$ has at most $2^h$ leaves. Therefore, we have $n! \leq l \leq 2^h$. Applying Stirling's formula, $n! \sim \sqrt{2\pi n}\; n^n\; e^{-n}$, we get $h \geq \log(n!) = \log\sqrt{2\pi} + \frac{1}{2}\log n + n\log n - n\log e + o(1) = \Omega(n\log n)$.
So, the lower bound for the worst case of comparison sorting is $\Omega(n\log n)$.
Counting sort
In the past, at least before I learned this algorithm, I thought it was unbelievable to sort an array in LINEAR time. But counting sort and radix sort show that it can be done. However, sorting in linear time has its cost! Counting sort relies on the assumption that the input numbers are between 0 and $k$, and it requires extra space. The idea is simply to count the occurrences of each value and record the counting result in another array C[i].
Part of the code is shown below.
void Countingsort(int *A, int *B, int k, int ALength)
{
    int *C = new int[k + 1];    // values range over 0..k; note new int[], not new int()
    Refresh(C, k);              // set every C[i] to 0
    for (int i = 0; i < ALength; i++) {
        C[A[i]]++;              // C[v] = number of elements equal to v
    }
    Counting(C, k);             // prefix sums: C[v] = number of elements <= v
    for (int i = ALength - 1; i >= 0; i--) {
        B[C[A[i]] - 1] = A[i];  // place A[i] at its final position in B
        C[A[i]]--;
    }
    delete[] C;
}
It’s worth noting that the final loop runs from ALength-1 down to 0. Why not from 0 to ALength-1? The backward pass makes the sort stable: numbers with the same value appear in the output array in the same order as in the input array. This is what allows counting sort to serve as the per-digit subroutine of radix sort.
When $k=\Theta(n)$, the time complexity of counting sort is $\Theta(n+k)=\Theta(n)$.
Radix sort
Radix sort is built on a stable sorting algorithm applied to each digit of the elements.
For example, given 321, 123, 312, we could sort from the most significant digit: first sort on 3, 1, 3, which puts 123 first. Then 321 and 312 are tied, so we sort them on the second most significant digit, 2 and 1. So we get 123, 312, 321.
In practice, however, we usually sort from the least significant digit.
The per-digit sort must be stable, or radix sort will not work. Given $n$ $b$-bit numbers and any positive integer $r \leq b$, radix sort correctly sorts these numbers in $\Theta((b/r)(n+2^r))$ time if the stable sort it uses takes $\Theta(n+k)$ time for inputs in the range 0 to $k$.
For radix sort, I have only worked out the code for sorting 3-digit numbers. To get the complete code, please visit my repositories: github.com.