Def (In Place)
A sorting algorithm is in place if the rearrangement of the elements happens inside the input array itself: it does not require additional memory, except for some memory of constant size.
Def (Stable)
A sorting algorithm is stable if elements with the same values appear in the output array in the same order as they appear in the input array.
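Stability is easiest to see with records sorted by a key. A small Python illustration (Python's built-in `sorted` is guaranteed stable; the letter labels are only there to make the input order visible):

```python
# (key, label) pairs; labels let us see whether equal keys keep their order.
records = [(2, "a"), (1, "b"), (2, "c"), (1, "d")]

# Sort by key only. A stable sort must keep (1,"b") before (1,"d")
# and (2,"a") before (2,"c"), since that was their input order.
by_key = sorted(records, key=lambda r: r[0])
# by_key == [(1, "b"), (1, "d"), (2, "a"), (2, "c")]
```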
Algorithm        Time                                   In Place   Stable
Insertion Sort   Worst: O(n^2); Average: O(n^2)         Yes        Yes
Merge Sort       Worst: O(n log n)                      No         Yes
RandQuick Sort   Worst: O(n^2); Average: O(n log n)     Yes        No
Fact (Merge Sort vs RandQuick Sort)
In practice, RandQuick Sort usually performs much closer to its average case than to its worst case, and it is typically faster than Merge Sort.
For problems where the time guarantee is strict, Merge Sort would be better because even in the worst case, it is still in O(n log n).
When memory is limited, RandQuick Sort would be better, as it is in place.
When previous ordering matters, RandQuick Sort is problematic because it is unstable.
Thrm (Comparison-Based Algorithms Theorem)
Any comparison-based sorting algorithm requires Omega(n log n) comparisons in the worst case. Hence, its time complexity must be in Omega(n log n).
Proof Let’s first represent the possible comparisons that might happen while sorting any permutation of n distinct numbers as a decision tree: each internal node is a comparison, and each leaf is one possible output ordering.
In the worst case, the number of comparisons a comparison-based sorting algorithm performs equals the number of levels l of the tree.
The decision tree has n! leaf nodes, one for each permutation. A tree with l levels has at most 2^l leaves. Thus 2^l >= n!, i.e. l >= log2(n!).
Since log2(n!) is in Omega(n log n), we get l in Omega(n log n).
Therefore, any comparison-based sorting algorithm requires Omega(n log n) comparisons in the worst case.
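The fact that log2(n!) is in Omega(n log n) follows from a simple bound, keeping only the larger half of the factors of n!:

```latex
\log_2(n!) \;=\; \sum_{i=1}^{n} \log_2 i
           \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} \log_2 i
           \;\ge\; \frac{n}{2}\,\log_2\frac{n}{2}
           \;\in\; \Omega(n \log n).
```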
Counting Sort
Algorithm (Counting Sort)
Counting Sort is designed for sorting integers from a small range. Assume we want to sort A[1..n], where each A[j] is an integer between 0 and M. If n is large but the range is small, there must be many elements with the same value. Counting Sort uses this observation to sort the elements by counting elements of the same value.
1. Construct a temporary array C of size M+1, where the indices of C are the values of the input array and C[i] is the number of times the value i appears in the input array.
2. Use C to identify the right position for each element of the input array: define an array S of size M+1, where S[i] is the number of elements equal to or smaller than i in A.
3. Output by placing each value of A at the index given by S, updating S after each placement, starting from the last element of A.
CountingSort(A)
  M <- the max value in A
  C = new Array(M+1)
  for j = 1 to n
    C[A[j]]++
  S = new Array(M+1)
  S[0] = C[0]
  for i = 1 to M
    S[i] = S[i-1] + C[i]
  O = new Array(n)
  for j = n to 1
    O[S[A[j]]] = A[j]
    S[A[j]]--
  return O
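The pseudocode above can be sketched in Python as follows (a minimal version for non-negative integers; the function and variable names are mine, not from the notes). Note the backward pass over the input, which is what makes the sort stable:

```python
def counting_sort(a):
    """Sort a list of non-negative integers via Counting Sort."""
    if not a:
        return []
    m = max(a)                   # largest value in the input
    c = [0] * (m + 1)            # c[v] = number of occurrences of value v
    for x in a:
        c[x] += 1
    s = [0] * (m + 1)            # s[v] = number of elements <= v
    s[0] = c[0]
    for v in range(1, m + 1):
        s[v] = s[v - 1] + c[v]
    out = [0] * len(a)
    for x in reversed(a):        # backward pass keeps equal values in order
        s[x] -= 1                # 0-indexed: decrement first, then place
        out[s[x]] = x
    return out
```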
Prop Counting sort is not in place, but is stable.
Prop The time complexity of Counting Sort is in Theta(n + M), where n is the input size and the input values range from 0 to M. Since Counting Sort is often used for problems where M is much smaller than n, the complexity is often stated as Theta(n). However, if Counting Sort is applied to problems where the range of numbers is much larger than n, then the complexity is Theta(M).
Bucket Sort
Algorithm (Bucket Sort)
Bucket Sort is mainly for sorting numbers in the interval [0, 1) that are drawn independently from a uniform distribution. The idea is to assign similar elements to the same bucket, and then sort the elements within each bucket.
BucketSort(A)
  n = A.length
  B[0,...,n-1] <- array of empty linked lists
  for i = 1 to n
    insert A[i] into list B[floor(n * A[i])]
  for i = 0 to n-1
    sort linked list B[i]
  concatenate lists B[0], ..., B[n-1] in order
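A Python sketch of the pseudocode above, assuming the inputs lie in [0, 1). The per-bucket sort is left unspecified in the pseudocode; here I use Python's built-in sort, and plain lists stand in for the linked lists:

```python
import math

def bucket_sort(a):
    """Sort numbers in [0, 1) by distributing them into n buckets."""
    n = len(a)
    if n == 0:
        return []
    buckets = [[] for _ in range(n)]
    for x in a:
        # Bucket i holds values in [i/n, (i+1)/n).
        buckets[math.floor(n * x)].append(x)
    for b in buckets:
        b.sort()                 # sort each bucket individually
    # Concatenate the buckets in order.
    return [x for b in buckets for x in b]
```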
Prop The average time complexity of Bucket Sort is in Theta(n).
Prop Bucket Sort is not in place, but it is stable, provided the sort used within each bucket is stable.
Radix Sort
Algorithm (Radix Sort)
The idea is to use a stable sort to sort the elements digit by digit, from the least significant to the most significant digit.
RadixSort(A)
  d <- maximum number of digits in a number in A
  for i = 1 to d
    stableSort(A) on digit i
Usually, Counting Sort is used inside the loop. Assuming Counting Sort is used, the time complexity is in Theta(d(n + b)), where b is the base the numbers are written in (b = 10 for decimal digits); for any fixed base this is Theta(d * n).
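A Python sketch of least-significant-digit Radix Sort for non-negative integers in base 10. Each pass distributes the numbers into ten digit buckets and concatenates them in order; appending to buckets in input order is what makes each pass stable, playing the role of the stable Counting Sort pass described above:

```python
def radix_sort(a):
    """Sort non-negative integers, one decimal digit at a time (LSD first)."""
    if not a:
        return []
    d = len(str(max(a)))             # number of digit positions to process
    for i in range(d):               # i = 0 is the least significant digit
        buckets = [[] for _ in range(10)]
        for x in a:
            digit = (x // 10**i) % 10
            buckets[digit].append(x)  # appending in order keeps the pass stable
        a = [x for b in buckets for x in b]
    return a
```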