WEEK 12
7.4 Shellsort
- Define an increment sequence \(h_1 < h_2 < \cdots < h_t\) with \(h_1 = 1\)
- For \(k = t, t-1, \cdots, 1\), perform an \(h_k\)-sort in that phase
- The final pass (\(h_1 = 1\)) is an ordinary Insertion Sort
Note : An \(h_k\)-sorted file that is then \(h_{k-1}\)-sorted remains \(h_k\)-sorted.
Shell’s Increment Sequence
\[
h_t=\lfloor N/2\rfloor,\qquad h_k=\lfloor h_{k+1}/2\rfloor
\]
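The phases above, using Shell's increments, can be sketched as a nested insertion sort over elements `gap` apart (the function name `shellsort` is my own):

```c
/* Shellsort with Shell's increments: gap = N/2, N/4, ..., 1.
   Each pass is an insertion sort restricted to elements gap apart;
   the final pass (gap = 1) is a plain insertion sort. */
void shellsort(int a[], int n) {
    for (int gap = n / 2; gap > 0; gap /= 2) {
        for (int i = gap; i < n; i++) {
            int tmp = a[i];
            int j;
            for (j = i; j >= gap && a[j - gap] > tmp; j -= gap)
                a[j] = a[j - gap];   /* shift larger elements right by gap */
            a[j] = tmp;
        }
    }
}
```

Swapping in a different increment sequence only changes the outer loop; the inner insertion-sort logic stays the same.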
- [Theorem] The worst-case running time of Shellsort, using Shell’s increments, is \(\Theta( N^2 )\).
Hibbard's Increment Sequence
\[
h_k=2^k-1
\]
- [Theorem] The worst-case running time of Shellsort, using Hibbard's increments, is \(\Theta( N^{3/2} )\).
Conjecture
- \(T_{avg}(N) = O(N^{5/4})\) for Hibbard's increments
- Sedgewick's best sequence is \(\{1, 5, 19, 41, 109, \cdots \}\), in which the terms are either of the form \(9\times 4^i - 9\times 2^i + 1\) or \(4^i - 3\times 2^i + 1\). For this sequence, \(T_{avg}(N) = O(N^{7/6})\) and \(T_{worst}(N) = O(N^{4/3})\).
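The two closed forms above generate the sequence directly: negative values (small \(i\) in the second form) are simply discarded. A small sketch, with my own function name `sedgewick`:

```c
#include <stdlib.h>

static int cmp_long(const void *a, const void *b) {
    long x = *(const long *)a, y = *(const long *)b;
    return (x > y) - (x < y);
}

/* Fill out[] with the first max_terms Sedgewick increments,
   merged in ascending order from the two closed forms. */
int sedgewick(long out[], int max_terms) {
    long buf[32];
    int n = 0;
    for (int i = 0; i < 8; i++) {
        long a = 9L * (1L << (2 * i)) - 9L * (1L << i) + 1; /* 9*4^i - 9*2^i + 1 */
        long b = (1L << (2 * i)) - 3L * (1L << i) + 1;      /* 4^i - 3*2^i + 1 */
        if (a > 0) buf[n++] = a;
        if (b > 0) buf[n++] = b;
    }
    qsort(buf, n, sizeof buf[0], cmp_long);
    int m = n < max_terms ? n : max_terms;
    for (int i = 0; i < m; i++) out[i] = buf[i];
    return m;
}
```

Running it reproduces the listed prefix \(1, 5, 19, 41, 109\).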
Conclusion
- Shellsort is a very simple algorithm, yet with an extremely complex analysis.
- It is good for sorting up to moderately large input (tens of thousands).
7.5 Heapsort
Algorithm 1
- Build a heap from the N items, then perform N DeleteMin operations, storing each deleted item in a second array.
\[
T(N)=O(N\log N)
\]
- The space requirement is doubled.
Algorithm 2
- Sort in place: build a max heap, then repeatedly swap the root with the last element of the heap and percolate the new root down, shrinking the heap by one each pass. No extra array is needed.
- [Theorem] The average number of comparisons used to heapsort a random permutation of N distinct items is \(2N\log N-O(N\log\log N)\).
Note : Although Heapsort gives the best average time, in practice it is slower than a version of Shellsort that uses Sedgewick’s increment sequence.
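Algorithm 2 can be sketched as follows (the function names `percolate_down` and `heapsort_inplace` are my own; the heap is 0-indexed, so the children of node \(i\) are \(2i+1\) and \(2i+2\)):

```c
/* Sift a[i] down within the first n elements of a max heap. */
static void percolate_down(int a[], int i, int n) {
    int tmp = a[i];
    for (int child; 2 * i + 1 < n; i = child) {
        child = 2 * i + 1;
        if (child + 1 < n && a[child + 1] > a[child])
            child++;                 /* pick the larger child */
        if (a[child] > tmp)
            a[i] = a[child];         /* move the hole down */
        else
            break;
    }
    a[i] = tmp;
}

void heapsort_inplace(int a[], int n) {
    for (int i = n / 2 - 1; i >= 0; i--)   /* build phase: O(N) */
        percolate_down(a, i, n);
    for (int i = n - 1; i > 0; i--) {      /* N - 1 deleteMax passes */
        int t = a[0]; a[0] = a[i]; a[i] = t;
        percolate_down(a, 0, i);
    }
}
```

Because each deleted maximum is parked in the slot the heap just vacated, the array ends up sorted in ascending order without doubling the space.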
7.6 Mergesort
- If a TmpArray is declared locally for each call of Merge, then \(S(N) = O(N\log N)\).
Analysis
\[
T(1)=O(1),\qquad T(N)=2T\left(\frac{N}{2}\right)+O(N)
\]
Divide by \(N\) and telescope (assume \(N=2^k\)):
\[
\frac{T(N)}{N}=\frac{T(N/2)}{N/2}+1,\quad
\frac{T(N/2)}{N/2}=\frac{T(N/4)}{N/4}+1,\quad
\cdots,\quad
\frac{T(N/2^{k-1})}{N/2^{k-1}}=\frac{T(1)}{1}+1
\]
Adding all \(k=\log N\) equations, the intermediate terms cancel and \(\frac{T(N)}{N}=\frac{T(1)}{1}+\log N\), so
\[
T(N)=O(N+N\log N)=O(N\log N)
\]
Note : Mergesort requires linear extra memory, and copying an array is slow. It is hardly ever used for internal sorting, but is quite useful for external sorting.
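To avoid the \(O(N\log N)\) space cost noted above, the temp array can be allocated once at the top level and shared by every call to Merge. A minimal sketch (function names `merge`, `msort`, and `mergesort_int` are my own):

```c
#include <stdlib.h>

/* Merge the sorted halves a[lo..mid] and a[mid+1..hi] via tmp. */
static void merge(int a[], int tmp[], int lo, int mid, int hi) {
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid) tmp[k++] = a[i++];
    while (j <= hi)  tmp[k++] = a[j++];
    for (k = lo; k <= hi; k++)
        a[k] = tmp[k];               /* copy back: the slow step */
}

static void msort(int a[], int tmp[], int lo, int hi) {
    if (lo >= hi) return;
    int mid = lo + (hi - lo) / 2;
    msort(a, tmp, lo, mid);
    msort(a, tmp, mid + 1, hi);
    merge(a, tmp, lo, mid, hi);
}

void mergesort_int(int a[], int n) {
    int *tmp = malloc(n * sizeof *tmp); /* one O(N) buffer, reused by all merges */
    if (!tmp) return;                   /* allocation failure: leave input unchanged */
    msort(a, tmp, 0, n - 1);
    free(tmp);
}
```

With the single shared buffer, the extra space is \(S(N)=O(N)\), which is still linear and is exactly why Mergesort is rarely the first choice for internal sorting.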