2. خان سنور Algorithm Analysis
Contents
• Basic Idea of Quick Sort
• Partitioning and Sorting
• Learning Goals
• Analysis of Best and Average Case
• Example
3. Learning Goals
• After learning this, you should:
• Be able to explain how quicksort works and what its complexity is
(worst case, best case, average case).
• Be able to compare quicksort with heapsort and mergesort.
• Be able to explain the advantages and disadvantages of quicksort,
and its applications.
4. Introduction
• The basic version of the quicksort algorithm was invented by C. A. R. Hoare
in 1960 and formally published in 1962.
• It is based on the principle of divide-and-conquer.
• Quicksort is the algorithm of choice in many situations because it is not
difficult to implement and it is a good "general purpose" sort.
• It consumes relatively few resources during execution.
5. Quicksort by Hoare (1962)
• Select a pivot (partitioning element)
• Rearrange the list so that all the elements in the positions before the pivot are smaller
than or equal to the pivot and those after the pivot are larger than or equal to the pivot
• Exchange the pivot with the last element in the first (i.e., ≤) sublist – the pivot is now
in its final position
• Sort the two sublists recursively
• Apply quicksort to sort the list 7 2 9 10 5 4
[ A[i] ≤ p | p | A[i] ≥ p ]
6. Features
• Similar to mergesort: a divide-and-conquer recursive algorithm
• One of the fastest sorting algorithms
• Average running time O(N log N)
• Worst-case running time O(N²)
7. Good Points
• It is in-place since it uses only a small auxiliary stack.
• It requires only n log(n) time, on average, to sort n items.
• It has an extremely short inner loop.
8. Bad Points
• It is recursive. Especially if recursion is not available, the implementation
is extremely complicated.
• It requires quadratic (i.e., n²) time in the worst case.
• It is fragile, i.e., a simple mistake in the implementation can go unnoticed
and cause it to perform badly.
9. Description of Quick Sort
• Quicksort works by partitioning a given array A[p . . r] into two non-empty
subarrays A[p . . q] and A[q+1 . . r] such that every key in A[p . . q] is
less than or equal to every key in A[q+1 . . r].
• Then the two subarrays are sorted by recursive calls to quicksort.
• The exact position of the partition depends on the given array, and the
index q is computed as a part of the partitioning procedure.
10. Quick Sort Design
• Follows the divide-and-conquer paradigm.
• Divide: Partition (separate) the array A[p..r] into two (possibly empty)
subarrays A[p..q–1] and A[q+1..r].
• Each element in A[p..q–1] ≤ A[q].
• A[q] ≤ each element in A[q+1..r].
• Index q is computed as part of the partitioning procedure.
• Conquer: Sort the two subarrays by recursive calls to quicksort.
• Combine: The subarrays are sorted in place – no work is needed to
combine them.
• How do the divide and combine steps of quicksort compare with those of
merge sort?
11. Quick Sort
• Like mergesort, Quicksort is also based on the divide-and-conquer
paradigm.
• But it uses this technique in a somewhat opposite manner,
as all the hard work is done before the recursive calls.
• It works as follows:
1. First, it partitions an array into two parts,
2. Then, it sorts the parts independently,
3. Finally, it combines the sorted subsequences by
a simple concatenation.
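The three steps above can be sketched in Python (a minimal illustration of the concatenation view on this slide, not the in-place algorithm the later slides develop):

```python
def quicksort(items):
    """Partition around a pivot, sort the parts, then concatenate them."""
    if len(items) <= 1:            # a list of 0 or 1 items is already sorted
        return items
    pivot = items[0]               # simple pivot choice; later slides discuss better ones
    smaller = [x for x in items[1:] if x <= pivot]   # values <= pivot
    larger = [x for x in items[1:] if x > pivot]     # values > pivot
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([7, 2, 9, 10, 5, 4]))  # [2, 4, 5, 7, 9, 10]
```

Note that this version allocates new lists at every level, so it trades the in-place property for clarity.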
12. Quick Sort (Cont.)
The quick-sort algorithm consists of the following three steps:
1. Divide: Partition the list.
• To partition the list, we first choose some element from the list for which we hope
about half the elements will come before and half after. Call this element the pivot.
• Then we partition the elements so that all those with values less than the pivot
come in one sublist and all those with greater values come in another.
2. Recursion: Recursively sort the sublists separately.
3. Conquer: Put the sorted sublists together.
13. Quick Sort Procedure
QUICKSORT (A, p, r)
If p < r then
q ← PARTITION (A, p, r)
Recursive call to QUICKSORT (A, p, q)
Recursive call to QUICKSORT (A, q + 1, r)
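A 0-indexed Python rendering of this procedure, paired with Hoare's PARTITION from the later slides, might look as follows (a sketch; names follow the pseudocode):

```python
def partition(A, p, r):
    """Hoare partition with pivot x = A[p]; returns j with A[p..j] <= x <= A[j+1..r]."""
    x = A[p]
    i, j = p - 1, r + 1
    while True:
        j -= 1
        while A[j] > x:            # repeat j <- j-1 until A[j] <= x
            j -= 1
        i += 1
        while A[i] < x:            # repeat i <- i+1 until A[i] >= x
            i += 1
        if i < j:
            A[i], A[j] = A[j], A[i]    # exchange A[i] <-> A[j]
        else:
            return j

def quicksort(A, p, r):
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q)         # note: p..q, not p..q-1, with Hoare's partition
        quicksort(A, q + 1, r)

A = [7, 2, 9, 10, 5, 4]            # the example list from slide 5
quicksort(A, 0, len(A) - 1)
print(A)  # [2, 4, 5, 7, 9, 10]
```

With Hoare's scheme the recursive calls cover A[p..q] and A[q+1..r]; the pivot is not excluded, unlike the Lomuto scheme used later.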
14. Partition – Choosing the Pivot
• First, we have to select a pivot element among the elements of the given
array, and we put this pivot into the first location of the array before
partitioning.
• Which array item should be selected as pivot?
• Somehow we have to select a pivot, and we hope that we will get a good
partitioning.
• If the items in the array are arranged randomly, we choose a pivot randomly.
• We can choose the first or last element as a pivot (it may not give a good
partitioning).
• We can use different techniques to select the pivot.
15. Partitioning The Array
PARTITION (A, p, r)
x ← A[p]
i ← p-1
j ← r+1
while TRUE do
Repeat j ← j-1
until A[j] ≤ x
Repeat i ← i+1
until A[i] ≥ x
if i < j
then exchange A[i] ↔ A[j]
else return j
[Figure: A[p..r] with pivot 5 is split by Partition into A[p..q – 1] (keys ≤ 5) and A[q+1..r] (keys ≥ 5)]
Partitioning procedure rearranges the subarrays in-place.
16. How to Place
• Partition selects the first key, A[p], as the pivot key about which the
array will be partitioned:
• Keys ≤ A[p] will be moved towards the left.
• Keys ≥ A[p] will be moved towards the right.
• The running time of the partition procedure is Θ(n), where n = r – p + 1
is the number of keys in the array.
17. Example
initially:       2 5 8 3 9 4 1 7 10 6   (pivot x = A[r] = 6)
next iteration:  2 5 8 3 9 4 1 7 10 6
next iteration:  2 5 8 3 9 4 1 7 10 6
next iteration:  2 5 8 3 9 4 1 7 10 6
next iteration:  2 5 3 8 9 4 1 7 10 6
Partition(A, p, r)
x, i := A[r], p – 1;
for j := p to r – 1 do
if A[j] ≤ x then
i := i + 1;
A[i] ↔ A[j]
A[i + 1] ↔ A[r];
return i + 1
18. Example (Continued)
next iteration:   2 5 3 8 9 4 1 7 10 6
next iteration:   2 5 3 8 9 4 1 7 10 6
next iteration:   2 5 3 4 9 8 1 7 10 6
next iteration:   2 5 3 4 1 8 9 7 10 6
next iteration:   2 5 3 4 1 8 9 7 10 6
next iteration:   2 5 3 4 1 8 9 7 10 6
after final swap: 2 5 3 4 1 6 9 7 10 8
Partition(A, p, r)
x, i := A[r], p – 1;
for j := p to r – 1 do
if A[j] ≤ x then
i := i + 1;
A[i] ↔ A[j]
A[i + 1] ↔ A[r];
return i + 1
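The trace above follows the Lomuto-style Partition shown alongside it. A direct 0-indexed Python translation (a sketch) reproduces the final state:

```python
def partition(A, p, r):
    """Lomuto partition with pivot x = A[r]; returns the pivot's final index."""
    x = A[r]
    i = p - 1
    for j in range(p, r):          # for j := p to r-1
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]    # put the pivot between the two regions
    return i + 1

A = [2, 5, 8, 3, 9, 4, 1, 7, 10, 6]
q = partition(A, 0, len(A) - 1)
print(q, A)  # 5 [2, 5, 3, 4, 1, 6, 9, 7, 10, 8]
```

The returned index 5 is the pivot's final position, matching the "after final swap" line of the trace.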
19. Partitioning
• Select the last element A[r] in the subarray A[p..r] as the pivot – the
element around which to partition.
• As the procedure executes, the array is partitioned into four (possibly
empty) regions.
1. A[p..i] — All entries in this region are ≤ pivot.
2. A[i+1..j – 1] — All entries in this region are > pivot.
3. A[r] = pivot.
4. A[j..r – 1] — Not known how they compare to the pivot.
• The above hold before each iteration of the for loop, and constitute a
loop invariant. (Region 4 is not part of the loop invariant.)
20. Correctness of Partition
• Use loop invariant.
• Initialization:
• Before first iteration
• A[p..i] and A[i+1..j – 1] are empty – Conds. 1 and 2 are satisfied
(trivially).
• r is the index of the pivot
• Cond. 3 is satisfied.
• Maintenance:
• Case 1: A[j] > x
• Increment j only.
• Loop Invariant is maintained.
Partition(A, p, r)
x, i := A[r], p – 1;
for j := p to r – 1 do
if A[j] ≤ x then
i := i + 1;
A[i] ↔ A[j]
A[i + 1] ↔ A[r];
return i + 1
21. Correctness of Partition
[Figure, Case 1 (A[j] > x): regions A[p..i] ≤ x and A[i+1..j – 1] > x; incrementing j grows the "> x" region, and the invariant still holds.]
22. Correctness of Partition
• Case 2: A[j] ≤ x
• Increment i
• Swap A[i] and A[j]
• Condition 1 is maintained.
• Increment j
• Condition 2 is maintained.
• A[r] is unaltered.
• Condition 3 is maintained.
[Figure, Case 2 (A[j] ≤ x): after incrementing i, swapping A[i] and A[j], and incrementing j, the regions A[p..i] ≤ x and A[i+1..j – 1] > x still hold.]
23. Correctness of Partition
• Termination:
• When the loop terminates, j = r, so all elements in A are partitioned into one of the
three cases:
• A[p..i] ≤ pivot
• A[i+1..j – 1] > pivot
• A[r] = pivot
• The last two lines swap A[i+1] and A[r].
• Pivot moves from the end of the array to between the two subarrays.
• Thus, procedure partition correctly performs the divide step.
24. Complexity of Partition
• PartitionTime(n) is given by the number of iterations in the for loop.
• Θ(n), where n = r – p + 1.
Partition(A, p, r)
x, i := A[r], p – 1;
for j := p to r – 1 do
if A[j] ≤ x then
i := i + 1;
A[i] ↔ A[j]
A[i + 1] ↔ A[r];
return i + 1
25. Quicksort Overview
To sort a[left...right]:
1. if left < right:
1.1. Partition a[left...right] such that:
all a[left...p-1] are less than a[p], and
all a[p+1...right] are >= a[p]
1.2. Quicksort a[left...p-1]
1.3. Quicksort a[p+1...right]
2. Terminate
26. Partitioning in Quicksort
• A key step in the Quicksort algorithm is partitioning the array
• We choose some (any) number p in the array to use as a pivot
• We partition the array into three parts:
[ numbers less than p | p | numbers greater than or equal to p ]
27. Alternative Partitioning
• Choose an array value (say, the first) to use as the pivot
• Starting from the left end, find the first element that is greater than or
equal to the pivot
• Searching backward from the right end, find the first element that is less
than the pivot
• Interchange (swap) these two elements
• Repeat, searching from where we left off, until done
28. Alternative Partitioning
To partition a[left...right]:
1. Set pivot = a[left], l = left + 1, r = right;
2. while l < r, do
2.1. while l < right & a[l] < pivot , set l = l + 1
2.2. while r > left & a[r] >= pivot , set r = r - 1
2.3. if l < r, swap a[l] and a[r]
3. Set a[left] = a[r], a[r] = pivot
4. Terminate
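A Python sketch of this alternative, pivot-first partitioning (indices follow the pseudocode; returning r, the pivot's final position, is an added convenience not in the slides):

```python
def partition(a, left, right):
    """Pivot-first partitioning: pivot = a[left]; scan l rightward, r leftward."""
    pivot = a[left]
    l, r = left + 1, right
    while l < r:
        while l < right and a[l] < pivot:     # find an element >= pivot from the left
            l += 1
        while r > left and a[r] >= pivot:     # find an element < pivot from the right
            r -= 1
        if l < r:
            a[l], a[r] = a[r], a[l]
    a[left] = a[r]                            # fill the pivot's old slot
    a[r] = pivot                              # pivot lands in its final position
    return r

a = [3, 1, 2]
print(partition(a, 0, 2), a)  # 2 [2, 1, 3]
```

After the call, everything before the returned index is less than the pivot and everything after it is greater than or equal.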
30. Analysis of Quicksort: Best Case
• Suppose each partition operation divides the array almost exactly in half
• Then the depth of the recursion is log2 n
• Because that's how many times we can halve n
• We note that:
• Each partition is linear over its subarray
• All the partitions at one level cover the array
32. Best Case Analysis
• The best thing that could happen in quicksort would be that each
partitioning stage divides the array exactly in half. In other words, the
best case occurs when the pivot is the median of the keys in A[p . . r]
every time procedure Partition is called. The procedure Partition would
then always split the array to be sorted into two equal-sized subarrays.
• If the procedure Partition produces two regions of size n/2, the
recurrence relation is then
T(n) = T(n/2) + T(n/2) + Θ(n)
= 2T(n/2) + Θ(n)
• From case 2 of Master theorem
T(n) = Θ(n lg n)
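The recurrence can also be checked numerically: with T(1) = 0 and the Θ(n) term taken as exactly n, T(n) = 2T(n/2) + n evaluates to n log2 n for powers of two (a small verification sketch):

```python
def T(n):
    """Best-case recurrence T(n) = 2*T(n/2) + n, with T(1) = 0 (n a power of two)."""
    return 0 if n == 1 else 2 * T(n // 2) + n

# For n = 2^k, the exact solution is T(n) = n * k = n * log2(n)
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k

print(T(1024))  # 10240, i.e. 1024 * log2(1024)
```

This agrees with the Θ(n lg n) bound given by case 2 of the Master theorem.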
33. Best Case Analysis
• We cut the array size in half each time
• So the depth of the recursion is log2 n
• At each level of the recursion, all the partitions at that level do work that
is linear in n
• O(log2n) * O(n) = O(n log2n)
• Hence in the best case, quicksort has time complexity O(n log2n)
• What about the worst case?
34. Worst Case Partitioning
• The worst case occurs if the given array A[1 . . n] is already sorted. The
PARTITION (A, p, r) call then always returns p, so successive calls to
partition will split arrays of length n, n-1, n-2, . . . , 2, with running
time proportional to n + (n-1) + (n-2) + . . . + 2 = [(n+2)(n-1)]/2 = Θ(n²).
The worst case also occurs if A[1 . . n] starts out in reverse order.
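The closed form quoted above can be verified directly (a small check; `worst_case_cost` is an illustrative name, not from the slides):

```python
def worst_case_cost(n):
    """Total partition cost in the worst case: n + (n-1) + ... + 2."""
    return sum(range(2, n + 1))

# The sum equals (n+2)(n-1)/2, hence Theta(n^2)
for n in range(2, 200):
    assert worst_case_cost(n) == (n + 2) * (n - 1) // 2

print(worst_case_cost(10))  # 54
```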
35. Worst Case
• In the worst case, partitioning always divides the size n array into these
three parts:
• A length one part, containing the pivot itself
• A length zero part, and
• A length n-1 part, containing everything else
• We don’t recur on the zero-length part
• Recurring on the length n-1 part requires (in the worst case) recurring to
depth n-1
37. Worst Case for Quicksort
• In the worst case, recursion may be n levels deep (for an array of size n)
• But the partitioning work done at each level is still n
• O(n) * O(n) = O(n²)
• So the worst case for quicksort is O(n²)
• When does this happen?
• There are many arrangements that could make this happen
• Here are two common cases:
• When the array is already sorted
• When the array is inversely sorted (sorted in the opposite order)
38. Typical Case for Quicksort
• If the array is sorted to begin with, quicksort is terrible: O(n²)
• It is possible to construct other bad cases
• However, quicksort is usually O(n log2 n)
• The constants are so good that quicksort is generally the faster
algorithm.
• Most real-world sorting is done by quicksort
39. Picking a Better Pivot
• Before, we picked the first element of the subarray to use as a pivot
• If the array is already sorted, this results in O(n²) behavior
• It’s no better if we pick the last element
• We could do an optimal quicksort (guaranteed O(n log n)) if we always
picked a pivot value that exactly cuts the array in half
• Such a value is called a median: half of the values in the array are larger, half are
smaller
• The easiest way to find the median is to sort the array and pick the value in the
middle (!)
40. Median of Three
• Obviously, it doesn’t make sense to sort the array in order to find the
median to use as a pivot.
• Instead, compare just three elements of our (sub)array—the first, the last,
and the middle
• Take the median (middle value) of these three as the pivot
• It’s possible (but not easy) to construct cases which will make this technique O(n²)
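A possible Python sketch of median-of-three pivot selection (the helper name and tie-breaking by index are illustrative choices, not from the slides):

```python
def median_of_three(a, left, right):
    """Return the index of the median of a[left], a[mid], a[right]."""
    mid = (left + right) // 2
    # Sort the three (value, index) pairs and take the middle one
    trio = sorted([(a[left], left), (a[mid], mid), (a[right], right)])
    return trio[1][1]

a = [9, 1, 7, 3, 5]
print(median_of_three(a, 0, len(a) - 1))  # 2, since 7 is the median of 9, 7, 5
```

The chosen index can then be swapped into the pivot position before calling the usual partition procedure.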
41. Quicksort for Small Arrays
• For very small arrays (N ≤ 20), quicksort does not perform as well as
insertion sort
• A good cutoff is around N = 10
• Switching to insertion sort for small arrays can save about 15% in the
running time
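One way to sketch this hybrid in Python (the cutoff value and the use of a Lomuto partition with last-element pivot are illustrative assumptions):

```python
import random

CUTOFF = 10  # below this size, fall back to insertion sort

def insertion_sort(a, lo, hi):
    """Insertion-sort a[lo..hi] in place."""
    for i in range(lo + 1, hi + 1):
        key, j = a[i], i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def hybrid_quicksort(a, lo, hi):
    if hi - lo + 1 <= CUTOFF:
        insertion_sort(a, lo, hi)
        return
    x, i = a[hi], lo - 1           # Lomuto partition, pivot = a[hi]
    for j in range(lo, hi):
        if a[j] <= x:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]
    q = i + 1
    hybrid_quicksort(a, lo, q - 1)
    hybrid_quicksort(a, q + 1, hi)

data = random.sample(range(1000), 100)
hybrid_quicksort(data, 0, len(data) - 1)
print(data == sorted(data))  # True
```

The recursion bottoms out in insertion sort, which is fast on tiny subarrays because of its low constant factor.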
42. Mergesort vs Quicksort
• Both run in O(n lg n)
• Mergesort – always.
• Quicksort – on average.
• Compared with quicksort, mergesort performs fewer comparisons but more
element moves.
• In Java, an element comparison is expensive but moving elements is cheap.
Therefore, mergesort is used in the standard Java library for generic
sorting.
43. Mergesort vs Quicksort
In C++, copying objects can be expensive while comparing objects often is
relatively cheap. Therefore, quicksort is the sorting routine commonly
used in C++ libraries
Note that these last two points are not really language-specific, but rather
reflect how each language is typically used.
44. Summary of Quicksort
• Best case: split in the middle – Θ(n log n)
• Worst case: sorted array! – Θ(n²)
• Average case: random arrays – Θ(n log n)
• Considered the method of choice for internal sorting of large files (n ≥
10000)
• Improvements:
• better pivot selection: median-of-three partitioning avoids the worst case on sorted files
• switch to insertion sort on small subfiles
• elimination of recursion
These combine to give a 20–25% improvement.
45. Conclusion
• Quicksort is an in-place sorting algorithm whose worst-case running time
is Θ(n²) and expected running time is Θ(n lg n), where the constants
hidden in Θ(n lg n) are small.
46. Summary of Sorting Algorithms
Algorithm       Time                 Notes
selection-sort  O(n²)                in-place; slow (good for small inputs)
insertion-sort  O(n²)                in-place; slow (good for small inputs)
quick-sort      O(n log n) expected  in-place, randomized; fastest (good for large inputs)
heap-sort       O(n log n)           in-place; fast (good for large inputs)
merge-sort      O(n log n)           sequential data access; fast (good for huge inputs)
47. Exam-Like Questions
• Briefly describe the basic idea of quicksort.
• What is the complexity of quicksort?
• Analyze the worst-case complexity by solving the recurrence relation.
• Analyze the best-case complexity by solving the recurrence relation.
• Compare quicksort with mergesort and heapsort.
• What are the advantages and disadvantages of quicksort?
• Which applications are not suitable for quicksort and why?
48. Characteristics
• Advantages
• One of the fastest algorithms on average.
• Does not need additional memory (the sorting takes place in the array –
this is called in-place processing). Compare with mergesort: mergesort
needs additional memory for merging.
• Disadvantage
• The worst-case complexity is O(N²).
49. Applications
• Commercial applications use quicksort – generally it runs fast and needs
no additional memory; this compensates for the rare occasions when it
runs in O(N²).
• Never use it in applications which require guaranteed response time:
• Life-critical (medical monitoring, life support in aircraft and spacecraft)
• Mission-critical (monitoring and control in industrial and research plants
handling dangerous materials, control for aircraft, defense, etc.)
• unless you assume the worst-case response time.
50. Comparison
• Comparison with heapsort:
• Both algorithms have O(N log N) complexity
• Quicksort runs faster (it does not maintain a heap tree)
• The speed of quicksort is not guaranteed
• Comparison with mergesort:
• Mergesort guarantees O(N log N) time; however, it requires additional memory of
size N.
• Quicksort does not require additional memory; however, the speed is not
guaranteed.
• Usually mergesort is not used for main-memory sorting, only for external-memory
sorting.
• So far, our best sorting algorithm has O(n log n) performance: can we do any better?
• In general, for comparison-based sorting, the answer is no.
51. Acknowledgment
• Introduction to Algorithms. T. H. Cormen, C. E. Leiserson, R. L. Rivest,
and C. Stein. 2nd edition, MIT Press, 2001.