3. Insertion Sort
• Adding a new element to a sorted list will keep the list sorted
if the element is inserted in the correct place
• A single element list is sorted
• Inserting a second element in the proper place keeps the list
sorted
• This is repeated until all the elements have been inserted into
the sorted part of the list
8. One Step of Insertion Sort
[Figure: one step of insertion sort]
sorted part: 3 4 7 12 14 14 20 21 33 38   next to be inserted: 10   rest: 55 9 23 28 16
The value 10 is held in temp; the sorted elements greater than 10
(38 33 21 20 14 14 12) shift one position to the right, and 10 is
inserted after the elements less than 10 (3 4 7).
9. Insertion Sort Algorithm
for i = 2 to N do
   newElement = list[ i ]
   location = i - 1
   while (location ≥ 1) and (list[location] > newElement) do
      list[location + 1] = list[location]   // shift list[location] one position to the right
      location = location - 1
   end while
   list[ location + 1 ] = newElement
end for
Note: This algorithm does not put the value being inserted back into
the list until its correct position is found
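The pseudocode translates directly to Java. A minimal sketch, using 0-based
array indices instead of the 1-based indices above (the method name
insertionSort is our own):

static void insertionSort(int[] list) {
    for (int i = 1; i < list.length; i++) {
        int newElement = list[i];      // next value to be inserted
        int location = i - 1;
        // Shift the sorted elements greater than newElement one position right
        while (location >= 0 && list[location] > newElement) {
            list[location + 1] = list[location];
            location = location - 1;
        }
        list[location + 1] = newElement;   // drop it into its correct place
    }
}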
10. Worst-Case Analysis
(This happens when the original list is in decreasing order)
• The outer loop is always done N – 1 times
• The inner loop does the most work when the next
element is smaller than all of the past elements
• On each pass, the next element is compared to all earlier
elements, giving:
Array index starts with 1:
W(N) = \sum_{i=2}^{N} (i - 1) = \sum_{k=1}^{N-1} k = \frac{N(N-1)}{2} = O(N^2)
Array index starts with 0: the sum becomes \sum_{i=1}^{N-1} i, which gives the same result.
11. Average-Case Analysis
• There are (i + 1) places where the i-th element can be added
(Note: This is true only if the array index starts with 0, instead of 1)
• If it goes in the last location, we do one comparison
• If it goes in the second last location, we do two comparisons
• If it goes in the first or second location, we do i comparisons
Comparison: (list[location] > newElement)
13. Average-Case Analysis
(Assuming the index i starts with 0)
• The average number of comparisons to insert the i-th element is:

A_i = \frac{1 + 2 + \dots + i + i}{i + 1} = \frac{1}{i+1} \sum_{p=1}^{i} p + \frac{i}{i+1} = \frac{i}{2} + 1 - \frac{1}{i+1}

• We now apply this for each of the algorithm's passes:

A(N) = \sum_{i=1}^{N-1} A_i = \sum_{i=1}^{N-1} \left( \frac{i}{2} + 1 - \frac{1}{i+1} \right) = \frac{N(N-1)}{4} + (N - 1) - (\ln N - 1) \approx \frac{N^2}{4} = O(N^2)

Note: \sum_{i=1}^{N-1} \frac{1}{i+1} = \sum_{k=2}^{N} \frac{1}{k} = \sum_{k=1}^{N} \frac{1}{k} - 1 \approx \ln N - 1
14. Insertion Sort: Analysis
• Running time analysis:
– Worst case and average case: O(N²)
– Best case: O(N), when the list is already sorted and each pass makes
only one comparison
15. Bubble Sort
• If we compare pairs of adjacent elements and none
are out of order, the list is sorted
• If any are out of order, we must swap them
to get an ordered list
• Bubble sort will make passes though the list
swapping any adjacent elements that are out of
order
16. Bubble Sort
• After the first pass, we know that the largest
element must be in the correct place
• After the second pass, we know that the second
largest element must be in the correct place
• Because of this, we can shorten each successive
pass of the comparison loop
18. Bubble Sort Algorithm
numberOfPairs = N
swappedElements = true
while (swappedElements) do
   numberOfPairs = numberOfPairs - 1
   swappedElements = false
   for i = 1 to numberOfPairs do
      if (list[ i ] > list[ i + 1 ]) then
         Swap( list[i], list[i + 1] )
         swappedElements = true
      end if
   end for
end while
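A direct Java rendering of this pseudocode, as a sketch with 0-based
indices (the method name bubbleSort is our own):

static void bubbleSort(int[] list) {
    int numberOfPairs = list.length;
    boolean swappedElements = true;
    while (swappedElements) {
        numberOfPairs = numberOfPairs - 1;
        swappedElements = false;
        for (int i = 0; i < numberOfPairs; i++) {
            if (list[i] > list[i + 1]) {
                int temp = list[i];        // Swap(list[i], list[i + 1])
                list[i] = list[i + 1];
                list[i + 1] = temp;
                swappedElements = true;
            }
        }
    }
}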
19. Best-Case Analysis
• If the elements start in sorted order, the for loop
will compare the adjacent pairs but not make any
changes
• So the swappedElements variable will still be
false and the while loop is only done once
• There are N – 1 comparisons in the best case
• Complexity is O(N)
20. Worst-Case Analysis
• In the worst case the while loop must be done as many
times as possible. This happens when the data set is in
the reverse order.
• Each pass of the for loop must make at least one swap of
the elements
• The number of comparisons will be:
W(N) = \sum_{i=1}^{N-1} (N - i) = \sum_{k=1}^{N-1} k = \frac{N(N-1)}{2} = O(N^2)
21. Average-Case Analysis
• On the first pass, we do N – 1 comparisons
• On the second pass, we do N – 2 comparisons
• On the i-th pass, we do N – i comparisons
• The number of comparisons in the first i passes, in other
words C(i), is given by:
C(i) = \sum_{k=N-i}^{N-1} k = i \cdot N - \frac{i^2 + i}{2}
22. Average-Case Analysis
• We can potentially stop after any of the (at most) N – 1
passes of the for loop
• This means that we have N – 1 possibilities and the
average case is given by
A(N) = \frac{1}{N-1} \sum_{i=1}^{N-1} C(i)
23. Average-Case Analysis
• Putting the equation for C(i) into the equation for
A(N) we get:
A(N) = \frac{1}{N-1} \sum_{i=1}^{N-1} \left( i \cdot N - \frac{i^2 + i}{2} \right) = \frac{2N^2 - N}{6} = O(N^2)
24. Selection sort
• Given an array of length n,
– Search elements 0 through n-1 and select the
smallest
• Swap it with the element in location 0
– Search elements 1 through n-1 and select the
smallest
• Swap it with the element in location 1
– Search elements 2 through n-1 and select the
smallest
• Swap it with the element in location 2
– Search elements 3 through n-1 and select the
smallest
• Swap it with the element in location 3
25. Example and analysis of selection sort
• The selection sort might swap an array element with itself; this is
harmless, and not worth checking for
• Analysis:
– The outer loop executes n - 1 times
– The inner loop executes about n/2 times on average (from n - 1 down
to 1 time)
– Work done in each inner-loop iteration is constant (a comparison and
possibly an index update)
– Time required is roughly (n - 1) * (n/2)
– You should recognize this as O(n²)
Example (array after each pass):
7 2 8 5 4
2 7 8 5 4
2 4 8 5 7
2 4 5 8 7
2 4 5 7 8
26. Code for selection sort
static void selectionSort(int[] a)
{
  int outer, inner, min;
  for (outer = 0; outer < a.length - 1; outer++)
  { // outer counts up from the first unsorted position
    min = outer;
    for (inner = outer + 1; inner < a.length; inner++)
    {
      if (a[inner] < a[min])
      {
        min = inner;
      }
      // Invariant: for all i, if outer <= i <= inner, then a[min] <= a[i]
    }
    // a[min] is least among a[outer]..a[a.length - 1]
    int temp = a[outer];
    a[outer] = a[min];
    a[min] = temp;
    // Invariant: for all i <= outer, if i < j then a[i] <= a[j]
  }
}
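For completeness, a small hypothetical driver that runs the example traced
above through this method:

public static void main(String[] args) {
    int[] a = { 7, 2, 8, 5, 4 };   // the example from the previous slide
    selectionSort(a);
    System.out.println(java.util.Arrays.toString(a));   // prints [2, 4, 5, 7, 8]
}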
64. Radix Sort
• This sort is unusual because it does not directly
compare any of the elements
• We instead create a set of buckets and repeatedly
separate the elements into the buckets
• On each pass, we look at a different part of the
elements
65. Radix Sort
• Assuming decimal elements and 10 buckets, we would put each element
into the bucket associated with its units digit
• The buckets are actually queues so the elements
are added at the end of the bucket
• At the end of the pass, the buckets are combined
in increasing order
66. Radix Sort
• On the second pass, we separate the elements
based on the “tens” digit, and on the third pass
we separate them based on the “hundreds” digit
• Each pass must make sure to process the
elements in order and to put the buckets back
together in the correct order
68. Radix Sort Example (continued)
[Figure: the list after the units-digit pass]
The units digits are already in order
Now start sorting on the tens digit
69. Radix Sort Example (continued)
[Figure: buckets and list after the tens-digit pass]
Values in the buckets are now in order
The units and tens digits are already in order
Now start sorting on the hundreds digit
70. The Algorithm to sort a set of numeric keys
shift = 1
for pass = 1 to keySize do
   for entry = 1 to N do
      bucketNumber = (list[entry] / shift) mod 10
      Append( bucket[bucketNumber], list[entry] )
   end for
   list = CombineBuckets()
   shift = shift * 10
end for
Notes:
• keySize is the number of digits of the longest key
• N is the number of elements in the list
• (list[entry] / shift) is the integer quotient; mod 10 takes the remainder
• bucketNumber lies between 0 and 9
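A Java sketch of this algorithm for non-negative integer keys, with the
bucket queues as ArrayLists and CombineBuckets inlined as the copy-back loop:

import java.util.ArrayList;
import java.util.List;

static void radixSort(int[] list, int keySize) {
    List<List<Integer>> bucket = new ArrayList<>();
    for (int b = 0; b < 10; b++) bucket.add(new ArrayList<>());  // buckets 0..9

    int shift = 1;
    for (int pass = 1; pass <= keySize; pass++) {
        for (int entry : list) {
            int bucketNumber = (entry / shift) % 10;   // digit for this pass
            bucket.get(bucketNumber).add(entry);       // append at end: queue order
        }
        int k = 0;                                     // CombineBuckets()
        for (List<Integer> q : bucket) {
            for (int value : q) list[k++] = value;     // buckets 0..9, in order
            q.clear();
        }
        shift = shift * 10;
    }
}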
71. Radix Sort Analysis
• Each element is examined once for each of the digits it
contains, so if the elements have at most M digits and
there are N elements this algorithm has order O(M*N)
• This means that the sorting time is linear in the number of elements
• Why then isn’t this the only sorting algorithm used?
72. Radix Sort Analysis
• Though this is a very time-efficient algorithm, it is not
space efficient
• If an array is used for the buckets and we have B buckets,
we would need N*B extra memory locations because it’s
possible for all of the elements to wind up in one bucket
• If linked lists are used for the buckets you have the
overhead of pointers
99. Merge-Sort Summary
Approach: divide and conquer
Time
– Most of the work is in the merging
– Total time: O(n log n)
Space:
– O(n), more space than other sorts.
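The merge-sort slides themselves are not part of this excerpt, but the merge
step that dominates the running time looks roughly like this in Java (a
sketch; the temp array is the O(n) extra space mentioned above):

// Merge the sorted ranges a[lo..mid] and a[mid+1..hi]
static void merge(int[] a, int lo, int mid, int hi) {
    int[] temp = new int[hi - lo + 1];   // O(n) auxiliary space
    int i = lo, j = mid + 1, k = 0;
    while (i <= mid && j <= hi)          // repeatedly take the smaller head
        temp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid) temp[k++] = a[i++]; // copy whatever is left over
    while (j <= hi)  temp[k++] = a[j++];
    System.arraycopy(temp, 0, a, lo, temp.length);
}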
100. Quick Sort
• Divide:
• Pick any element p as the pivot, e.g., the first element
• Partition the remaining elements into
FirstPart, which contains all elements < p
SecondPart, which contains all elements ≥ p
• Recursively sort the FirstPart and SecondPart
• Combine: no work is necessary since sorting
is done in place
101. Quick Sort
[Figure: Partition rearranges the array A around the pivot p into FirstPart
(elements x < p), then p, then SecondPart (elements p ≤ x). Recursive calls
sort FirstPart and SecondPart in place, leaving the whole array sorted.]
102. Quick Sort
Quick-Sort(A, left, right)
   if left ≥ right return
   else
      middle ← Partition(A, left, right)
      Quick-Sort(A, left, middle - 1)
      Quick-Sort(A, middle + 1, right)
   end if
117. Partition(A, left, right)
   x ← A[left]
   i ← left
   for j ← left+1 to right
      if A[j] < x then
         i ← i + 1
         swap(A[i], A[j])
      end if
   end for
   swap(A[i], A[left])
   return i
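Combining the Quick-Sort and Partition pseudocode gives the following Java
sketch (pivot = first element, as above):

static void quickSort(int[] a, int left, int right) {
    if (left >= right) return;               // base case: 0 or 1 element
    int middle = partition(a, left, right);
    quickSort(a, left, middle - 1);
    quickSort(a, middle + 1, right);
}

static int partition(int[] a, int left, int right) {
    int x = a[left];                          // the pivot
    int i = left;
    for (int j = left + 1; j <= right; j++) {
        if (a[j] < x) {                       // a[j] belongs in FirstPart
            i = i + 1;
            int t = a[i]; a[i] = a[j]; a[j] = t;
        }
    }
    int t = a[i]; a[i] = a[left]; a[left] = t; // put the pivot between the parts
    return i;
}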
133. Quick-Sort: an Average Case
• Suppose the split is 1/10 : 9/10
[Figure: recursion tree for the 1/10 : 9/10 split. The root costs cn and
splits n into 0.1n and 0.9n; those split into 0.01n, 0.09n, 0.09n, 0.81n,
and so on. Every level costs at most cn, and the tree is between
\log_{10} n and \log_{10/9} n levels deep, so there are O(\log n) levels.]
Total time: O(n log n)
134. Quick-Sort Summary
• Time
– Most of the work is done in partitioning
– Average case takes O(n log n) time
– Worst case takes O(n²) time
• Space
– Sorts in place, i.e., does not require additional space
135. Shellsort
• We can look at the list as a set of interleaved sublists
• For example, the elements in the even locations could be
one list and the elements in the odd locations the other
list
• Shellsort begins by sorting many small lists, and increases
their size and decreases their number as it continues
136. Shellsort
• One technique is to use decreasing powers of 2, so
that if the list has 64 elements, the first pass would
use 32 lists of 2 elements, the second pass would use
16 lists of 4 elements, and so on
• These lists would be sorted with an insertion sort
137. Shell Sort: Idea
Donald Shell (1959): exchange items that are far apart!
Original: 40 2 1 43 3 65 0 -1 58 3 42 4
5-sort: sort the items that are 5 positions apart, i.e. the sublists
(40 65 42), (2 0 4), (1 -1), (43 58), (3 3)
139. Shellsort Algorithm
passes = ⌊lg N⌋
while (passes ≥ 1) do
   increment = 2^passes - 1
   for start = 1 to increment do
      InsertionSort(list, N, start, increment)
   end for
   passes = passes - 1
end while

Example, N = 15:
Pass 1: increment = 7, 7 calls, sublist size = 2
Pass 2: increment = 3, 3 calls, sublist size = 5
Pass 3: increment = 1, 1 call, sublist size = 15
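A Java sketch of this algorithm; gapInsertionSort is our own name for the
gap version of the earlier insertion sort, sorting only the sublist that
starts at start and steps by increment (0-based indices):

static void shellSort(int[] list) {
    int n = list.length;
    int passes = 31 - Integer.numberOfLeadingZeros(n);   // floor(lg N)
    while (passes >= 1) {
        int increment = (1 << passes) - 1;               // 2^passes - 1
        for (int start = 0; start < increment; start++)
            gapInsertionSort(list, n, start, increment);
        passes = passes - 1;
    }
}

static void gapInsertionSort(int[] list, int n, int start, int increment) {
    for (int i = start + increment; i < n; i += increment) {
        int newElement = list[i];
        int location = i - increment;
        // Shift larger sublist elements one gap to the right
        while (location >= 0 && list[location] > newElement) {
            list[location + increment] = list[location];
            location -= increment;
        }
        list[location + increment] = newElement;
    }
}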
140. Shell Sort: Gap Values
• Gap: the distance between items being
sorted.
• As we progress, the gap decreases. Shell
Sort is also called Diminishing Gap Sort.
• Shell proposed starting gap of N/2, halving
at each step.
• There are many ways of choosing the next
gap.
141. Shellsort Analysis
• The set of increments used has a major impact on the
efficiency of shellsort
• With a set of increments that are one less than powers of 2, as in the
algorithm given, the worst case has been shown to be O(N^{3/2})
• An order of O(N^{5/3}) can be achieved with just two passes, using an
increment of roughly 1.72 \cdot N^{1/3} for the first pass and 1 for the second
142. Shellsort Analysis
• An order of O(N^{3/2}) can be achieved with the set of increments less
than N that satisfy h_j = \frac{3^j - 1}{2}, i.e. h(j+1) = 3 h(j) + 1 with
h(1) = 1, giving h(1) = 1, h(2) = 4, h(3) = 13, ...
• Using all possible values of 2^i 3^j (in decreasing order) that are less
than N will produce an order of O(N (\lg N)^2)
143. Assignment 7
1. Sort a randomly generated array of size 20 using each of the sorting
algorithms above.