Here is a detailed analysis of the time complexity of quicksort.

In quicksort, we pick an element called the pivot in each step and rearrange the array so that all
elements less than the pivot appear to the left of the pivot and all elements larger than the pivot
appear to its right. In all subsequent iterations of the sorting algorithm, the position of this pivot
remains unchanged, because it has been put in its correct place. The total time taken to rearrange
the array as just described is always O(n), or αn where α is some constant (see footnote 1). Suppose
that the pivot we just chose has divided the array into two parts, one of size k and the other of
size n − k. Notice that both these parts still need to be sorted. This gives us the following relation:

T (n) = T (k) + T (n − k) + αn

where T (n) refers to the time taken by the algorithm to sort n elements.
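
To make the recurrence concrete, here is a small Python sketch of quicksort (the function names and
the choice of the last element as the pivot are my own, purely for illustration). The partition step
follows the temporary-array scheme described in footnote 1 and accounts for the αn term; the two
recursive calls account for T (k) and T (n − k). Once the pivot lands at position p, it is never moved
again, exactly as described above.

    def partition(arr, lo, hi):
        # Rearrange arr[lo..hi] around the pivot arr[hi] using a temporary array
        # (steps (1)-(4) of footnote 1). Returns the final index of the pivot.
        pivot = arr[hi]
        temp = [x for x in arr[lo:hi] if x < pivot]       # (1) elements less than the pivot
        pivot_index = lo + len(temp)
        temp.append(pivot)                                # (2) the pivot itself
        temp.extend(x for x in arr[lo:hi] if x >= pivot)  # (3) elements at least as large as the pivot
        arr[lo:hi + 1] = temp                             # (4) copy back, overwriting the original
        return pivot_index

    def quicksort(arr, lo=0, hi=None):
        if hi is None:
            hi = len(arr) - 1
        if lo < hi:                      # subarrays of size 0 or 1 are already sorted
            p = partition(arr, lo, hi)   # the alpha*n rearrangement work
            quicksort(arr, lo, p - 1)    # T(k): sort the left part
            quicksort(arr, p + 1, hi)    # T(n - k): sort the right part

    a = [5, 2, 9, 1, 7, 3]
    quicksort(a)
    print(a)   # [1, 2, 3, 5, 7, 9]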

WORST CASE ANALYSIS:

Now consider the case when the pivot happens to be the least element of the array, so that
k = 1 and n − k = n − 1. In this case, we have:

T (n) = T (1) + T (n − 1) + αn

Let us now analyse the time complexity of quicksort in this case in detail by solving the recurrence
as follows:

T (n) = T (n − 1) + T (1) + αn

= [T (n − 2) + T (1) + α(n − 1)] + T (1) + αn

(Note: I have written T (n − 1) = T (1) + T (n − 2) + α(n − 1) by just substituting n − 1 for n.
Note the implicit assumption that the pivot chosen at this step divided the subarray of
size n − 1 into two parts: one of size n − 2 and the other of size 1.)

= T (n − 2) + 2T (1) + α(n − 1 + n) (by simplifying and grouping terms together)

= [T (n − 3) + T (1) + α(n − 2)] + 2T (1) + α(n − 1 + n)

= T (n − 3) + 3T (1) + α(n − 2 + n − 1 + n)

= [T (n − 4) + T (1) + α(n − 3)] + 3T (1) + α(n − 2 + n − 1 + n)

Footnote 1: Here is how we can perform the partitioning: (1) traverse the original array once and copy all the
elements less than the pivot into a temporary array, (2) copy the pivot into the temporary array, (3) copy all
elements greater than the pivot into the temporary array, and (4) transfer the contents of the temporary array back
into the original array, overwriting it. This takes a total of about 4n operations. There also exist ways to perform
such a partitioning in place, i.e. without creating a temporary array. These in-place techniques speed up quicksort
by a constant factor, but they are not important for understanding the overall quicksort algorithm.


= T (n − 4) + 4T (1) + α(n − 3 + n − 2 + n − 1 + n)

= T (n − i) + iT (1) + α((n − i + 1) + ... + (n − 2) + (n − 1) + n) (Continuing likewise till the i-th
step.)

= T (n − i) + iT (1) + α Σ_{j=0}^{i−1} (n − j) (Look carefully at how the summation is written.)

Now clearly such a recurrence can only go on until i = n − 1 (Why? because otherwise n − i
would be less than 1). So, substitute i = n − 1 in the above equation, which gives us:

T (n) = T (1) + (n − 1)T (1) + α Σ_{j=0}^{n−2} (n − j)

= nT (1) + α( n(n − 1) − (n − 2)(n − 1)/2 ) (Notice that Σ_{j=0}^{n−2} n = n(n − 1), since the sum
has n − 1 terms, and Σ_{j=0}^{n−2} j = Σ_{j=1}^{n−2} j = (n − 2)(n − 1)/2 by a formula we derived
earlier in class.)

which is O(n^2).

This is the worst case of quicksort, which happens when the pivot we pick turns out to be the
least element of the subarray being sorted in every step (i.e. in every recursive call). A similar
situation also occurs if the pivot happens to be the largest element of the subarray in every step.
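
The quadratic behaviour is easy to observe experimentally. The sketch below (again my own, for
illustration only) counts the total partitioning work done by a quicksort that always picks the first
element as the pivot: on an already-sorted array the count grows like n(n − 1)/2, while on a
randomly shuffled array it stays on the order of n log n.

    import random

    def quicksort_cost(arr):
        # Returns (sorted list, total partitioning work) for a first-element-pivot quicksort.
        if len(arr) <= 1:
            return list(arr), 0
        pivot, rest = arr[0], arr[1:]
        left = [x for x in rest if x < pivot]
        right = [x for x in rest if x >= pivot]
        work = len(rest)                        # the alpha*n term at this level
        sorted_left, w_left = quicksort_cost(left)
        sorted_right, w_right = quicksort_cost(right)
        return sorted_left + [pivot] + sorted_right, work + w_left + w_right

    n = 500
    already_sorted = list(range(n))
    shuffled = list(range(n))
    random.shuffle(shuffled)
    print(quicksort_cost(already_sorted)[1])   # n*(n-1)/2 = 124750: quadratic growth
    print(quicksort_cost(shuffled)[1])         # typically a few thousand: roughly n*log(n)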

BEST CASE ANALYSIS:

The best case of quicksort occurs when the pivot we pick happens to divide the array into two
equal parts in every step. For simplicity, assume that n is a power of 2, so that we may take
k = n/2 and n − k = n/2 for the original array of size n.

Consider, therefore, the recurrence:

T (n) = 2T (n/2) + αn

= 2(2T (n/4) + αn/2) + αn

(Note: I have written T (n/2) = 2T (n/4) + αn/2 by just substituting n/2 for n in the equation
T (n) = 2T (n/2) + αn.)

= 2^2 T (n/4) + 2αn (By simplifying and grouping terms together.)

= 2^2 (2T (n/8) + αn/4) + 2αn

= 2^3 T (n/8) + 3αn

= 2^k T (n/2^k) + kαn (Continuing likewise till the k-th step.)

Notice that this recurrence can continue only until n = 2^k (otherwise we would have n/2^k < 1),
i.e. until k = log n (logarithm to base 2). Thus, by putting k = log n, we have the following equation:


T (n) = nT (1) + αn log n, which is O(n log n).

This is the best case for quicksort.
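
As a quick sanity check (my own, not part of the original derivation), the best-case recurrence can
be evaluated numerically; for n a power of two it matches the closed form nT (1) + αn log n exactly.

    import math

    def best_case_T(n, alpha=1.0):
        # Best-case recurrence T(n) = 2*T(n/2) + alpha*n, with T(1) = 1 and n a power of two.
        if n == 1:
            return 1.0
        return 2 * best_case_T(n // 2, alpha) + alpha * n

    for k in [4, 8, 12, 16]:
        n = 2 ** k
        closed_form = n * 1.0 + n * math.log2(n)   # n*T(1) + alpha*n*log2(n), with alpha = T(1) = 1
        print(n, best_case_T(n), closed_form)      # the two columns agree exactly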

It also turns out that in the average case (over all possible pivot configurations), quicksort has
a time complexity of O(n log n), the proof of which is beyond the scope of our class.

AVOIDING THE WORST CASE

Practical implementations of quicksort often pick the pivot randomly at each step. This makes it
very unlikely that the worst case ever occurs, and the method works extremely well in practice.
The other technique, which deterministically prevents the worst case from ever occurring, is to
find the median of the subarray to be sorted each time and use that as the pivot. The median can
(surprisingly!) be found in linear time (the details of which are, again, beyond the scope of our
course), but that approach carries a large constant-factor overhead, rendering it suboptimal for
practical implementations.
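
Here is a minimal sketch of the randomized-pivot idea in Python (names and structure are my own;
real library implementations are considerably more refined). The only change from the earlier
sketch is that the pivot is drawn uniformly at random from the current subarray. With a random
pivot, no single fixed input can force the quadratic behaviour, and the expected running time is
O(n log n) for every input.

    import random

    def quicksort_random(arr):
        # Randomized quicksort: returns a new sorted list.
        if len(arr) <= 1:
            return list(arr)
        pivot = arr[random.randrange(len(arr))]   # pick the pivot uniformly at random
        left = [x for x in arr if x < pivot]
        middle = [x for x in arr if x == pivot]   # all copies of the pivot are already in place
        right = [x for x in arr if x > pivot]
        return quicksort_random(left) + middle + quicksort_random(right)

    print(quicksort_random([5, 2, 9, 1, 7, 3, 2]))   # [1, 2, 2, 3, 5, 7, 9]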



