CSCE-221 Sorting, Prof. Lupoli (PowerPoint presentation)

Zhipei Yan, 10/3/2019. From Dr. Scott Schaefer and notes of Prof. Lupoli:
http://faculty.cs.tamu.edu/schaefer/teaching/221_Fall2018/Lectures/Sorting.ppt

Uploaded 2023-10-30.



Presentation Transcript

1. CSCE-221 Sorting
Prof. Lupoli
Zhipei Yan, 10/3/2019
From Dr. Scott Schaefer and notes of Prof. Lupoli:
http://faculty.cs.tamu.edu/schaefer/teaching/221_Fall2018/Lectures/Sorting.ppt

2. Outline
Bubble Sort
Insertion Sort
Merge Sort
Quick Sort
Heap Sort, Radix Sort, Bucket Sort
All sort examples assume ascending order.

3. Bubble Sort

4. Video & Short In-Lab Exercise
https://youtu.be/3CeuJQyFI7A
PLEASE ASK ANY QUESTIONS DURING THE VIDEO!!

5. Complexity of Bubble Sort
Two nested loops, each proportional to n: T(n) = O(n^2)
It is possible to speed up bubble sort by remembering the last index swapped, but this does not affect the worst-case complexity.

Algorithm bubbleSort(S, C)
  Input: sequence S, comparator C
  Output: sequence S sorted according to C
  for (i = 0; i < S.size(); i++)
    for (j = 1; j < S.size() - i; j++)
      if (S.atIndex(j - 1) > S.atIndex(j))
        S.swap(j - 1, j)
  return S
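The pseudocode above translates directly; below is a minimal Python sketch (my own rendering, not the slide's code) that also includes the last-index-swapped speed-up mentioned on the slide:

```python
def bubble_sort(s):
    """Bubble sort, ascending. Tracks the last swap position so each
    pass scans only the still-unsorted prefix; this helps on nearly
    sorted input but leaves the worst case at O(n^2)."""
    n = len(s)
    while n > 1:
        last_swap = 0
        for j in range(1, n):
            if s[j - 1] > s[j]:
                s[j - 1], s[j] = s[j], s[j - 1]
                last_swap = j
        n = last_swap  # everything from last_swap onward is already sorted
    return s
```

For example, `bubble_sort([5, 7, 2, 6, 9, 3])` returns `[2, 3, 5, 6, 7, 9]`; if a pass makes no swaps, `last_swap` stays 0 and the loop stops early.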

6. Insertion Sort Algorithm
Sorts from the smallest to the largest.
The left-most end of the array is sorted.
Look at the next element (the key) and place it in the correct ordered spot.
Elements must be moved to make space for the key.
Remember it as a "deck of cards": each card you draw is immediately put in order.
Again, n - 1 passes to complete.

7. Video & Short In-Lab Exercise
https://youtu.be/y0_LvwFAuT8

8. Insertion Sort
Source: http://www.cs.sfu.ca/CourseCentral/225/jmanuch/lec/6-2.ppt

9. (Insertion sort example figure; only numeric animation frames remain.)

10. Complexity of Insertion Sort
Two nested loops, dependent:
the outer loop runs n times
the inner loop moves elements to the right while they are > key
Time complexity: T(n) = O(n^2)

Algorithm Insertion(S, C)
  Input: sequence S, comparator C
  Output: sequence S sorted according to C
  for (i = 1; i < S.size(); i++) {
    key = S[i]
    for (j = i - 1; j >= 0 && S[j] > key; j--) {
      S[j + 1] = S[j]
    }
    S[j + 1] = key
  }
  return S
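A runnable Python sketch of the same algorithm (illustrative code, not the course's reference implementation):

```python
def insertion_sort(s):
    """Insertion sort, ascending. The prefix s[:i] is always sorted;
    each key is shifted left into its ordered spot."""
    for i in range(1, len(s)):
        key = s[i]
        j = i - 1
        while j >= 0 and s[j] > key:
            s[j + 1] = s[j]  # move larger elements one slot right
            j -= 1
        s[j + 1] = key
    return s
```

For example, `insertion_sort([5, 2, 4, 6, 1, 3])` returns `[1, 2, 3, 4, 5, 6]`.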

11. Merge Sort
(Example tree for the input 7 2 9 4, sorted to 2 4 7 9.)

12. Merge Sort
Merge sort is based on the divide-and-conquer paradigm. It consists of three steps:
Divide: partition the input sequence S into two sequences S1 and S2 of about n/2 elements each
Recur: recursively sort S1 and S2
Conquer: merge S1 and S2 into a single sorted sequence

Algorithm mergeSort(S, C)
  Input: sequence S, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1 {
    (S1, S2) := partition(S, S.size()/2)
    S1 := mergeSort(S1, C)
    S2 := mergeSort(S2, C)
    S := merge(S1, S2)
  }
  return S

13. D&C Algorithm Analysis with Recurrence Equations
Divide-and-conquer is a general algorithm design paradigm:
Divide: divide the input data S into k (disjoint) subsets S1, S2, …, Sk
Recur: solve the sub-problems associated with S1, S2, …, Sk
Conquer: combine the solutions for S1, S2, …, Sk into a solution for S
The base cases for the recursion are sub-problems of constant size.
Analysis can be done using recurrence equations (relations).
When all sub-problems have the same size (frequently the case), the recurrence equation representing the algorithm is:
T(n) = D(n) + k T(n/c) + C(n)
where D(n) is the cost of dividing S into the k sub-problems S1, S2, …, Sk; there are k sub-problems, each of size n/c, that are solved recursively; and C(n) is the cost of combining the sub-problem solutions into the solution for S.

14. Now, Back to Merge Sort…
The running time of merge sort can be expressed by the recurrence equation:
T(n) = 2T(n/2) + M(n)
We need to determine M(n), the time to merge two sorted sequences, each of size n/2.
(Pseudocode as on slide 12.)

15. Merging Two Sorted Sequences
The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B.
Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list, takes O(n) time: M(n) = O(n)

Algorithm merge(A, B)
  Input: sequences A and B with n/2 elements each
  Output: sorted sequence of A ∪ B
  S <- empty sequence
  while !A.isEmpty() && !B.isEmpty()
    if A.first() < B.first()
      S.insertLast(A.removeFirst())
    else
      S.insertLast(B.removeFirst())
  while !A.isEmpty()
    S.insertLast(A.removeFirst())
  while !B.isEmpty()
    S.insertLast(B.removeFirst())
  return S

16. And the Complexity of Merge Sort…
So, the running time of merge sort can be expressed by the recurrence equation:
T(n) = 2T(n/2) + M(n) = 2T(n/2) + O(n) = O(n log n)
(Pseudocode as on slide 12.)
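The divide/recur/conquer steps and the merge step can be sketched together in Python (illustrative code; this copies sublists rather than using the slides' linked-list sequences):

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list in O(n) time."""
    s = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            s.append(a[i])
            i += 1
        else:
            s.append(b[j])
            j += 1
    s.extend(a[i:])  # at most one of the two lists has leftovers
    s.extend(b[j:])
    return s

def merge_sort(s):
    """Divide, recur, conquer: T(n) = 2T(n/2) + O(n) = O(n log n)."""
    if len(s) <= 1:
        return s
    mid = len(s) // 2
    return merge(merge_sort(s[:mid]), merge_sort(s[mid:]))
```

For example, `merge_sort([7, 2, 9, 4])` returns `[2, 4, 7, 9]`, matching the execution tree on the next slides.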

17. Merge Sort Execution Tree (Recursive Calls)
An execution of merge-sort is depicted by a binary tree:
each node represents a recursive call of merge-sort and stores the unsorted sequence before the execution and its partition, and the sorted sequence at the end of the execution
the root is the initial call
the leaves are calls on subsequences of size 0 or 1
(Example tree for the input 7 2 9 4, sorted to 2 4 7 9.)

18.-27. Execution Example
These slides animate merge-sort on the input 7 2 9 4 3 8 6 1 (final result 1 2 3 4 6 7 8 9), one step per slide:
18. Partition
19. Recursive call, partition
20. Recursive call, partition
21. Recursive call, base case
22. Recursive call, base case
23. Merge
24. Recursive call, …, base case, merge
25. Merge
26. Recursive call, …, merge, merge
27. Merge

28. Analysis of Merge-Sort
The height h of the merge-sort tree is O(log n): at each recursive call we halve the sequence.
The work done at each level is O(n): at level i, we partition and merge 2^i sequences of size n/2^i.
Thus, the total running time of merge-sort is O(n log n).

depth | #seqs       | size           | cost per level
0     | 1           | n              | n
1     | 2           | n/2            | n
…     | …           | …              | …
i     | 2^i         | n/2^i          | n
…     | …           | …              | …
log n | 2^log n = n | n/2^log n = 1  | n
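Summing the per-level costs in the table gives the stated bound; as a formula:

```latex
T(n) \;=\; \sum_{i=0}^{\log n} \underbrace{2^i \cdot O\!\left(\tfrac{n}{2^i}\right)}_{\text{cost of level } i}
\;=\; \sum_{i=0}^{\log n} O(n)
\;=\; O(n)\cdot(\log n + 1)
\;=\; O(n \log n)
```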

29. Summary of Sorting Algorithms (So Far)

Algorithm      | Time                    | Notes
Selection Sort | O(n^2)                  | Slow, in-place; for small data sets
Insertion Sort | O(n^2) WC, AC; O(n) BC  | Slow, in-place; for small data sets
Bubble Sort    | O(n^2) WC, AC; O(n) BC  | Slow, in-place; for small data sets
Heap Sort      | O(n log n)              | Fast, in-place; for large data sets
Merge Sort     | O(n log n)              | Fast, sequential data access; for huge data sets

30. Quick-Sort
(Example tree for the input 7 4 9 6 2, sorted to 2 4 6 7 9.)

31. Quick Sort
Animation: https://upload.wikimedia.org/wikipedia/commons/6/6a/Sorting_quicksort_anim.gif

32. Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
Divide: pick a random element x (called the pivot) and partition S into
  L: elements less than x
  E: elements equal to x
  G: elements greater than x
Recur: sort L and G
Conquer: join L, E, and G

33. Analysis of Quick Sort Using Recurrence Relations
Assumption: a random pivot is expected to give equal-sized sublists.
The running time of quick sort can be expressed as:
T(n) = 2T(n/2) + P(n)
T(n): time to run quickSort() on an input of size n
P(n): time to run partition() on an input of size n

Algorithm QuickSort(S, l, r)
  Input: sequence S, ranks l and r
  Output: sequence S with the elements of rank between l and r rearranged in increasing order
  if l >= r return
  i <- a random integer between l and r
  x <- S.elemAtRank(i)
  (h, k) <- Partition(x)
  QuickSort(S, l, h - 1)
  QuickSort(S, k + 1, r)

34. Partition
We partition an input sequence as follows:
we remove, in turn, each element y from S, and
we insert y into L, E, or G, depending on the result of the comparison with the pivot x.
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time.
Thus, the partition step of quick-sort takes O(n) time.

Algorithm partition(S, p)
  Input: sequence S, position p of the pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively
  L, E, G <- empty sequences
  x <- S.remove(p)
  while !S.isEmpty()
    y <- S.remove(S.first())
    if y < x
      L.insertLast(y)
    else if y = x
      E.insertLast(y)
    else { y > x }
      G.insertLast(y)
  return L, E, G
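The L/E/G partition and the surrounding quick-sort can be sketched in Python (illustrative code: lists stand in for the slides' sequences, and `random.choice` supplies the random pivot):

```python
import random

def quick_sort(s):
    """Randomized quick-sort via a three-way partition:
    L (< pivot), E (== pivot), G (> pivot). Partitioning is O(n);
    only L and G are sorted recursively, then L, E, G are joined."""
    if len(s) <= 1:
        return s
    x = random.choice(s)  # random pivot
    L = [y for y in s if y < x]
    E = [y for y in s if y == x]
    G = [y for y in s if y > x]
    return quick_sort(L) + E + quick_sort(G)
```

Keeping E separate means duplicate pivots never recurse, which avoids quadratic behavior on inputs with many equal keys.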

35. So, the Expected Complexity of Quick Sort
Assumption: a random pivot is expected to give equal-sized sublists.
The running time of quick sort can be expressed as:
T(n) = 2T(n/2) + P(n) = 2T(n/2) + O(n) = O(n log n)
(Pseudocode as on slide 33.)

36. Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree:
each node represents a recursive call of quick-sort and stores the unsorted sequence before the execution and its pivot, and the sorted sequence at the end of the execution
the root is the initial call
the leaves are calls on subsequences of size 0 or 1
(Example tree for the input 7 4 9 6 2, sorted to 2 4 6 7 9.)

37. Worst-Case Running Time
The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element.
One of L and G then has size n - 1 and the other has size 0.
The running time is proportional to n + (n - 1) + … + 2 + 1 = O(n^2).
Alternatively, using recurrence equations: T(n) = T(n - 1) + O(n) = O(n^2)

depth | time
0     | n
1     | n - 1
…     | …
n - 1 | 1
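The arithmetic series above has a closed form, which is where the quadratic bound comes from:

```latex
n + (n-1) + \cdots + 2 + 1 \;=\; \sum_{k=1}^{n} k \;=\; \frac{n(n+1)}{2} \;=\; O(n^2)
```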

38. In-Place Quick-Sort
Quick-sort can be implemented to run in-place.
In the partition step, we use swap operations to rearrange the elements of the input sequence such that:
the elements less than the pivot have rank less than l
the elements greater than or equal to the pivot have rank greater than l
The recursive calls consider:
elements with rank less than l
elements with rank greater than l

Algorithm inPlaceQuickSort(S, a, b)
  Input: sequence S, ranks a and b
  if a >= b return
  p <- S[b]
  l <- a
  r <- b - 1
  while (l <= r)
    while (l <= r && S[l] <= p) l++
    while (r >= l && p <= S[r]) r--
    if (l < r) swap(S[l], S[r])
  swap(S[l], S[b])
  inPlaceQuickSort(S, a, l - 1)
  inPlaceQuickSort(S, l + 1, b)
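The pseudocode above ports to Python as follows (a sketch; like the slide, it pivots on the last element rather than a random one, so the worst case remains O(n^2)):

```python
def in_place_quick_sort(s, a=0, b=None):
    """In-place quick-sort using the slide's two-index partition.
    Sorts s[a..b] by swapping within the list; no extra sequences."""
    if b is None:
        b = len(s) - 1
    if a >= b:
        return
    p = s[b]          # pivot: last element of the range
    l, r = a, b - 1
    while l <= r:
        while l <= r and s[l] <= p:   # scan l right past elements <= pivot
            l += 1
        while r >= l and p <= s[r]:   # scan r left past elements >= pivot
            r -= 1
        if l < r:
            s[l], s[r] = s[r], s[l]
    s[l], s[b] = s[b], s[l]           # put the pivot in its final spot
    in_place_quick_sort(s, a, l - 1)
    in_place_quick_sort(s, l + 1, b)
```

Note the contrast with slide 34: no L, E, G sequences are allocated; the partition happens entirely through swaps inside the input list.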

39. In-Place Partitioning
Perform the partition using two indices to split S into L and E&G (a similar method can split E&G into E and G).
Repeat until l and r cross:
scan l to the right until finding an element > x
scan r to the left until finding an element < x
swap the elements at indices l and r
Example (pivot x = 6): 3 2 5 1 0 7 3 5 9 2 7 9 8 9 7 9 6

40. Summary of Sorting Algorithms (So Far)

Algorithm             | Time                              | Notes
Selection Sort        | O(n^2)                            | Slow, in-place; for small data sets
Insertion/Bubble Sort | O(n^2) WC, AC; O(n) BC            | Slow, in-place; for small data sets
Heap Sort             | O(n log n)                        | Fast, in-place; for large data sets
Quick Sort            | exp. O(n log n) AC, BC; O(n^2) WC | Fastest, randomized, in-place; for large data sets
Merge Sort            | O(n log n)                        | Fast, sequential data access; for huge data sets

41. Handout Exercise
Complete the survey in eCampus.
Questions?

42. References
http://faculty.cs.tamu.edu/schaefer/teaching/221_Fall2018/Lectures/Sorting.ppt
https://en.wikipedia.org/wiki/Insertion_sort

43.-65. Bubble Sort, Step by Step
These slides animate bubble sort on the input 5 7 2 6 9 3; the distinct states, in order, are:
5 7 2 6 9 3
5 2 7 6 9 3
5 2 6 7 9 3
5 2 6 7 3 9   (end of pass 1; 9 in place)
2 5 6 7 3 9
2 5 6 3 7 9   (end of pass 2; 7 in place)
2 5 3 6 7 9   (end of pass 3)
2 3 5 6 7 9   (sorted)