Data Structures & Algorithms


Spring 2018, Lecture 3: Recursion & Sorting




Presentation Transcript

Slide1

Data Structures & Algorithms

Lecture 3: Sorting

Slide2

Recap: Recursion

Slide3

Recursive Binary Search

Input: an increasing sequence of n numbers A = ‹a0, a1, …, an-1› and a value v
Output: an index i such that A[i] = v, or None if v is not in A

BinarySearch(A, v, x, y)
  if x < y
    h = ⌊(x + y)/2⌋
    if A[h] < v then return BinarySearch(A, v, h+1, y)
    else return BinarySearch(A, v, x, h)
  else if A[x] = v then return x
  else return None

Initial call: BinarySearch(A, v, 0, n-1)
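As a concrete illustration, here is a minimal runnable Python sketch of the pseudocode above; the function name and the example array at the bottom are added for illustration only.

```python
def binary_search(A, v, x, y):
    """Search for v in the sorted slice A[x..y] (inclusive); return an index or None."""
    if x < y:
        h = (x + y) // 2          # midpoint, rounded down
        if A[h] < v:
            return binary_search(A, v, h + 1, y)   # v can only be in the right half
        else:
            return binary_search(A, v, x, h)       # v, if present, is in A[x..h]
    elif A[x] == v:
        return x
    else:
        return None

A = [0, 1, 3, 4, 5, 6, 8, 9]
print(binary_search(A, 5, 0, len(A) - 1))  # -> 4
print(binary_search(A, 7, 0, len(A) - 1))  # -> None
```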

Slide4

Algorithms

A complete description of an algorithm consists of three parts:
1. the algorithm, expressed in whatever way is clearest and most concise
2. a proof of the algorithm's correctness
   (for a recursive algorithm: (strong) induction)
3. a derivation of the algorithm's running time
   (for a recursive algorithm: solve a recurrence)

Slide5

Example: Recursive Binary Search

Input: an increasing sequence of n numbers A = ‹a0, a1, …, an-1› and a value v
Output: an index i such that A[i] = v, or None if v is not in A

BinarySearch(A, v, x, y)
  if x < y
    h = ⌊(x + y)/2⌋
    if A[h] < v then return BinarySearch(A, v, h+1, y)
    else return BinarySearch(A, v, x, h)
  else if A[x] = v then return x
  else return None

Initial call: BinarySearch(A, v, 0, n-1)

Slide6

Analysis of BinarySearch

Let T(n) be the worst case running time of BinarySearch on an array of length n. We have

  T(n) = O(1)                if n = 1
  T(n) = T(⌈n/2⌉) + O(1)     if n > 1

Notes: the base case is frequently omitted, since it (nearly) always holds; T(⌈n/2⌉) is often written simply as T(n/2).
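The recurrence is not solved on the slide itself; a short expansion (a sketch, assuming n is a power of 2 so the ceilings can be ignored) shows why BinarySearch runs in O(log n) time:

```latex
\begin{align*}
T(n) &= T(n/2) + O(1) \\
     &= T(n/4) + O(1) + O(1) \\
     &\;\;\vdots \\
     &= T(1) + \underbrace{O(1) + \dots + O(1)}_{\log_2 n \text{ terms}} \\
     &= O(\log n)
\end{align*}
```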

Slide7

Solving recurrences (informal, more later)

Use one of the following methods:
- Substitution method: guess the solution and use induction to prove that your guess is correct.
- Recursion tree: carefully evaluate the recursion tree.
- Master theorem (caveat: not always applicable).

How to guess: expand the recursion, or draw a recursion tree.

Slide8

Quiz

a) n² - 3n - 18 = Θ(n²)
b) 4 n log₂ n - 20 n = O(n log n)
c) n^(4/3) = Ω(n log n)
d) 17 n log₂ n = O(n²)
e) 43 n² - 45 n log n = Θ(n²)
f) An algorithm with worst case running time O(n log n) is always slower than an algorithm with worst case running time O(n) if n is sufficiently large.

Answers: true, false, true, true, true, false.

Slide9

Sorting

Slide10

The sorting problem

Input: a sequence of n numbers ‹a0, a1, …, an-1›
Output: a permutation ‹ai1, ai2, …, ain› of the input such that ai1 ≤ ai2 ≤ … ≤ ain

The input is typically stored in arrays. Numbers ≈ keys; additional information (satellite data) may be stored with the keys. We will study several solutions ≈ algorithms for this problem.

Can you come up with a (fast) sorting algorithm? (3 min)

Example: input 8 1 6 4 0 3 9 5  ➨  output 0 1 3 4 5 6 8 9

Slide11

Selection Sort

Probably the simplest sorting algorithm …

Selection-Sort(A, n)
Input: an array A and the number n of elements in A to sort
Output: the elements of A sorted into non-decreasing order

  for i = 0 to n-2:
    set smallest to i
    for j = i + 1 to n-1:
      if A[j] < A[smallest], then set smallest to j
    swap A[i] with A[smallest]

Example trace on ‹12, 9, 3, 7, 14, 11›:
  12  9  3  7 14 11
   3  9 12  7 14 11
   3  7 12  9 14 11
   3  7  9 12 14 11
   3  7  9 11 14 12
   3  7  9 11 12 14

Slide12

Selection Sort

Selection-Sort(A, n)
Input: an array A and the number n of elements in A to sort
Output: the elements of A sorted into non-decreasing order

  for i = 0 to n-2:
    set smallest to i
    for j = i + 1 to n-1:
      if A[j] < A[smallest], then set smallest to j
    swap A[i] with A[smallest]

Correctness? Loop invariant proof ✔
Running time? O(n²)
0 to n-2:Set smallest to iFor j = i + 1 to n-1If A[j] < A[smallest], then set smallest to jSwap A[i] with A[smallest]Correctness? Running time?O(n2)Loop Invariant Proof ✔Slide13

Slide13

Insertion Sort

Like sorting a hand of playing cards:
- start with an empty left hand, cards on the table
- remove cards one by one, insert each into its correct position
- to find the position, compare to the cards in hand from right to left
- the cards in hand are always sorted

Insertion Sort is
- a good algorithm to sort a small number of elements
- an incremental algorithm

Incremental algorithms process the input elements one by one and maintain the solution for the elements processed so far.

Slide14

Incremental algorithms

Incremental algorithms process the input elements one by one and maintain the solution for the elements processed so far.

In pseudocode:

IncAlg(A)
// incremental algorithm which computes the solution of a problem with input A = {x0,…,xn-1}
  initialize: compute the solution for {x0}
  for j = 1 to n-1 do
    compute the solution for {x0,…,xj}
      using the (already computed) solution for {x0,…,xj-1}
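To make the template concrete, here is a small hypothetical example (not from the slides) written in the IncAlg style: it incrementally maintains the maximum of the elements processed so far.

```python
def incremental_max(A):
    """Incremental algorithm: after processing A[0..j], `best` is the solution
    (here: the maximum) for the elements processed so far."""
    best = A[0]                 # initialize: solution for {x0}
    for j in range(1, len(A)):  # for j = 1 to n-1
        best = max(best, A[j])  # solution for {x0,...,xj} from the previous solution
    return best

print(incremental_max([12, 9, 3, 7, 14, 11]))  # -> 14
```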

Slide15

Insertion Sort

Insertion-Sort(A)
// incremental algorithm that sorts array A[0:n] in non-decreasing order
  initialize: sort A[0]
  for j = 1 to len(A)-1 do
    sort A[0:j+1] using the fact that A[0:j] is already sorted

Note: A[i:j] denotes the array A restricted to the indices ≥ i and < j (not including j!).
A[:j] is short for A[0:j]; A[i:] is short for A[i:len(A)].

Slide16

Insertion Sort

Insertion-Sort(A)
// incremental algorithm that sorts array A[0:n] in non-decreasing order
  initialize: sort A[0]
  for j = 1 to len(A)-1 do
    key = A[j]
    i = j - 1
    while i >= 0 and A[i] > key do
      A[i+1] = A[i]
      i = i - 1
    A[i+1] = key

Example state (indices 0 … j … n-1): A = ‹1, 3, 14, 17, 28, 6, …›, where A[0:j] = ‹1, 3, 14, 17, 28› is already sorted and key = A[j] = 6.

Insertion Sort is an in place algorithm: the numbers are rearranged within the array with only constant extra space.

For a visualization of sorting algorithms see https://visualgo.net/en/sorting
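A runnable Python version of the pseudocode above (a sketch; the example call at the bottom is added for illustration):

```python
def insertion_sort(A):
    """Sort list A in place in non-decreasing order (incremental, in place, O(n^2))."""
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        # shift elements of the sorted prefix A[0:j] that are larger than key
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key            # insert key into its correct position
    return A

print(insertion_sort([8, 1, 6, 4, 0, 3, 9, 5]))  # -> [0, 1, 3, 4, 5, 6, 8, 9]
```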

Slide17

Correctness

Slide18

Correctness proof

Loop invariant: At the start of each iteration of the "outer" for loop (indexed by j), the subarray A[0:j] consists of the elements originally in A[0:j], but in sorted order.

Slide19

Correctness proof

Insertion-Sort(A)
  initialize: sort A[0]
  for j = 1 to len(A)-1 do
    key = A[j]
    i = j - 1
    while i >= 0 and A[i] > key do
      A[i+1] = A[i]
      i = i - 1
    A[i+1] = key

Loop invariant: At the start of each iteration of the "outer" for loop (indexed by j), the subarray A[0:j] consists of the elements originally in A[0:j], but in sorted order.

Initialization: Just before the first iteration, j = 1 ➨ A[0:j] = [A[0]], which is the element originally in A[0:1], and it is trivially sorted.

Slide20

Correctness proof

Insertion-Sort(A)
  initialize: sort A[0]
  for j = 1 to len(A)-1 do
    key = A[j]
    i = j - 1
    while i >= 0 and A[i] > key do
      A[i+1] = A[i]
      i = i - 1
    A[i+1] = key

Loop invariant: At the start of each iteration of the "outer" for loop (indexed by j), the subarray A[0:j] consists of the elements originally in A[0:j], but in sorted order.

Maintenance: Strictly speaking we would need to prove a loop invariant for the "inner" while loop. Instead, note that the body of the while loop moves A[j-1], A[j-2], A[j-3], and so on, one position to the right until the proper position of key (which has the value of A[j]) is found ➨ the invariant is maintained.

Slide21

Correctness proof

Insertion-Sort(A)
  initialize: sort A[0]
  for j = 1 to len(A)-1 do
    key = A[j]
    i = j - 1
    while i >= 0 and A[i] > key do
      A[i+1] = A[i]
      i = i - 1
    A[i+1] = key

Loop invariant: At the start of each iteration of the "outer" for loop (indexed by j), the subarray A[0:j] consists of the elements originally in A[0:j], but in sorted order.

Termination: The outer for loop ends when j = n. Plugging n in for j in the loop invariant: the subarray A[0:n] consists of the elements originally in A[0:n], in sorted order.
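The loop invariant can also be checked at run time. The following sketch (an illustration added here, not part of the slides) asserts the invariant at the start of every outer iteration and, at termination, with j = n:

```python
def insertion_sort_checked(A):
    """Insertion sort that asserts the loop invariant of the correctness proof."""
    original = list(A)
    n = len(A)
    for j in range(1, n):
        # invariant at the start of each iteration: A[0:j] holds the elements
        # originally in A[0:j], in sorted order
        assert A[:j] == sorted(original[:j])
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    # termination: the invariant with j = n says the whole array is sorted
    assert A == sorted(original)
    return A

print(insertion_sort_checked([8, 1, 6, 4, 0, 3, 9, 5]))  # -> [0, 1, 3, 4, 5, 6, 8, 9]
```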

Slide22

Another sorting algorithm

using a different paradigm …

Slide23

Merge Sort

A divide-and-conquer sorting algorithm.

Divide: divide the problem into a number of subproblems that are smaller instances of the same problem.
Conquer: conquer the subproblems by solving them recursively. If they are small enough, solve the subproblems as base cases.
Combine: combine the solutions to the subproblems into the solution for the original problem.

Slide24

Divide-and-conquer

D&CAlg(A)
// divide-and-conquer algorithm that computes the solution of a problem with input A = {x0,…,xn-1}
  if # elements of A is small enough (for example 1)
  then compute Sol (the solution for A) brute-force
  else
    split A in, for example, 2 non-empty subsets A1 and A2
    Sol1 = D&CAlg(A1)
    Sol2 = D&CAlg(A2)
    compute Sol (the solution for A) from Sol1 and Sol2
  return Sol

Slide25

Merge Sort

Merge-Sort(A)
// divide-and-conquer algorithm that sorts array A[0:n]
  if len(A) == 1
  then compute Sol (the solution for A) brute-force
  else
    split A in 2 non-empty subsets A1 and A2
    Sol1 = Merge-Sort(A1)
    Sol2 = Merge-Sort(A2)
    compute Sol (the solution for A) from Sol1 and Sol2

Slide26

Merge Sort

Merge-Sort(A)
// divide-and-conquer algorithm that sorts array A[0:n]
  if len(A) == 1
  then skip
  else
    n = len(A); n1 = ⌊n/2⌋; n2 = ⌈n/2⌉
    copy A[0:n1] to auxiliary array A1[0:n1]
    copy A[n1:n] to auxiliary array A2[0:n2]
    Merge-Sort(A1)
    Merge-Sort(A2)
    Merge(A, A1, A2)
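A runnable Python sketch of Merge-Sort together with the Merge step used on the next slides; the slices stand in for the auxiliary arrays A1 and A2, and the index bookkeeping inside merge is an implementation choice added here.

```python
def merge(A, A1, A2):
    """Merge the two sorted lists A1 and A2 back into A (len(A) == len(A1) + len(A2))."""
    i = j = 0
    for k in range(len(A)):
        # take the smaller front element; if one list is exhausted, take from the other
        if j >= len(A2) or (i < len(A1) and A1[i] <= A2[j]):
            A[k] = A1[i]; i += 1
        else:
            A[k] = A2[j]; j += 1

def merge_sort(A):
    """Sort list A in non-decreasing order (divide and conquer, O(n log n))."""
    if len(A) <= 1:
        return A
    n1 = len(A) // 2
    A1, A2 = A[:n1], A[n1:]   # copy to auxiliary arrays
    merge_sort(A1)
    merge_sort(A2)
    merge(A, A1, A2)
    return A

print(merge_sort([3, 14, 1, 28, 17, 8, 21, 7, 4, 35]))
# -> [1, 3, 4, 7, 8, 14, 17, 21, 28, 35]
```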

Slide27

Merge Sort

Example on ‹3, 14, 1, 28, 17, 8, 21, 7, 4, 35›: the array is split into the halves ‹3, 14, 1, 28, 17› and ‹8, 21, 7, 4, 35›, each half is sorted recursively (giving ‹1, 3, 14, 17, 28› and ‹4, 7, 8, 21, 35›), and the two sorted halves are merged into ‹1, 3, 4, 7, 8, 14, 17, 21, 28, 35›.

Slide28

Merge Sort

Merging: the two sorted auxiliary arrays A1 = ‹1, 3, 14, 17, 28› and A2 = ‹4, 7, 8, 21, 35› are merged, smallest front element first, into A = ‹1, 3, 4, 7, 8, 14, 17, 21, 28, 35›.

Correctness?

Slide29

Efficiency

Slide30

Analysis of Insertion Sort

Insertion-Sort(A)
  initialize: sort A[0]
  for j = 1 to len(A)-1 do
    key = A[j]
    i = j - 1
    while i >= 0 and A[i] > key do
      A[i+1] = A[i]
      i = i - 1
    A[i+1] = key

Get as tight a bound as possible on the worst case running time ➨ a lower and an upper bound for the worst case running time.
Upper bound: analyze the worst case number of elementary operations.
Lower bound: give a "bad" input example.

Slide31

Analysis of Insertion Sort

Insertion-Sort(A)
  initialize: sort A[0]                          O(1)
  for j = 1 to len(A)-1 do
    key = A[j]                                   O(1)
    i = j - 1                                    O(1)
    while i >= 0 and A[i] > key do               worst case: (j-1) ∙ O(1)
      A[i+1] = A[i]
      i = i - 1
    A[i+1] = key                                 O(1)

Upper bound: Let T(n) be the worst case running time of InsertionSort on an array of length n. We have
  T(n) = O(1) + Σ_{j=1}^{n-1} { O(1) + (j-1)∙O(1) + O(1) } = Σ_j O(j) = O(n²)

Lower bound: an array sorted in decreasing order ➨ Ω(n²)

The worst case running time of InsertionSort is Θ(n²).
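For completeness, here is the arithmetic behind the O(n²) and Ω(n²) bounds, written out as a short calculation (not spelled out on the slide; c bundles the hidden constants of the O(1) terms):

```latex
T(n) \le \sum_{j=1}^{n-1} c\,j = c\,\frac{(n-1)n}{2} = O(n^2),
\qquad
T(n) \ge \sum_{j=1}^{n-1} (j-1) = \frac{(n-1)(n-2)}{2} = \Omega(n^2)
\quad \text{(for input sorted in decreasing order).}
```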

Slide32

Analysis of Merge Sort

Merge-Sort(A)
// divide-and-conquer algorithm that sorts array A[0:n]
  if len(A) == 1 then skip                        O(1)
  else
    n = len(A); n1 = ⌊n/2⌋; n2 = ⌈n/2⌉             O(1)
    copy A[0:n1] to auxiliary array A1[0:n1]       O(n)
    copy A[n1:n] to auxiliary array A2[0:n2]       O(n)
    Merge-Sort(A1); Merge-Sort(A2)                 T(⌊n/2⌋) + T(⌈n/2⌉)
    Merge(A, A1, A2)                               O(n)

MergeSort is a recursive algorithm ➨ the running time analysis leads to a recurrence.

Slide33

Analysis of Merge Sort

Let T(n) be the worst case running time of MergeSort on an array of length n. We have

  T(n) = O(1)                              if n = 1
  T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + Θ(n)        if n > 1

(The base case is frequently omitted since it (nearly) always holds, and the floors and ceilings are usually dropped, so the recursive part is often written as 2T(n/2).)

➨ T(n) = 2T(n/2) + Θ(n)  ➨  T(n) = Θ(n log n)   (by the Master theorem)
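The last step uses the Master theorem, which is stated a few slides below; written out as a sketch, with a = b = 2 and f(n) = Θ(n):

```latex
n^{\log_b a} = n^{\log_2 2} = n = \Theta(f(n))
\;\Longrightarrow\; \text{case 2 applies} \;\Longrightarrow\;
T(n) = \Theta\bigl(n^{\log_2 2}\log n\bigr) = \Theta(n \log n).
```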

Slide34

Solving recurrences

Slide35

Solving recurrences

Use one of the following methods:
- Substitution method: guess the solution and use induction to prove that your guess is correct.
- Recursion tree: carefully evaluate the recursion tree.
- Master theorem (caveat: not always applicable).

How to guess: expand the recursion, or draw a recursion tree.

Slide36

The master theorem

Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence

  T(n) = a T(n/b) + f(n)    (n/b can be rounded up or down)

Then we have:
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) log n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

Slide37

The master theorem: Example

T(n) = 4 T(n/2) + Θ(n³)

Master theorem with a = 4, b = 2, and f(n) = n³.
log_b a = log₂ 4 = 2 ➨ n³ = f(n) = Ω(n^(log_b a + ε)) = Ω(n^(2 + ε)) with, for example, ε = 1.
Case 3 of the master theorem gives T(n) = Θ(n³), if the regularity condition holds:
  choose c = ½ and n₀ = 1 ➨ a f(n/b) = 4 (n/2)³ = n³/2 ≤ c f(n) for n ≥ n₀
➨ T(n) = Θ(n³)
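For driving functions of the form f(n) = Θ(n^d) the three cases reduce to comparing d with log_b a, which is easy to mechanize. The following helper is not from the slides; it is a sketch of that simplified, polynomial-only form of the theorem, in which the regularity condition of case 3 holds automatically:

```python
import math

def master_theorem(a, b, d):
    """Simplified master theorem for T(n) = a*T(n/b) + Theta(n^d),
    with a >= 1, b > 1, d >= 0. Returns the asymptotic solution as a string."""
    crit = math.log(a, b)                      # critical exponent log_b a
    if math.isclose(d, crit):
        return f"Theta(n^{d:g} log n)"         # case 2: all levels contribute equally
    elif d < crit:
        return f"Theta(n^{crit:g})"            # case 1: the leaves dominate
    else:
        # case 3: the root dominates; regularity holds since a*(n/b)^d = (a/b^d)*n^d
        # and a/b^d < 1 whenever d > log_b a
        return f"Theta(n^{d:g})"

print(master_theorem(2, 2, 1))   # merge sort, T(n) = 2T(n/2) + Theta(n):  Theta(n^1 log n)
print(master_theorem(4, 2, 3))   # the slide's example:                    Theta(n^3)
print(master_theorem(4, 2, 1))   # T(n) = 4T(n/2) + n:                     Theta(n^2)
```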

Slide38

Solving recurrences

Use one of the following methods:
- Substitution method: guess the solution and use induction to prove that your guess is correct.
- Recursion tree: carefully evaluate the recursion tree.
- Master theorem (caveat: not always applicable).

How to guess: expand the recursion, or draw a recursion tree.

Slide39

Recursion trees

T(n) = 2T(n/2) + n

The recursion tree has n at the root, two children of size n/2, four grandchildren of size n/4, …, 2^i nodes of size n/2^i at depth i, down to Θ(1) leaves.

Slide40

Recursion trees

T(n) = 2T(n/2) + n

Per level: the root contributes n, level 1 contributes 2 ∙ (n/2) = n, level 2 contributes 4 ∙ (n/4) = n, …, level i contributes 2^i ∙ (n/2^i) = n, and the leaf level contributes n ∙ Θ(1) = Θ(n). With log n levels, the total is Θ(n log n).

Slide41

Recursion trees

T(n) = 2T(n/2) + n²

The recursion tree has n² at the root, two children costing (n/2)², four grandchildren costing (n/4)², …, 2^i nodes costing (n/2^i)² at depth i, down to Θ(1) leaves.

Slide42

Recursion trees

T(n) = 2T(n/2) + n²

Per level: the root contributes n², level 1 contributes 2 ∙ (n/2)² = n²/2, level 2 contributes 4 ∙ (n/4)² = n²/4, …, level i contributes 2^i ∙ (n/2^i)² = n²/2^i, and the leaf level contributes n ∙ Θ(1) = Θ(n). The level sums form a decreasing geometric series, so the total is Θ(n²).
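The level sums add up as a geometric series; a short calculation added here for clarity:

```latex
\sum_{i=0}^{\log_2 n} \frac{n^2}{2^i}
\;<\; n^2 \sum_{i=0}^{\infty} \frac{1}{2^i}
\;=\; 2n^2
\;\Longrightarrow\; T(n) = \Theta(n^2).
```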

Slide43

Recursion trees

T(n) = 4T(n/2) + n

The recursion tree has n at the root, four children of size n/2, sixteen grandchildren of size n/4, …, 4^i nodes of size n/2^i at depth i, down to Θ(1) leaves.

Slide44

(1) … Θ(1)Slide44

Recursion-trees

T(n) = 4T(n/2) + n

n

n/2 n/2 n/2 n/2

… n/4 n/4 n/4 n/4 …

Θ

(1)

Θ

(1) … Θ(1)n4 ∙ (n/2) = 2n16 ∙ (n/4) = 4nn2 ∙ Θ(1) = Θ(n2)+Θ(n

2)Slide45

The master theorem, intuition

Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence T(n) = a T(n/b) + f(n). Then we have:

1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
   Recursion tree: the contribution per level increases, the bottom level dominates.
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) log n).
   Recursion tree: all levels contribute, T(n) = cost per level × number of levels.
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
   Recursion tree: the contribution per level decreases, the top level dominates.

Slide46

The substitution method

  T(n) = 2              if n = 1
  T(n) = 2T(⌊n/2⌋) + n  if n > 1

Claim: T(n) = O(n log n)
Proof: by induction on n.
To show: there are constants c and n₀ such that T(n) ≤ c n log n for all n ≥ n₀.

n = 1: T(1) = 2, but c ∙ 1 ∙ log 1 = 0, so n = 1 cannot serve as a base case ➨ take n₀ = 2.
Need more base cases? Since ⌊3/2⌋ = 1 and ⌊4/2⌋ = 2, the value n = 3 must also be a base case.

Base cases:
  n = 2: T(2) = 2T(1) + 2 = 2∙2 + 2 = 6 ≤ c ∙ 2 log 2 for c = 3
  n = 3: T(3) = 2T(1) + 3 = 2∙2 + 3 = 7 ≤ c ∙ 3 log 3

Slide47

The substitution method

  T(n) = 2              if n = 1
  T(n) = 2T(⌊n/2⌋) + n  if n > 1

Claim: T(n) = O(n log n)
Proof: by induction on n.
To show: there are constants c and n₀ such that T(n) ≤ c n log n for all n ≥ n₀. Choose c = 3 and n₀ = 2.

Inductive step: n > 3
  T(n) = 2T(⌊n/2⌋) + n
       ≤ 2 c ⌊n/2⌋ log ⌊n/2⌋ + n     (induction hypothesis)
       ≤ c n ((log n) − 1) + n
       ≤ c n log n                    ■

Slide48

The substitution method (How not to …)

  T(n) = Θ(1)           if n = 1
  T(n) = 2T(⌊n/2⌋) + n  if n > 1

Claim: T(n) = O(n)
Proof: by induction on n.
  Base case: n = n₀
    T(2) = 2T(1) + 2 = 2c + 2 = O(2)
  Inductive step: n > n₀
    T(n) = 2T(⌊n/2⌋) + n
         = 2 O(⌊n/2⌋) + n     (induction hypothesis)
         = O(n)                ■

Never use O, Θ, or Ω in a proof by induction!

Slide49

Tips

Analysis of recursive algorithms: find the recurrence and solve it with the master theorem if possible.
Analysis of loops: summations.

Some standard recurrences and sums:
  T(n) = 2T(n/2) + Θ(n)  ➨  T(n) = Θ(n log n)
  1 + 2 + … + n = ½ n(n+1) = Θ(n²)
  1² + 2² + … + n² = Θ(n³)

Slide50

Rate of growth

Insertion Sort: 15 n² + 7n - 2        Merge Sort: 300 n lg n + 50 n

                  n = 10     n = 100    n = 1000
  Insertion Sort  1568       150698     1.5 x 10^7
  Merge Sort      10466      204316     3.0 x 10^6

  n = 10:   Insertion Sort 6 x faster
  n = 100:  Insertion Sort 1.35 x faster
  n = 1000: Merge Sort 5 x faster

  n = 1,000,000: Insertion Sort 1.5 x 10^13, Merge Sort 6 x 10^9 ➨ Merge Sort 2500 x faster!

The rate of growth of the running time as a function of the input is essential!
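The numbers in the table are easy to reproduce; a small Python check using the two cost formulas given on the slide:

```python
import math

def insertion_cost(n):
    return 15 * n**2 + 7 * n - 2            # 15 n^2 + 7n - 2

def merge_cost(n):
    return 300 * n * math.log2(n) + 50 * n  # 300 n lg n + 50 n

for n in (10, 100, 1000, 10**6):
    print(f"n = {n:>7}: insertion ~ {insertion_cost(n):.3g}, merge ~ {merge_cost(n):.3g}")
# n = 10:      insertion ~ 1.57e+03, merge ~ 1.05e+04   (insertion sort wins)
# n = 1000:    insertion ~ 1.5e+07,  merge ~ 3.04e+06   (merge sort wins)
```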

Slide51

Sorting algorithms

We focus on running time.
Storage? All sorting algorithms discussed use O(n) storage; in place means only a constant amount of extra storage.
In place & O(n log n)? … later in the course

                  worst case running time   in place
  Selection Sort  Θ(n²)                     yes
  Insertion Sort  Θ(n²)                     yes
  Merge Sort      Θ(n log n)                no

Slide52

Recap and preview

Today:
- Recursion again
- Sorting algorithms: incremental algorithms, divide & conquer algorithms
- Solving recurrences: Master theorem, recursion trees, substitution method