Slide 1: Analysis of Algorithms
Asymptotic Notation
Instructor: S. N. Tazi, Assistant Professor, Deptt. of CSE, GEC Ajmer (satya.tazi@ecajmer.ac.in)
Slide 2: Asymptotic Complexity
Running time of an algorithm as a function of input size n, for large n.
Expressed using only the highest-order term in the expression for the exact running time. Instead of the exact running time, we say Θ(n²).
Describes the behavior of the function in the limit.
Written using Asymptotic Notation.
Slide 3: Asymptotic Notation
Θ, O, Ω, o, ω
Defined for functions over the natural numbers.
Ex: f(n) = Θ(n²). Describes how f(n) grows in comparison to n².
Each notation defines a set of functions; in practice, used to compare the sizes of two functions.
The notations describe different rate-of-growth relations between the defining function and the set of functions it defines.
Slide 4: Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that for all n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
g(n) is an asymptotically tight bound for f(n).
Intuitively: the set of all functions that have the same rate of growth as g(n).
Slide 5: Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that for all n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
Technically, f(n) ∈ Θ(g(n)). Older usage: f(n) = Θ(g(n)). I'll accept either…
f(n) and g(n) are nonnegative for large n.
Slide 6: Example
10n² − 3n = Θ(n²). What constants for n0, c1, and c2 will work?
Make c1 a little smaller than the leading coefficient, and c2 a little bigger.
To compare orders of growth, look at the leading term.
Exercise: Prove that n²/2 − 3n = Θ(n²).
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that for all n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
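The constant-picking rule above can be sanity-checked numerically. A minimal sketch (not a proof); the specific constants c1 = 9, c2 = 10, n0 = 3 are one valid choice among many:

```python
# Numeric sanity check (not a proof) that 10n^2 - 3n = Theta(n^2),
# using c1 = 9 (a little below the leading coefficient 10) and c2 = 10.
# The inequality c1*n^2 <= 10n^2 - 3n first holds at n0 = 3.

def f(n):
    return 10 * n * n - 3 * n

c1, c2, n0 = 9, 10, 3

ok = all(c1 * n * n <= f(n) <= c2 * n * n for n in range(n0, 10_000))
print(ok)  # True for every n checked
```

A real proof argues algebraically: 10n² − 3n − 9n² = n(n − 3) ≥ 0 for n ≥ 3, and 10n² − 3n ≤ 10n² for all n ≥ 1.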
Slide 7: Example
Is 3n³ ∈ Θ(n⁴)? How about 2^(2n) ∈ Θ(2ⁿ)?
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that for all n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
Slide 8: O-notation
For a function g(n), we define O(g(n)), big-O of g(n), as the set:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n) }
g(n) is an asymptotic upper bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)).
Θ(g(n)) ⊆ O(g(n)).
Slide 9: Examples
Any linear function an + b is in O(n²). How?
Show that 3n³ = O(n⁴) for appropriate c and n0.
O(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n) }
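Both claims can be checked numerically for a range of n. A sketch assuming particular (valid but arbitrary) constants:

```python
# Numeric sanity check (not a proof) of the two O-bound claims above.
# The constants are one valid choice, assumed here for illustration.

# 1) Any linear function an + b with a, b > 0 is in O(n^2):
#    an + b <= (a + b) * n^2 for all n >= 1. Take a = 5, b = 7 as an example.
a, b = 5, 7
assert all(a * n + b <= (a + b) * n * n for n in range(1, 10_000))

# 2) 3n^3 = O(n^4) with c = 3 and n0 = 1, since n^3 <= n^4 for n >= 1.
assert all(0 <= 3 * n**3 <= 3 * n**4 for n in range(1, 10_000))

print("both O-bounds hold for every n checked")
```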
Slide 10: Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n) }
g(n) is an asymptotic lower bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)).
Θ(g(n)) ⊆ Ω(g(n)).
Slide 11: Example
n = Ω(lg n). Choose c and n0.
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n) }
Slide 12: Relations Between Θ, O, Ω

Slide 13: Relations Between Θ, Ω, O
Theorem: For any two functions g(n) and f(n), f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.
Slide 14: Running Times
"Running time is O(f(n))" ⇒ worst case is O(f(n)).
An O(f(n)) bound on the worst-case running time gives an O(f(n)) bound on the running time of every input.
A Θ(f(n)) bound on the worst-case running time does not give a Θ(f(n)) bound on the running time of every input.
"Running time is Ω(f(n))" ⇒ best case is Ω(f(n)).
Can still say "worst-case running time is Ω(f(n))": it means the worst-case running time is given by some unspecified function g(n) ∈ Ω(f(n)).
Slide 15: Example
Insertion sort takes Θ(n²) in the worst case, so sorting (as a problem) is O(n²). Why?
Any sorting algorithm must look at each item, so sorting is Ω(n).
In fact, using (e.g.) merge sort, sorting is Θ(n lg n) in the worst case.
Later, we will prove that no comparison sort can hope to do better in the worst case.
Slide 16: Insertion Sort Algorithm (cont.)
INSERTION-SORT(A)
  for j = 2 to length[A]
      do key ← A[j]
         // insert A[j] into the sorted sequence A[1..j-1]
         i ← j - 1
         while i > 0 and A[i] > key
             do A[i+1] ← A[i]   // move A[i] one position right
                i ← i - 1
         A[i+1] ← key
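The pseudocode translates directly to runnable Python; a sketch using 0-based list indexing in place of A[1..n]:

```python
# Direct Python translation of INSERTION-SORT(A); sorts the list in place.
def insertion_sort(a):
    for j in range(1, len(a)):          # for j = 2 to length[A]
        key = a[j]                      # key <- A[j]
        i = j - 1                       # i <- j - 1
        while i >= 0 and a[i] > key:    # while i > 0 and A[i] > key
            a[i + 1] = a[i]             # move A[i] one position right
            i -= 1                      # i <- i - 1
        a[i + 1] = key                  # A[i+1] <- key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```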
Slide 17: Correctness of Insertion Sort Algorithm
Loop invariant: at the start of each iteration of the for loop, the subarray A[1..j-1] contains the original elements of A[1..j-1], but in sorted order.
Proof:
Initialization: j = 2, so A[1..j-1] = A[1..1] = A[1], which is trivially sorted.
Maintenance: each iteration maintains the loop invariant (the while loop shifts elements of A[1..j-1] right until the correct position for key is found, keeping the subarray sorted).
Termination: j = n + 1, so A[1..j-1] = A[1..n] is in sorted order.
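The loop invariant can be checked mechanically with assertions. An instrumented sketch of the same algorithm (illustrative only; a real proof is the argument above, not a test):

```python
# Insertion sort instrumented with the loop invariant from the proof:
# at the start of each for-iteration, a[0..j-1] holds the original first
# j elements in sorted order.
def insertion_sort_checked(a):
    original = list(a)
    for j in range(1, len(a)):
        # Initialization/Maintenance: invariant holds entering iteration j.
        assert a[:j] == sorted(original[:j])
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    # Termination: the whole array is the sorted original input.
    assert a == sorted(original)
    return a

print(insertion_sort_checked([31, 41, 59, 26, 41, 58]))  # [26, 31, 41, 41, 58, 59]
```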
Slide 18: Analysis of Insertion Sort
INSERTION-SORT(A)                                  cost   times
  for j = 2 to length[A]                            c1     n
      do key ← A[j]                                 c2     n-1
         // insert A[j] into sorted A[1..j-1]       0      n-1
         i ← j - 1                                  c4     n-1
         while i > 0 and A[i] > key                 c5     Σ_{j=2..n} t_j
             do A[i+1] ← A[i]  // move A[i] right   c6     Σ_{j=2..n} (t_j − 1)
                i ← i - 1                           c7     Σ_{j=2..n} (t_j − 1)
         A[i+1] ← key                               c8     n-1
(t_j is the number of times the while loop test in line 5 is executed for that value of j.)
The total time cost T(n) = sum of cost × times over each line
= c1·n + c2·(n−1) + c4·(n−1) + c5·Σ_{j=2..n} t_j + c6·Σ_{j=2..n} (t_j − 1) + c7·Σ_{j=2..n} (t_j − 1) + c8·(n−1)
Slide 19: Analysis of Insertion Sort (cont.)
Best case cost: already ordered numbers.
t_j = 1, so lines 6 and 7 are executed 0 times, and
T(n) = c1·n + c2·(n−1) + c4·(n−1) + c5·(n−1) + c8·(n−1)
     = (c1 + c2 + c4 + c5 + c8)·n − (c2 + c4 + c5 + c8) = c·n + c′ (linear in n).
Worst case cost: reverse ordered numbers.
t_j = j, so Σ_{j=2..n} t_j = Σ_{j=2..n} j = n(n+1)/2 − 1, and Σ_{j=2..n} (t_j − 1) = Σ_{j=2..n} (j − 1) = n(n−1)/2, and
T(n) = c1·n + c2·(n−1) + c4·(n−1) + c5·(n(n+1)/2 − 1) + c6·(n(n−1)/2) + c7·(n(n−1)/2) + c8·(n−1)
     = ((c5 + c6 + c7)/2)·n² + (c1 + c2 + c4 + c5/2 − c6/2 − c7/2 + c8)·n − (c2 + c4 + c5 + c8)
     = a·n² + b·n + c (quadratic in n).
Average case cost: random numbers.
On average, t_j = j/2, so T(n) is still in the order of n², the same as the worst case.
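The counts t_j can be verified empirically by instrumenting the while-loop test (the counter below is an assumed illustration, not part of the slides): a sorted input should give Σ t_j = n − 1, and a reverse-ordered input should give Σ t_j = n(n+1)/2 − 1.

```python
# Count how many times the while-loop test executes over a full run
# of insertion sort; each t_j equals (shifts for that j) + 1 final test.
def count_while_tests(a):
    tests = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        tests += 1                      # first test for this j
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            tests += 1                  # each shift implies another test
        a[i + 1] = key
    return tests

n = 100
best = count_while_tests(list(range(n)))          # already sorted: t_j = 1
worst = count_while_tests(list(range(n, 0, -1)))  # reverse order: t_j = j
print(best, worst)  # 99 5049, i.e. n-1 and n(n+1)/2 - 1
```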
Slide 20: Asymptotic Notation in Equations
Can use asymptotic notation in equations to replace expressions containing lower-order terms.
For example, 4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n) = 4n³ + Θ(n²) = Θ(n³). How to interpret?
In equations, Θ(f(n)) always stands for an anonymous function g(n) ∈ Θ(f(n)).
In the example above, Θ(n²) stands for 3n² + 2n + 1.
Slide 21: o-notation
For a given function g(n), we define the set little-o:
o(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that for all n ≥ n0, we have 0 ≤ f(n) < c·g(n) }
f(n) becomes insignificant relative to g(n) as n approaches infinity:
lim_{n→∞} [f(n) / g(n)] = 0
g(n) is an upper bound for f(n) that is not asymptotically tight.
Observe the difference in this definition from the previous ones. Why? (Here the inequality must hold for every c > 0, not just for some c.)
Slide 22: ω-notation
For a given function g(n), we define the set little-omega:
ω(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that for all n ≥ n0, we have 0 ≤ c·g(n) < f(n) }
f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
lim_{n→∞} [f(n) / g(n)] = ∞
g(n) is a lower bound for f(n) that is not asymptotically tight.
Slide 23: Comparison of Functions
The asymptotic relations between f and g are analogous to the relations between two numbers a and b:
f(n) = O(g(n))  ≈  a ≤ b
f(n) = Ω(g(n))  ≈  a ≥ b
f(n) = Θ(g(n))  ≈  a = b
f(n) = o(g(n))  ≈  a < b
f(n) = ω(g(n))  ≈  a > b
Slide 24: Limits
lim_{n→∞} [f(n) / g(n)] = 0        ⇒ f(n) ∈ o(g(n))
lim_{n→∞} [f(n) / g(n)] < ∞        ⇒ f(n) ∈ O(g(n))
0 < lim_{n→∞} [f(n) / g(n)] < ∞    ⇒ f(n) ∈ Θ(g(n))
0 < lim_{n→∞} [f(n) / g(n)]        ⇒ f(n) ∈ Ω(g(n))
lim_{n→∞} [f(n) / g(n)] = ∞        ⇒ f(n) ∈ ω(g(n))
lim_{n→∞} [f(n) / g(n)] undefined  ⇒ can't say
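These limit rules can be illustrated numerically by evaluating f(n)/g(n) at a large n. A heuristic sketch, not a proof; the example functions are assumed for illustration:

```python
# Evaluate f(n)/g(n) at a large n as a heuristic for the limit tests above.
def ratio(f, g, n=10**6):
    return f(n) / g(n)

# f(n) = n^2, g(n) = n^3: ratio -> 0, so n^2 is o(n^3) (hence also O(n^3)).
print(ratio(lambda n: n**2, lambda n: n**3))

# f(n) = 3n^2 + 5, g(n) = n^2: ratio -> 3, with 0 < 3 < inf, so Theta(n^2).
print(ratio(lambda n: 3 * n**2 + 5, lambda n: n**2))

# f(n) = n^3, g(n) = n^2: ratio -> infinity, so n^3 is omega(n^2).
print(ratio(lambda n: n**3, lambda n: n**2))
```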
Slide 25: Properties
Symmetry: f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).
Complementarity (transpose symmetry): f(n) = O(g(n)) iff g(n) = Ω(f(n)); f(n) = o(g(n)) iff g(n) = ω(f(n)).