Time Complexity of Algorithms (Asymptotic Notations)

Apr 12, 2017


Ali Mahmood
Transcript
Page 1: asymptotic notations i

Time Complexity of Algorithms

(Asymptotic Notations)

Page 2

• The level of difficulty in solving mathematically posed problems, as measured by:

– the time required (time complexity)
– the memory space required (space complexity)

What is Complexity?

Page 3

1. Correctness
An algorithm is said to be correct if, for every input, it halts with the correct output.
• An incorrect algorithm might not halt at all, OR it might halt with an answer other than the desired one.
• A correct algorithm solves the computational problem.
2. Efficiency
To measure the efficiency of an algorithm:
• Analyse it, i.e. determine its growth rate.
• Compare the efficiencies of different algorithms for the same problem.

Major Factors in Algorithm Design

Page 4

• Algorithm analysis means predicting the resources an algorithm requires, such as:
– computational time
– memory

• Worst-case analysis
– Provides an upper bound on running time
– An absolute guarantee

• Average-case analysis
– Provides the expected running time
– Very useful, but treat with care: what is “average”?
• Random (equally likely) inputs
• Real-life inputs

Complexity Analysis
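The worst-case versus average-case distinction can be made concrete with linear search. The sketch below is an illustrative example of my own, not taken from the slides: searching for an absent element forces all n comparisons (the worst-case guarantee), while random targets need about n/2 on average.

```python
import random

def linear_search(arr, target):
    """Return (index, comparisons): position of target in arr (or -1),
    plus the number of element comparisons performed."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

n = 1000
arr = list(range(n))

# Worst case: the target is absent, so all n elements are compared.
# This is the "absolute guarantee" / upper bound.
_, worst = linear_search(arr, -1)

# Average case: over random (equally likely) targets, roughly n/2
# comparisons are expected.
random.seed(0)
trials = [linear_search(arr, random.randrange(n))[1] for _ in range(2000)]
average = sum(trials) / len(trials)

print(worst)    # 1000
print(average)  # roughly 500
```

Note that the average-case figure depends on the assumed input distribution, which is exactly the "what is average?" caveat above.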

Page 5

Asymptotic Notation Properties
• Categorize algorithms based on asymptotic growth rate, e.g. linear, quadratic, exponential.
• Ignore small constants and small inputs.
• Estimate upper and lower bounds on the growth rate of the time-complexity function.
• Describe the running time of an algorithm as n grows to ∞.
Limitations
• Not always useful for analysis of fixed-size inputs.
• All results hold only for sufficiently large inputs.

Dr Nazir A. Zafar Advanced Algorithms Analysis and Design

Page 6

Asymptotic Notations: Θ, O, Ω, o, ω
We use Θ to mean “order exactly”, O to mean “order at most”, Ω to mean “order at least”, o to mean a strict (non-tight) upper bound, and ω to mean a strict (non-tight) lower bound.

Each notation defines a set of functions, which in practice is used to compare the growth of two functions.

Asymptotic Notations

Page 7

For a given function g(n), O(g(n)) denotes the set of functions

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.

f(n) ∈ O(g(n)) means the function g(n) is an asymptotic upper bound for f(n).

Big-Oh Notation (O)

Intuitively: Set of all functions whose rate of growth is the same as or lower than that of g(n).

We may write f(n) = O(g(n)) OR f(n) ∈ O(g(n))

If f, g : N → R+, then we can define Big-Oh as

Page 8

g(n) is an asymptotic upper bound for f(n).

Big-Oh Notation

∃ c > 0, n0 ≥ 0 such that ∀ n ≥ n0, 0 ≤ f(n) ≤ c·g(n)

f(n) ∈ O(g(n))

Page 9

Example 1: Prove that 2n² ∈ O(n³)
Proof:

Assume that f(n) = 2n² and g(n) = n³. Is f(n) ∈ O(g(n))?
We have to find constants c and n0 such that f(n) ≤ c·g(n) ∀ n ≥ n0:

2n² ≤ c·n³ ⇒ 2 ≤ c·n.
If we take c = 1 and n0 = 2, OR c = 2 and n0 = 1, then 2n² ≤ c·n³ ∀ n ≥ n0.
Hence f(n) ∈ O(g(n)) with c = 1 and n0 = 2.

Examples
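The witnesses from the proof can be checked numerically. This is a quick sanity check of my own, not part of the slides:

```python
# Numeric sanity check: verify 2n^2 <= c*n^3 with the witnesses
# c = 1, n0 = 2 chosen in the proof above.
c, n0 = 1, 2

def holds(n):
    """Does 2n^2 <= c*n^3 hold at this n?"""
    return 2 * n**2 <= c * n**3

# The inequality holds from n0 = 2 onward...
assert all(holds(n) for n in range(n0, 10_000))
# ...but fails at n = 1 (2 > 1), which is why n0 = 2 is needed when c = 1.
assert not holds(1)
print("2n^2 in O(n^3), witnessed by c = 1, n0 = 2")
```

Trying other pairs (e.g. c = 2, n0 = 1, the alternative from the proof) works the same way.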

Page 10

Example 2: Prove that n² ∈ O(n²)
Proof:

Assume that f(n) = n² and g(n) = n². We have to show that f(n) ∈ O(g(n)).

Since f(n) ≤ c·g(n) ⇒ n² ≤ c·n² ⇒ 1 ≤ c, take c = 1 and n0 = 1.

Then n² ≤ c·n² for c = 1 and all n ≥ 1. Hence n² ∈ O(n²), where c = 1 and n0 = 1.

Examples

Page 11

Example 3: Prove that 1000n² + 1000n ∈ O(n²)
Proof: Assume that f(n) = 1000n² + 1000n and g(n) = n².
We have to find c and n0 such that 0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n0.
Try c = 1001: 1000n² + 1000n ≤ 1001n²

⇔ 1000n ≤ n² ⇔ n² − 1000n ≥ 0 ⇔ n(n − 1000) ≥ 0, which is true for n ≥ 1000.
So f(n) ≤ c·g(n) ∀ n ≥ n0 with c = 1001 and n0 = 1000.

Hence f(n) ∈ O(g(n)) for c = 1001 and n0 = 1000.

Examples
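The crossover at n0 = 1000 is sharp, and that can be verified directly. A numeric check of my own, not from the slides:

```python
# Numeric check: for f(n) = 1000n^2 + 1000n and g(n) = n^2,
# the witness c = 1001 works exactly from n0 = 1000 onward.
c = 1001
f = lambda n: 1000 * n**2 + 1000 * n
g = lambda n: n**2

# The bound fails just below n0 = 1000...
assert f(999) > c * g(999)
# ...and holds (with equality at n = 1000) for all n >= 1000.
assert all(f(n) <= c * g(n) for n in range(1000, 5000))
print("f(n) in O(n^2), witnessed by c = 1001, n0 = 1000")
```

A larger c would shrink n0; the pair (c, n0) is never unique, any valid pair proves membership.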

Page 12

Example 4: Prove that n³ ∉ O(n²)
Proof:

On the contrary, assume that there exist positive constants c and n0 such that 0 ≤ n³ ≤ c·n² ∀ n ≥ n0. Then n³ ≤ c·n² ⇒ n ≤ c. Since c is a fixed constant while n can grow arbitrarily large, n ≤ c cannot hold for all n ≥ n0. Hence our supposition is wrong: n³ ≤ c·n² ∀ n ≥ n0 is not true for any combination of c and n0, and therefore n³ ∉ O(n²).

Examples

Page 13

For a given function g(n), Ω(g(n)) denotes the set of functions

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.

f(n) ∈ Ω(g(n)) means the function g(n) is an asymptotic lower bound for f(n).

Big-Omega Notation (Ω)

Intuitively: Set of all functions whose rate of growth is the same as or higher than that of g(n).

We may write f(n) = Ω(g(n)) OR f(n) ∈ Ω(g(n))

If f, g : N → R+, then we can define Big-Omega as

Page 14

Big-Omega Notation

g(n) is an asymptotic lower bound for f(n).
∃ c > 0, n0 ≥ 0 such that ∀ n ≥ n0, 0 ≤ c·g(n) ≤ f(n)

f(n) ∈ Ω(g(n))

Page 15

Example 1: Prove that 5n² ∈ Ω(n)
Proof: Assume that f(n) = 5n² and g(n) = n.

Is f(n) ∈ Ω(g(n))? We have to find the existence of c and n0 s.t.

c·g(n) ≤ f(n) ∀ n ≥ n0

c·n ≤ 5n² ⇒ c ≤ 5n.
If we take c = 5 and n0 = 1, then

c·n ≤ 5n² ∀ n ≥ n0

And hence f(n) ∈ Ω(g(n)), for c = 5 and n0 = 1.

Examples

Page 16

Example 2: Prove that 5n + 10 ∈ Ω(n)
Proof: Assume that f(n) = 5n + 10 and g(n) = n.

Is f(n) ∈ Ω(g(n))? We have to find the existence of c and n0 s.t.

c·g(n) ≤ f(n) ∀ n ≥ n0 ⇒ c·n ≤ 5n + 10.
If we take c = 5 and n0 = 1, then 5n ≤ 5n + 10 ∀ n ≥ 1.
(Note that c = 15 would not work: 15n > 5n + 10 for all n ≥ 2.)

And hence f(n) ∈ Ω(g(n)), for c = 5 and n0 = 1.

Examples
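Choosing the constant carefully matters for Ω just as it does for O. A numeric check of candidate constants, my own addition rather than slide material:

```python
# Numeric check: for f(n) = 5n + 10 and g(n) = n, the choice of the
# constant c matters when showing c*g(n) <= f(n) for all n >= n0.
f = lambda n: 5 * n + 10
g = lambda n: n

# c = 5 is a valid witness: 5n <= 5n + 10 for every n >= 1.
assert all(5 * g(n) <= f(n) for n in range(1, 10_000))

# c = 15 is NOT a valid witness: it holds at n = 1 but fails from n = 2 on.
assert 15 * g(1) <= f(1)          # 15 <= 15
assert not 15 * g(2) <= f(2)      # 30 > 20
print("5n + 10 in Omega(n), witnessed by c = 5, n0 = 1")
```

A single failing n beyond any candidate n0 is enough to reject a constant, which is why spot-checking only small n can mislead.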

Page 17

Example 3: Prove that 100n + 5 ∉ Ω(n²)
Proof:

Let f(n) = 100n + 5 and g(n) = n².
Assume, on the contrary, that f(n) ∈ Ω(g(n)). Then there exist c and n0 s.t.
c·g(n) ≤ f(n) ∀ n ≥ n0 ⇒ c·n² ≤ 100n + 5 ⇒ c·n ≤ 100 + 5/n ⇒ n ≤ (100 + 5/n)/c ≤ 105/c for n ≥ 1,
which cannot hold for arbitrarily large n.

And hence f(n) ∉ Ω(g(n)).

Examples

Page 18

Theta Notation (Θ)

For a given function g(n), Θ(g(n)) denotes the set of functions

Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.

f(n) ∈ Θ(g(n)) means the function f(n) is equal to g(n) to within a constant factor, i.e. g(n) is an asymptotically tight bound for f(n).

Intuitively: Set of all functions that have same rate of growth as g(n).

We may write f(n) = Θ(g(n)) OR f(n) ∈ Θ(g(n))

If f, g : N → R+, then we can define Big-Theta as

Page 19

We say that g(n) is an asymptotically tight bound for f(n).

f(n) ∈ Θ(g(n))

∃ c1 > 0, c2 > 0, n0 ≥ 0 such that ∀ n ≥ n0, c1·g(n) ≤ f(n) ≤ c2·g(n)

Theta Notation

Page 20

Example 1: Prove that ½n² − ½n = Θ(n²)
Proof:

Assume that f(n) = ½n² − ½n and g(n) = n². Is f(n) ∈ Θ(g(n))?
We have to find the existence of c1, c2 and n0 s.t. c1·g(n) ≤ f(n) ≤ c2·g(n) ∀ n ≥ n0.

Upper bound: ½n² − ½n ≤ ½n² ∀ n ≥ 0, so take c2 = ½.

Lower bound: for n ≥ 2 we have ½n ≤ ¼n², so ½n² − ½n ≥ ½n² − ¼n² = ¼n², so take c1 = ¼.

Hence ¼n² ≤ ½n² − ½n ≤ ½n² ∀ n ≥ 2, i.e. c1·g(n) ≤ f(n) ≤ c2·g(n) with c1 = ¼, c2 = ½, n0 = 2.
Hence f(n) ∈ Θ(g(n)), i.e. ½n² − ½n = Θ(n²).

Theta Notation
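The two-sided sandwich can also be verified numerically. This check is my own addition, using exact rational arithmetic so the fractions ¼ and ½ introduce no rounding error:

```python
from fractions import Fraction

# Numeric check: verify (1/4)n^2 <= (1/2)n^2 - (1/2)n <= (1/2)n^2
# for all n >= 2, using exact rational arithmetic.
c1, c2 = Fraction(1, 4), Fraction(1, 2)
f = lambda n: Fraction(n**2, 2) - Fraction(n, 2)
g = lambda n: n**2

assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(2, 5000))
# n0 = 2 is necessary: at n = 1, f(1) = 0 < (1/4)*g(1), so the lower bound fails.
assert not c1 * g(1) <= f(1)
print("f(n) = Theta(n^2), witnessed by c1 = 1/4, c2 = 1/2, n0 = 2")
```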

Page 21

Example 2: Prove that 2n² + 3n + 6 ∉ Θ(n³)
Proof: Let f(n) = 2n² + 3n + 6, and g(n) = n³.

We have to show that f(n) ∉ Θ(g(n)).
On the contrary, assume f(n) ∈ Θ(g(n)), i.e. there exist positive constants c1, c2 and n0 such that
c1·g(n) ≤ f(n) ≤ c2·g(n) ∀ n ≥ n0
⇒ c1·n³ ≤ 2n² + 3n + 6 ≤ c2·n³
⇒ c1·n ≤ 2 + 3/n + 6/n² ≤ c2·n (dividing by n²).
The left inequality gives n ≤ (2 + 3/n + 6/n²)/c1 ≤ 11/c1 for n ≥ 1, which cannot hold for arbitrarily large n.
Hence f(n) ∉ Θ(g(n)), i.e. 2n² + 3n + 6 ≠ Θ(n³).

Theta Notation

Page 22

Usefulness of Notations
• It is not always possible to determine the behaviour of an algorithm using Θ-notation alone.
• For example, given a problem with n inputs, we may have an algorithm that takes a·n² time when n is even and c·n time when n is odd. OR
• We may prove that an algorithm never uses more than e·n² time and never less than f·n time.
• In either case we can claim neither Θ(n) nor Θ(n²) to be the order of the time usage of the algorithm.
• Big-O and Ω notation will allow us to give at least partial information: such an algorithm is O(n²) and Ω(n).
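The even/odd example can be simulated to see why only partial (O and Ω) information is available. The function below is a hypothetical algorithm I wrote to match the slide's description; it is not from the slides:

```python
def odd_even_work(n):
    """Hypothetical algorithm matching the description above: it performs
    quadratic work when n is even and linear work when n is odd.
    Returns the number of basic steps executed."""
    steps = 0
    if n % 2 == 0:
        for _ in range(n):          # a*n^2 behaviour (here a = 1)
            for _ in range(n):
                steps += 1
    else:
        for _ in range(n):          # c*n behaviour (here c = 1)
            steps += 1
    return steps

# The step count oscillates between n and n^2, so neither Theta(n) nor
# Theta(n^2) describes it, but O(n^2) and Omega(n) both hold for all n >= 1.
for n in range(1, 60):
    assert n <= odd_even_work(n) <= n**2
print("only partial information: O(n^2) upper bound, Omega(n) lower bound")
```

Because the running time keeps returning to both extremes, no single Θ class contains it; the O and Ω bounds are the strongest honest statements.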