Page 1: Discrete Structures CISC 2315 Growth of Functions.

Discrete Structures CISC 2315

Growth of Functions

Page 2: Discrete Structures CISC 2315 Growth of Functions.

Introduction

• Once an algorithm for a problem has been given and shown to be correct, it is important to determine how many resources, such as time or space, the algorithm will require.
  – We focus mainly on time in this course.
  – Such an analysis will often allow us to improve our algorithms.

Page 3: Discrete Structures CISC 2315 Growth of Functions.

Running Time Analysis

[Figure: the same N data items are fed to Algorithm 1, which returns in time T1(N), and to Algorithm 2, which returns in time T2(N).]

Page 4: Discrete Structures CISC 2315 Growth of Functions.

Analysis of Running Time (which algorithm is better?)

[Figure: plot of running time T(N) versus number of input items N, comparing the growth curves of Algorithm 1 and Algorithm 2 for N ≥ n0.]

Page 5: Discrete Structures CISC 2315 Growth of Functions.

Comparing Algorithms

• We need a theoretical framework upon which we can compare algorithms.

• The idea is to establish a relative order among different algorithms, in terms of their relative rates of growth. The rates of growth are expressed as functions, which are generally in terms of the number/size of inputs N.

• We also want to ignore details, e.g., saying the rate of growth is N^2 rather than 3N^2 − 5N + 2.

NOTE: The text uses the variable x. We use N here.

Page 6: Discrete Structures CISC 2315 Growth of Functions.

Big-O Notation:

• Definition: T(N) = O(f(N)) if there are positive constants c and n0 such that T(N) ≤ c f(N) when N ≥ n0.

• This says that function T(N) grows at a rate no faster than f(N); thus c f(N) is an upper bound on T(N).

NOTE: We use |T(N)| < c|f(N)| if the functions could be negative. In this class, we will assume they are positive unless stated otherwise.
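As an illustration, here is a minimal Python sketch that spot-checks the definition for given witnesses c and n0 (the helper name bounded_above and the range checked are assumptions made for illustration; a finite check is of course not a proof):

def bounded_above(T, f, c, n0, n_max=10_000):
    # Spot-check T(N) <= c*f(N) for every integer N with n0 <= N <= n_max.
    return all(T(N) <= c * f(N) for N in range(n0, n_max + 1))

# The example coming up on the slides: 5N^3 + 2N^2 is O(N^3) with c = 7, n0 = 1.
print(bounded_above(lambda N: 5*N**3 + 2*N**2, lambda N: N**3, c=7, n0=1))  # True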

Page 7: Discrete Structures CISC 2315 Growth of Functions.

Big-O Upper Bound

[Figure: plot of running time T(N) versus number of input items N; for N ≥ n0, T(N) stays below the upper-bound curve c f(N).]

Page 8: Discrete Structures CISC 2315 Growth of Functions.

Big-O Example

• Prove that 5N^3 + 2N^2 is O(N^3).
  – Since 5N^3 + 2N^2 ≤ 5N^3 + 2N^3 = 7N^3 for N ≥ 1,
  – then 5N^3 + 2N^2 is O(N^3) with c = 7 and n0 = 1.

• We could also prove that 5N^3 + 2N^2 is O(N^4), but the first upper bound is tighter (lower).

• Note that 5N^3 + 2N^2 is not O(N^2). Why? How do you show something is not O(something)?

Page 9: Discrete Structures CISC 2315 Growth of Functions.

Another Big-O Example

• Prove that 2N^2 + N + 1 is O(N^2).
  – Since 2N^2 + N + 1 ≤ 2N^2 + N^2 + N^2 = 4N^2 for N ≥ 1,
  – then 2N^2 + N + 1 is O(N^2) with c = 4 and n0 = 1.

• We could also prove that 2N^2 + N + 1 is O(N^3), but the first bound is tighter.

• Note that 2N^2 + N + 1 is not O(N). Why?

Page 10: Discrete Structures CISC 2315 Growth of Functions.

Why Big-O?

• Comparing the time complexity of algorithms gets very complicated when you keep all the details. Big-O gets rid of the details and focuses on the most important part of the comparison.

• Note that Big-O is a worst-case analysis.

Page 11: Discrete Structures CISC 2315 Growth of Functions.

Same-Order Functions

• Let f(N) = N^2 + 2N + 1
• Let g(N) = N^2

• g(N) = O(f(N)) because N^2 < N^2 + 2N + 1.
• f(N) = O(g(N)) because:
  – N^2 + 2N + 1 ≤ N^2 + 2N^2 + N^2 = 4N^2, which is O(N^2) with c = 4 and n0 = 1.

• In this case we say that f(N) and g(N) are of the same order.

Page 12: Discrete Structures CISC 2315 Growth of Functions.

Simplifying the Big-O

• Theorem 1 (in text):
  – Let f(N) = a_n N^n + a_{n-1} N^(n-1) + … + a_1 N + a_0.
  – Then f(N) is O(N^n).
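A sketch of the standard justification, written in LaTeX for reference (the choice of witnesses c and n0 below is the usual one; it is a reconstruction of the textbook argument, not a quotation from the slides):

% For N >= 1, bound each term of f(N) = a_n N^n + ... + a_1 N + a_0:
|f(N)| \le |a_n| N^n + |a_{n-1}| N^{n-1} + \cdots + |a_1| N + |a_0|
       \le \bigl( |a_n| + |a_{n-1}| + \cdots + |a_1| + |a_0| \bigr)\, N^n
% so f(N) is O(N^n) with c = |a_n| + ... + |a_0| and n_0 = 1.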

Page 13: Discrete Structures CISC 2315 Growth of Functions.

Another Big-O Example

• Recall that N! = N*(N-1)*…*3*2*1 when N is a positive integer, and 0! = 1.

• Then N! = N*(N-1)*…*3*2*1
          ≤ N*N*N* … *N
          = N^N

• Therefore, N! = O(N^N). But this is not all we know…
• Taking the log of both sides, log N! ≤ log N^N = N log N.
  Therefore log N! is O(N log N). (c = 1, n0 = 2)

Recall that we assume logs are base 2.
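A small Python check of the relationship log N! ≤ N log N (illustrative only; the sample values of N are arbitrary choices):

import math

for N in (2, 10, 100, 1000):
    log2_factorial = sum(math.log2(i) for i in range(2, N + 1))  # log2(N!)
    print(N, round(log2_factorial, 1), round(N * math.log2(N), 1))
# In every row the first value (log2 N!) is at most the second (N log2 N).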

Page 14: Discrete Structures CISC 2315 Growth of Functions.

Another Big-O Example

• In Section 3.2 it will be shown that N < 2^N.
• Taking the log of both sides, log N < N.
• Therefore log N = O(N). (c = 1 and n0 = 1)

Page 15: Discrete Structures CISC 2315 Growth of Functions.

Complexity terminology

• O(1)            Constant complexity
• O(log N)        Logarithmic complexity
• O(N)            Linear complexity
• O(N log N)      N log N complexity
• O(N^b)          Polynomial complexity
• O(b^N), b > 1   Exponential complexity
• O(N!)           Factorial complexity

Page 16: Discrete Structures CISC 2315 Growth of Functions.

Growth of Combinations of Functions

• Sum Rule: Suppose f1(N) is O(g1(N)) and f2(N) is O(g2(N)). Then (f1 + f2)(N) is O(max(g1(N), g2(N))).

• Product Rule: Suppose that f1(N) is O(g1(N)) and f2(N) is O(g2(N)). Then (f1 f2)(N) is O(g1(N) g2(N)).

Page 17: Discrete Structures CISC 2315 Growth of Functions.

Growth of Combinations of Functions

[Diagram: Subprocedure 1 runs in time O(g1(N)); Subprocedure 2 runs in time O(g2(N)).]

* What is the big-O time complexity of running the two subprocedures sequentially?

Theorem 2: If f1(N) is O(g1(N)) and f2(N) is O(g2(N)), then (f1 + f2)(N) = O(max(g1(N), g2(N))).

Page 18: Discrete Structures CISC 2315 Growth of Functions.

Growth of Combinations of Functions

[Diagram: Subprocedure 1 runs in time O(g1(N)); Subprocedure 2 runs in time O(g2(M)).]

* What is the big-O time complexity of running the two subprocedures sequentially?

Theorem 2: If f1(N) is O(g1(N)) and f2(M) is O(g2(M)), then f1(N) + f2(M) = O(g1(N) + g2(M)).

What about M not equal to N?

Page 19: Discrete Structures CISC 2315 Growth of Functions.

Growth of Combinations of Functions

[Diagram: a loop with body complexity O(g2(N)) nested inside a loop with complexity O(g1(N)).]

* What is the big-O time complexity for nested loops?

Theorem 3: If f1(N) is O(g1(N)) and f2(N) is O(g2(N)), then (f1 * f2)(N) = O(g1(N) * g2(N)).

for j := 1 to N do
    for i := 1 to 2N do
        a_ij := 1

N steps * 2N steps = 2N^2 steps
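The same nested loop written as runnable Python (a sketch; the array a and the step counter are only there to make the 2N^2 count visible):

N = 5                         # any positive input size
steps = 0
a = [[0] * (2 * N) for _ in range(N)]
for j in range(N):            # outer loop: N iterations
    for i in range(2 * N):    # inner loop: 2N iterations per outer iteration
        a[j][i] = 1
        steps += 1
print(steps, 2 * N * N)       # both are 50: N * 2N = 2N^2 assignments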

Page 20: Discrete Structures CISC 2315 Growth of Functions.

Growth of Combinations of Functions: Example 1

• Give a big-O estimate for f(N) = 3N log(N!) + (N^2 + 3) log N, where N is a positive integer.

• Solution:
  – First estimate 3N log(N!). Since we know log(N!) is O(N log N), and 3N is O(N), from the Product Rule we conclude 3N log(N!) = O(N^2 log N).
  – Next estimate (N^2 + 3) log N. Since N^2 + 3 < 2N^2 when N > 2, N^2 + 3 is O(N^2). From the Product Rule, we have (N^2 + 3) log N = O(N^2 log N).
  – Using the Sum Rule to combine these estimates, f(N) = 3N log(N!) + (N^2 + 3) log N is O(N^2 log N).
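A numerical sanity check of this estimate (a sketch only; it shows that the ratio f(N) / (N^2 log N) stays bounded for a few sample values of N, which is consistent with, but does not prove, the O(N^2 log N) bound):

import math

def f(N):
    log2_factorial = math.lgamma(N + 1) / math.log(2)        # log2(N!)
    return 3 * N * log2_factorial + (N**2 + 3) * math.log2(N)

for N in (10, 100, 1000, 10_000):
    print(N, f(N) / (N**2 * math.log2(N)))                    # the ratio stays bounded as N grows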

Page 21: Discrete Structures CISC 2315 Growth of Functions.

Growth of Combinations of Functions: Example 2

• Give a big-O estimate for f(N) = (N + 1) log(N^2 + 1) + 3N^2.

• Solution:
  – First estimate (N + 1) log(N^2 + 1). We know (N + 1) is O(N). Also, N^2 + 1 < 2N^2 when N > 1. Therefore:
    • log(N^2 + 1) < log(2N^2) = log 2 + log N^2 = log 2 + 2 log N < 3 log N, if N > 2. We conclude log(N^2 + 1) = O(log N).
  – From the Product Rule, we have (N + 1) log(N^2 + 1) = O(N log N). Also, we know 3N^2 = O(N^2).
  – Using the Sum Rule to combine these estimates, f(N) = O(max(N log N, N^2)). Since N log N < N^2 for N > 1, f(N) = O(N^2).

Page 22: Discrete Structures CISC 2315 Growth of Functions.

Big-Omega Notation

• Definition: T(N) = Ω(f(N)) if there are positive constants c and n0 such that T(N) ≥ c f(N) when N ≥ n0.

• This says that function T(N) grows at a rate no slower than f(N); thus c f(N) is a lower bound on T(N).

Page 23: Discrete Structures CISC 2315 Growth of Functions.

Big-Omega Lower Bound

[Figure: plot of running time T(N) versus number of input items N; for N ≥ n0, T(N) stays above the lower-bound curve c f(N).]

Page 24: Discrete Structures CISC 2315 Growth of Functions.

Big-Omega Example

• Prove that 2N^2 + 3N is Ω(N^2).
  – Since 2N^2 + 3N ≥ N^2 for N ≥ 1,
  – then 2N^2 + 3N is Ω(N^2) with c = 1 and n0 = 1.

• We could also prove that 2N^2 + 3N is Ω(N), but the first lower bound is tighter (higher).

• Note that 2N^2 + 3N is not Ω(N^3).

Page 25: Discrete Structures CISC 2315 Growth of Functions.

Another Big-Omega Example

• Prove that Σ_{i=1}^{N} i is Ω(N^2).
  – Since Σ_{i=1}^{N} i = N(N+1)/2 ≥ N^2/2 for N ≥ 1,
  – then Σ_{i=1}^{N} i is Ω(N^2) with c = 1/2 and n0 = 1.

• We could also prove that Σ_{i=1}^{N} i is Ω(N), but the first bound is tighter.

• Note that Σ_{i=1}^{N} i is not Ω(N^3).

Page 26: Discrete Structures CISC 2315 Growth of Functions.

Big-Theta Notation

• Definition: T(N) = Θ(f(N)) if there are positive constants c, d, and n0 such that c f(N) ≤ T(N) ≤ d f(N) when N ≥ n0.
  – This says that function T(N) grows at the same rate as f(N).

• Put another way: T(N) = Θ(f(N)) if and only if T(N) = O(f(N)) and T(N) = Ω(f(N)).

We say that T(N) is of order f(N).
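For reference, the three definitions side by side, restated in LaTeX (same content as the slides above; no new conditions added):

\begin{aligned}
T(N) = O(f(N))      &\iff \exists\, c, n_0 > 0 \text{ such that } T(N) \le c\, f(N) \text{ for all } N \ge n_0 \\
T(N) = \Omega(f(N)) &\iff \exists\, c, n_0 > 0 \text{ such that } T(N) \ge c\, f(N) \text{ for all } N \ge n_0 \\
T(N) = \Theta(f(N)) &\iff T(N) = O(f(N)) \text{ and } T(N) = \Omega(f(N))
\end{aligned}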

Page 27: Discrete Structures CISC 2315 Growth of Functions.

Big-Theta Example

• Show that 3N^2 + 8N log N is Θ(N^2).
  – 0 < 8N log N < 8N^2, and so 3N^2 + 8N log N < 11N^2 for N > 1.
  – Therefore 3N^2 + 8N log N is O(N^2).
  – Clearly 3N^2 + 8N log N is Ω(N^2).
  – We conclude that 3N^2 + 8N log N is Θ(N^2).
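A quick numerical spot check of the two-sided bound in Python (the constants 3 and 11 come from the slide; checking finitely many N is illustrative, not a proof):

import math

def T(N):
    return 3 * N**2 + 8 * N * math.log2(N)    # logs are base 2, as the slides assume

# The lower bound 3*N^2 and upper bound 11*N^2 hold for every N checked.
assert all(3 * N**2 <= T(N) <= 11 * N**2 for N in range(1, 10_001))
print("3*N^2 <= T(N) <= 11*N^2 for 1 <= N <= 10000")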

Page 28: Discrete Structures CISC 2315 Growth of Functions.

A Hierarchy of Growth Rates

c < log N < log^2 N < log^3 N < N < N log N < N^2 < N^3 < N^k < 2^N < N! < N^N

If f(N) is O(x) for x one of the above, then it is O(y) for any y > x in the above ordering. But the higher bounds are not tight. We prefer tighter bounds.
Note that if the hierarchy states that x < y, then obviously for any expression z > 0, z*x < z*y.
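To see the ordering concretely, the Python sketch below prints log2 of each function in the hierarchy at a sample size (the values N = 1024, k = 4, and c = 1 are arbitrary illustrative choices, not part of the slides):

import math

N, k, c = 1024, 4, 1          # sample input size and sample constants
functions = [
    ("c",        math.log2(c) if c > 1 else 0.0),
    ("log N",    math.log2(math.log2(N))),
    ("log^2 N",  math.log2(math.log2(N) ** 2)),
    ("log^3 N",  math.log2(math.log2(N) ** 3)),
    ("N",        math.log2(N)),
    ("N log N",  math.log2(N * math.log2(N))),
    ("N^2",      2 * math.log2(N)),
    ("N^3",      3 * math.log2(N)),
    ("N^k",      k * math.log2(N)),
    ("2^N",      float(N)),
    ("N!",       math.lgamma(N + 1) / math.log(2)),   # log2(N!)
    ("N^N",      N * math.log2(N)),
]
for name, log2_value in functions:                     # values increase down the list
    print(f"{name:8s} log2 = {log2_value:10.1f}")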

Page 29: Discrete Structures CISC 2315 Growth of Functions.

Complexity of Algorithms vs Problems

• We have been talking about the polynomial, linear, logarithmic, or exponential time complexity of algorithms.

• But we can also talk about the time complexity of problems. Example decision problems:
  – Let a, b, and c be positive integers. Is there a positive integer x < c such that x^2 ≡ a (mod b)?
  – Does there exist a truth assignment for all variables that satisfies a given logical expression?

Page 30: Discrete Structures CISC 2315 Growth of Functions.

Complexity of Problems

• A problem that can be solved by an algorithm with deterministic polynomial (or better) worst-case time complexity is called tractable. Problems that are not tractable are intractable.

• P is the set of all problems solvable in polynomial time (tractable problems).

• NP is the set of all problems whose solutions can be checked in polynomial time; for many of them, no deterministic polynomial-time algorithm for finding a solution is known.

P = NP ?

Page 31: Discrete Structures CISC 2315 Growth of Functions.

Complexity of Problems (cont’d)

• Unsolvable problem: A problem that cannot be solved by any algorithm. Example:
  – Halting Problem

[Diagram: a box is given a program and an input and must decide: will the program halt on that input?]

If we try running the program on the input and it keeps running, how do we know whether it will ever stop?