Algorithm Analysis: Running Time, Big O and Omega (Ω)


Jan 17, 2018



Brett Kelley

Transcript
Page 1

Algorithm Analysis: Running Time, Big O and Omega (Ω)

Page 2

Introduction

• An algorithm is a step-by-step procedure for accomplishing a task; a program is an implementation of an algorithm

• In order to learn about an algorithm, we need to analyze it

• This means we need to study the specification of the algorithm and draw conclusions about how the implementation of that algorithm (the program) will perform in general

Page 3

• The issues that should be considered in analyzing an algorithm are:

• The running time of a program as a function of its inputs

• The total or maximum memory space needed for program data

• The total size of the program code

• Whether the program correctly computes the desired result

• The complexity of the program. For example, how easy it is to read, understand, and modify the program

• The robustness of the program. For example, how well does it deal with unexpected or erroneous inputs

Page 4

• In this course, we consider the running time of the algorithm.

• The main factors that affect the running time are the algorithm itself, the input data, and the computer system used

• The performance of a computer is determined by:
• The hardware
• The programming language used, and
• The operating system

• To calculate the running time of a general C++ program, we first need to define a few rules.

• In our rules, we assume that the running time of our C++ program is independent of the particular hardware and software system used

Page 5

Rule 1:
• The time required to fetch an integer from memory is a constant, t(fetch)
• The time required to store an integer in memory is also a constant, t(store)

For example, the running time of x = y is:

t(fetch) + t(store)

because we need to fetch y from memory and store it into x.

Similarly, the running time of x = 1 is also t(fetch) + t(store), because typically any constant is stored in memory before it is fetched.
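Rule 1 can be made concrete with a small cost-counting sketch. The wrapper functions below are an assumption (they do not appear in the slides): each charges one unit for the corresponding constant, so x = y costs exactly one t(fetch) plus one t(store).

```cpp
#include <cassert>

// Hypothetical instrumentation of the Rule 1 cost model (an assumption,
// not from the slides): each wrapper charges one unit per operation.
static long fetches = 0;  // accumulated t(fetch) charges
static long stores  = 0;  // accumulated t(store) charges

int fetchInt(const int& v) { ++fetches; return v; }    // models t(fetch)
void storeInt(int& dst, int v) { ++stores; dst = v; }  // models t(store)
```

With these wrappers, x = y becomes storeInt(x, fetchInt(y)), and x = 1 becomes storeInt(x, fetchInt(1)), matching the rule that a constant is stored in memory before it is fetched.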

Page 6

Rule 2:
• The times required to perform elementary operations on integers, such as addition t(+), subtraction t(-), multiplication t(*), division t(/), and comparison t(cmp), are all constants.

For example, the running time of y = x + 1 is:

2t(fetch) + t(store) + t(+)

because you need to fetch x and 1 (2t(fetch)), add them together (t(+)), and place the result into y (t(store)).

Page 7

Rule 3:
• The time required to call a function is a constant, t(call)
• The time required to return from a function is a constant, t(return)

Rule 4:
• The time required to pass an integer argument to a function or procedure is the same as the time required to store an integer in memory, t(store)

Page 8

• For example, the running time of y = f(x) is:

t(fetch) + 2t(store) + t(call) + t(f(x))

because you need:
• to fetch the value of x: t(fetch)
• to pass x to the function and store it into the parameter: t(store)
• to call the function f(x): t(call)
• to run the body of the function: t(f(x))
• to store the returned result into y: t(store)

Page 9

Rule 5:
• The time required for the address calculation implied by an array subscripting operation like a[i] is a constant, t([ ]).
• This time does not include the time to compute the subscript expression, nor does it include the time to access (fetch or store) the array element.

For example, the running time of y = a[i] is:

3t(fetch) + t([ ]) + t(store)

because you need:
• to fetch the value of i: t(fetch)
• to fetch the value of a: t(fetch)
• to find the address of a[i]: t([ ])
• to fetch the value of a[i]: t(fetch)
• to store the value of a[i] into y: t(store)

Page 10

Rule 6:
• The time required to allocate a fixed amount of storage from the heap using operator new is a constant, t(new).
• This time does not include any time required for initialization of the storage (calling a constructor).
• Similarly, the time required to return a fixed amount of storage to the heap using operator delete is a constant, t(delete).
• This time does not include any time spent cleaning up the storage before it is returned to the heap (calling a destructor).

For example, the running time of

int* ptr = new int;

is:

t(new) + t(store)

because you need:
• to allocate memory: t(new)
• to store its address into ptr: t(store)

Similarly, the running time of

delete ptr;

is:

t(fetch) + t(delete)

because you need:
• to fetch the address from ptr: t(fetch)
• to delete the specified location: t(delete)

Page 11

1. int Sum (int n)
2. {
3.   int result = 0;
4.   for (int i = 1; i <= n; ++i)
5.     result += i;
6.   return result;
7. }

Statement   Code            Time
3           result = 0      t(fetch) + t(store)
4a          i = 1           t(fetch) + t(store)
4b          i <= n          (2t(fetch) + t(cmp)) * (n+1)
4c          ++i             (2t(fetch) + t(+) + t(store)) * n
5           result += i     (2t(fetch) + t(+) + t(store)) * n
6           return result   t(fetch) + t(return)

Total: [6t(fetch) + 2t(store) + t(cmp) + 2t(+)] * n + [5t(fetch) + 2t(store) + t(cmp) + t(return)]
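The charges in the table can be checked empirically. Below is a hypothetical instrumented version of Sum (an assumption, not part of the slides) that charges one unit per constant operation; with every t(·) set to 1, the Total row works out to 11n + 9.

```cpp
#include <cassert>

// Instrumented sketch of Sum: `cost` charges one unit per modeled
// constant-time operation, following the table row by row.
static long cost = 0;

int Sum(int n) {
    int result = 0;     cost += 2;  // line 3: t(fetch) + t(store)
    int i = 1;          cost += 2;  // 4a: t(fetch) + t(store)
    while (true) {
        cost += 3;                  // 4b: 2t(fetch) + t(cmp), n+1 times
        if (!(i <= n)) break;
        result += i;    cost += 4;  // line 5: 2t(fetch) + t(+) + t(store)
        ++i;            cost += 4;  // 4c: 2t(fetch) + t(+) + t(store)
    }
    cost += 2;                      // line 6: t(fetch) + t(return)
    return result;
}
```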

Page 12

1. int func (int a[ ], int n, int x)
2. {
3.   int result = a[n];
4.   for (int i = n-1; i >= 0; --i)
5.     result = result * x + a[i];
6.   return result;
7. }

Statement   Code                         Time
3           result = a[n]                3t(fetch) + t([ ]) + t(store)
4a          i = n-1                      2t(fetch) + t(-) + t(store)
4b          i >= 0                       (2t(fetch) + t(cmp)) * (n+1)
4c          --i                          (2t(fetch) + t(-) + t(store)) * n
5           result = result * x + a[i]   (5t(fetch) + t([ ]) + t(+) + t(*) + t(store)) * n
6           return result                t(fetch) + t(return)

Total: [9t(fetch) + 2t(store) + t(cmp) + t([ ]) + t(+) + t(*) + t(-)] * n + [8t(fetch) + 2t(store) + t([ ]) + t(-) + t(cmp) + t(return)]
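The function above evaluates the polynomial a[0] + a[1]·x + … + a[n]·xⁿ by Horner's rule. As a check on the Total row, here is a hypothetical instrumented version (an assumption, not from the slides) charging one unit per constant, which gives 16n + 14 in total.

```cpp
#include <cassert>

// Instrumented sketch of func (Horner's rule): `cost` follows the
// per-statement charges in the table, one unit per constant.
static long cost = 0;

int func(const int a[], int n, int x) {
    int result = a[n];    cost += 5;  // line 3: 3t(fetch) + t([]) + t(store)
    int i = n - 1;        cost += 4;  // 4a: 2t(fetch) + t(-) + t(store)
    while (true) {
        cost += 3;                    // 4b: 2t(fetch) + t(cmp), n+1 times
        if (!(i >= 0)) break;
        result = result * x + a[i];
        cost += 9;                    // line 5: 5t(fetch)+t([])+t(+)+t(*)+t(store)
        --i;              cost += 4;  // 4c: 2t(fetch) + t(-) + t(store)
    }
    cost += 2;                        // line 6: t(fetch) + t(return)
    return result;
}
```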

Page 13

• Using constant times such as t(fetch), t(store), t(delete), t(new), t(+), etc., makes our running-time analysis accurate

• However, in order to make life simple, we can approximate the running time of every constant by the same unit time, t(1)

• For example, the running time of y = x + 1 then becomes 4, because it consists of two fetches, one addition, and one store, each costing one constant unit

• For a loop there are two cases:
• If the number of iterations is a known, fixed constant, the running time of the loop is also a constant, t(1)
• If the number of iterations depends on the input, the running time becomes t(n), where n is the number of iterations

Page 14

1. int Geometric (int x, int n)
2. {
3.   int sum = 0;
4.   for (int i = 0; i <= n; ++i)
5.   {
6.     int prod = 1;
7.     for (int j = 0; j < i; ++j)
8.       prod = prod * x;
9.     sum = sum + prod;
10.  }
11.  return sum;
12. }

Statement   Code               Time
3           sum = 0            2
4a          i = 0              2
4b          i <= n             3(n+2)
4c          ++i                4(n+1)
6           prod = 1           2(n+1)
7a          j = 0              2(n+1)
7b          j < i              Σ_{i=0..n} 3(i+1)
7c          ++j                Σ_{i=0..n} 4i
8           prod = prod * x    Σ_{i=0..n} 4i
9           sum = sum + prod   4(n+1)
11          return sum         2

Total: (11/2)n² + (47/2)n + 27
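Geometric computes the geometric series 1 + x + x² + … + xⁿ. The instrumented sketch below (the instrumentation is an assumption, not from the slides) charges unit costs exactly as in the table, so the grand total should match (11/2)n² + (47/2)n + 27.

```cpp
#include <cassert>

// Instrumented sketch of Geometric: `cost` charges unit constants per
// statement, following the table above.
static long cost = 0;

int Geometric(int x, int n) {
    int sum = 0;              cost += 2;  // line 3
    int i = 0;                cost += 2;  // 4a
    while (true) {
        cost += 3;                        // 4b: i <= n, tested n+2 times
        if (!(i <= n)) break;
        int prod = 1;         cost += 2;  // line 6
        int j = 0;            cost += 2;  // 7a
        while (true) {
            cost += 3;                    // 7b: j < i, tested i+1 times
            if (!(j < i)) break;
            prod = prod * x;  cost += 4;  // line 8
            ++j;              cost += 4;  // 7c
        }
        sum = sum + prod;     cost += 4;  // line 9
        ++i;                  cost += 4;  // 4c
    }
    cost += 2;                            // line 11: return
    return sum;
}
```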

Page 15

1. int Power (int x, int n)
2. {
3.   if (n == 0)
4.     return 1;
5.   else if (n % 2 == 0)   // n is even
6.     return Power (x*x, n/2);
7.   else                   // n is odd
8.     return x * Power (x*x, n/2);
9. }

Statement   n = 0   n > 0 (n is even)   n > 0 (n is odd)
3           3       3                   3
4           2       –                   –
5           –       5                   5
6           –       10 + T(⌊n/2⌋)       –
8           –       –                   12 + T(⌊n/2⌋)
Total       5       18 + T(⌊n/2⌋)       20 + T(⌊n/2⌋)

Page 16

T(n) = 5                 for n = 0
T(n) = 18 + T(⌊n/2⌋)     for n > 0 and n is even
T(n) = 20 + T(⌊n/2⌋)     for n > 0 and n is odd

• Suppose n = 2^k for some k > 0.

• Obviously 2^k is an even number, so we get

T(2^k) = 18 + T(2^(k-1))
       = 18 + 18 + T(2^(k-2))
       = 18 + 18 + 18 + T(2^(k-3))
       = …
       = 18k + T(2^(k-k))
       = 18k + T(2^0) = 18k + T(1)

Page 17

• Since 1 is an odd number, the running time T(1) is: T(1) = 20 + T(0) = 20 + 5 = 25

• Therefore, T(2^k) = 18k + 25

• If n = 2^k, then log₂ n = log₂ 2^k, indicating that k = log n

• Therefore, T(n) = 18 log n + 25
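The derivation can be checked by running Power with a cost counter that follows the recurrence; for n = 2^k the counter should equal 18k + 25. The instrumentation is an assumption, not part of the slides.

```cpp
#include <cassert>

// Sketch of Power with a cost counter following the recurrence:
// T(0) = 5; T(n) = 18 + T(n/2) for even n > 0; T(n) = 20 + T(n/2) for odd n > 0.
static long cost = 0;

int Power(int x, int n) {
    if (n == 0)     { cost += 5;  return 1; }                        // base case
    if (n % 2 == 0) { cost += 18; return Power(x * x, n / 2); }      // n is even
    else            { cost += 20; return x * Power(x * x, n / 2); }  // n is odd
}
```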

Page 18

Asymptotic Notation

• Suppose the running times of two algorithms A and B are T_A(n) and T_B(n), respectively, where n is the size of the problem

• How can we determine whether T_A(n) is better than T_B(n)?

• One way is to fix some size n = n0 ahead of time; then we may be able to say that algorithm A performs better than algorithm B for n = n0

• But this is a special case for n = n0. How about n = n1, or n = n2? Is A better than B in those cases too?

• Unfortunately, there is no easy answer. We cannot expect the size n to be known ahead of time. But we may be able to say that, under certain conditions, T_A(n) is better than T_B(n) for all n >= n1

Page 19

• To understand the running times of algorithms, we need to make some definitions:

• Definition:
• Consider a function f(n) which is non-negative for all integers n >= 0. We say that "f(n) is big oh of g(n)", which we write f(n) is O(g(n)), if there exists an integer n0 and a constant c > 0 such that for all integers n >= n0, f(n) <= c·g(n)

• Example:
• Show that f(n) = 8n + 128 is O(n²):

8n + 128 <= c·n²   (let us set c = 1)
0 <= n² - 8n - 128
0 <= (n - 16)(n + 8)

• Thus we can say that for the constant c = 1 and all n >= n0 = 16, f(n) <= n², so f(n) is O(n²)
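The witness pair (c = 1, n0 = 16) can also be sanity-checked numerically; the brute-force loop below is an illustration over a finite range (an assumption), not a proof.

```cpp
#include <cassert>

// Check that f(n) = 8n + 128 <= 1 * n^2 for every n in [n0, nMax].
bool bigOWitnessHolds(long long n0, long long nMax) {
    for (long long n = n0; n <= nMax; ++n)
        if (8 * n + 128 > n * n) return false;
    return true;
}
```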

Page 20

[Figure: plot of f(n) = 8n + 128 against g1(n) = 4n², g2(n) = 2n², and g3(n) = n², with n on the horizontal axis (5 to 20) and f(n) on the vertical axis (up to 400); f(n) falls to or below n² from n = 16 onward]

Page 21

Theorem:
• If f1(n) is O(g1(n)) and f2(n) is O(g2(n)), then f1(n) + f2(n) is O(max(g1(n), g2(n)))

Proof:
• If f1(n) is O(g1(n)), then f1(n) <= c1·g1(n) for all n >= n1
• If f2(n) is O(g2(n)), then f2(n) <= c2·g2(n) for all n >= n2
• Let n0 = max(n1, n2) and c0 = 2·max(c1, c2), and consider the sum f1(n) + f2(n) for some n >= n0:

f1(n) + f2(n) <= c1·g1(n) + c2·g2(n)
              <= c0·(g1(n) + g2(n))/2
              <= c0·max(g1(n), g2(n))

Therefore, f1(n) + f2(n) is O(max(g1(n), g2(n)))
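The sum rule can be illustrated numerically. The sample functions below are assumptions chosen for illustration: f1(n) = 8n + 128 is O(n²) with c1 = 1, n1 = 16, and f2(n) = 5n² is O(n²) with c2 = 5, n2 = 0, so the theorem picks c0 = 2·max(c1, c2) = 10 and n0 = max(n1, n2) = 16.

```cpp
#include <cassert>

// Check f1(n) + f2(n) <= c0 * max(g1(n), g2(n)) for n in [n0, nMax],
// with f1(n) = 8n + 128, f2(n) = 5n^2, g1 = g2 = n^2, c0 = 10.
bool sumRuleHolds(long long n0, long long nMax) {
    for (long long n = n0; n <= nMax; ++n) {
        long long f1 = 8 * n + 128;
        long long f2 = 5 * n * n;
        long long maxG = n * n;        // max(g1(n), g2(n))
        if (f1 + f2 > 10 * maxG) return false;
    }
    return true;
}
```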

Page 22

Theorem:
• If f1(n) is O(g1(n)) and f2(n) is O(g2(n)), then f1(n) * f2(n) is O(g1(n)·g2(n))

Proof:
• If f1(n) is O(g1(n)), then f1(n) <= c1·g1(n) for all n >= n1
• If f2(n) is O(g2(n)), then f2(n) <= c2·g2(n) for all n >= n2
• Let n0 = max(n1, n2) and c0 = c1·c2, and consider the product f1(n) * f2(n) for some n >= n0:

f1(n) * f2(n) <= c1·g1(n) * c2·g2(n)
              <= c0·(g1(n) * g2(n))

• Therefore, f1(n) * f2(n) is O(g1(n)·g2(n))

Page 23

Theorem:
• If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))

Proof:
• If f(n) is O(g(n)), then f(n) <= c1·g(n) for all n >= n1
• If g(n) is O(h(n)), then g(n) <= c2·h(n) for all n >= n2
• Let n0 = max(n1, n2) and c0 = c1·c2; then for all n >= n0:

f(n) <= c1·g(n) <= c1·c2·h(n) <= c0·h(n)

• Therefore, f(n) is O(h(n))

Page 24

The names of common big O expressions

Expression    Name
O(1)          constant
O(log n)      logarithmic
O(log² n)     log squared
O(n)          linear
O(n log n)    n log n
O(n²)         quadratic
O(n³)         cubic
O(2ⁿ)         exponential

Page 25

Conventions for writing Big Oh Expressions

• Certain conventions have evolved concerning how big oh expressions are normally written:

• First, it is common practice when writing a big oh expression to drop all but the most significant term. Thus, instead of O(n² + n log n + n) we simply write O(n²)

• Second, it is common practice to drop constant coefficients. Thus, instead of O(3n²), we write O(n²). As a special case of this rule, if the function is a constant, instead of, say, O(1024), we simply write O(1)

Page 26

Asymptotic Lower Bound (Ω)

Definition:
• Consider a function f(n) which is non-negative for all integers n >= 0. We say that "f(n) is omega of g(n)", which we write f(n) is Ω(g(n)), if there exists an integer n0 and a constant c > 0 such that for all integers n >= n0, f(n) >= c·g(n)

Example:
• Show that f(n) = 5n² - 64n + 256 is Ω(n²):

5n² - 64n + 256 >= c·n²   (let c = 1)
5n² - 64n + 256 >= n²
5n² - 64n + 256 - n² >= 0
4n² - 64n + 256 >= 0
4(n - 8)² >= 0

Let n0 = 8; we can say that for c = 1 and all n >= n0 = 8, f(n) >= n², so f(n) is Ω(n²)
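As with the big O example, the Ω witness c = 1 can be sanity-checked by brute force over a finite range (an illustration, an assumption of this sketch, not a proof). Since 5n² - 64n + 256 - n² = 4(n - 8)² >= 0, the inequality in fact holds for every n >= 0.

```cpp
#include <cassert>

// Check that f(n) = 5n^2 - 64n + 256 >= 1 * n^2 for all n in [0, nMax].
bool omegaWitnessHolds(long long nMax) {
    for (long long n = 0; n <= nMax; ++n)
        if (5 * n * n - 64 * n + 256 < n * n) return false;
    return true;
}
```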

Page 27

[Figure: plot of f(n) = 5n² - 64n + 256 against n² and 2n², with n from 5 to 25 on the horizontal axis and values up to 1500 on the vertical axis; f(n) remains at or above n² everywhere]

Page 28

Other definitions

Definition:
• Consider a function f(n) which is non-negative for all integers n >= 0. We say that "f(n) is theta of g(n)", which we write f(n) is Θ(g(n)), if and only if f(n) is O(g(n)) and f(n) is Ω(g(n))

Definition:
• Consider a function f(n) which is non-negative for all integers n >= 0. We say that "f(n) is little o of g(n)", which we write f(n) is o(g(n)), if and only if f(n) is O(g(n)) but f(n) is not Θ(g(n))

• Now let's consider some of the previous examples in terms of big O notation:

Page 29

1. int func (int a[ ], int n, int x)
2. {
3.   int result = a[n];
4.   for (int i = n-1; i >= 0; --i)
5.     result = result * x + a[i];
6.   return result;
7. }

Statement   Simple time model   Big O
3           5                   O(1)
4a          4                   O(1)
4b          3n + 3              O(n)
4c          4n                  O(n)
5           9n                  O(n)
6           2                   O(1)
Total       16n + 14            O(n)

The total running time is:

O(16n + 14) = O(max(16n, 14)) = O(16n) = O(n)

Page 30

1. int PrefixSums (int a[ ], int n)
2. {
3.   for (int j = n-1; j >= 0; --j)
4.   {
5.     int sum = 0;
6.     for (int i = 0; i <= j; ++i)
7.       sum = sum + a[i];
8.     a[j] = sum;
9.   }
10.  return a[n-1];   // a[n-1] now holds the sum of all n elements
11. }

Statement   Big O
3a          O(1)
3b          O(1) * O(n)
3c          O(1) * O(n)
5           O(1) * O(n)
6a          O(1) * O(n)
6b          O(1) * O(n²)
6c          O(1) * O(n²)
7           O(1) * O(n²)
8           O(1) * O(n)
10          O(1)
Total       O(n²)
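For reference, here is a runnable sketch of the prefix-sums routine. It is written as void, an assumption made here because the slide's return statement refers to an undeclared variable; everything else follows the slide. Scanning j from n-1 down to 0 means the elements being summed have not been overwritten yet.

```cpp
#include <cassert>

// Replace each a[j] with the prefix sum a[0] + ... + a[j].
void PrefixSums(int a[], int n) {
    for (int j = n - 1; j >= 0; --j) {   // outer loop: O(n) iterations
        int sum = 0;
        for (int i = 0; i <= j; ++i)     // inner loop: up to n iterations
            sum = sum + a[i];
        a[j] = sum;                      // j-th prefix sum
    }                                    // total: O(n) * O(n) = O(n^2)
}
```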