8/10/2019 1 Time Complexity
Data Structures (CS 102)
Dr. Balasubramanian Raman
Associate Professor
Department of Computer Science and Engineering
Indian Institute of Technology Roorkee, [email protected]
http://people.iitr.ernet.in/facultywebsite/balarfma/Website/
Office: ECE, S 227 or MCA Block 103
CS 102
Syllabus
Real Time Applications
Why This Course?
You will be able to evaluate the quality of a program
(Analysis of Algorithms: running time and memory space).
You will be able to write fast programs.
You will be able to solve new problems.
You will be able to give non-trivial methods to solve problems.
(Your algorithm (program) will be faster than others.)
Algorithm: A set of explicit, unambiguous, finite steps which, when carried out for a given set of initial conditions, produce the corresponding output and terminate in finite time.
Program: An implementation of an algorithm in some programming language.
Data Structure: The organization of data needed to solve the problem.
Good Algorithm?
Efficient in:
- Running time
- Space used
Running time depends on:
- Single vs. multi processor
- Read/write speed to memory
- 32-bit vs. 64-bit architecture
- Input: rate of growth of time; efficiency as a function of input size (number of bits in an input number, number of data elements)
Time complexity: the amount of computation time (CPU time) a program needs to run.
Space complexity: the amount of memory a program needs to run to completion.
Measuring the Running Time
The C standard library provides a function called clock (in header file time.h) that can sometimes be used in a simple way to time computations:

clock_t start, finish;
start = clock();
sort(x.begin(), x.end());
// Call to STL generic sort algorithm
finish = clock();
cout << double(finish - start) / CLOCKS_PER_SEC;
Limitations
It is necessary to implement and test the
algorithm in order to determine its running
time.
In order to compare two or more algorithms,
the same hardware and software
environments should be used.
How to Analyze Time Complexity?
Machine model:
- Single processor
- 32-bit
- Sequential execution
- 1 unit of time for arithmetic and logical operations
- 1 unit of time for assignment and return
Few Examples
int Add(int a, int b)
{ return a + b; }

T_Add = 1 + 1 = 2 units of time = constant time
(1 unit for the addition, 1 unit for the return)
Sum of all elements in the list
int Sum_of_list(int A[], int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum = sum + A[i];
    return sum;
}
T_Sum_of_list = 1 + 3(n+1) + 2n + 1
              = 5n + 5
              = cn + c

T_Sum_of_Matrices = a*n^2 + b*n + c

T_Add = O(1), T_Sum_of_list = O(n), T_Sum_of_Matrices = O(n^2)

Asymptotic Notation
Five Important Guidelines for Finding Time Complexity in Code
1. Loops
2. Nested Loops
3. Consecutive Statements
4. if else statements
5. Logarithmic statements
Asymptotic Notations: O, Ω, Θ, o, ω
Defined for functions over the natural numbers.
Ex: f(n) = Θ(n^2) describes how f(n) grows in comparison to n^2.
Each notation defines a set of functions; in practice they are used to compare the sizes of two functions.
The notations describe different rate-of-growth relations between the defining function and the defined set of functions.
Asymptotic Notation
Big Oh notation (with a capital letter O, not a zero), also called Landau's symbol, is a symbolism used in complexity theory, computer science, and mathematics to describe the asymptotic behavior of functions.
Basically, it tells you how fast a function grows or declines.
Landau's symbol comes from the name of the German number theoretician Edmund Landau, who invented the notation. The letter O is used because the rate of growth of a function is also called its order.
Big-Oh Notation (Formal Definition)
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that f(n) <= c*g(n) for n >= n0.

Example: 2n + 10 is O(n):
2n + 10 <= cn
(c - 2)n >= 10
n >= 10/(c - 2)
Pick c = 3 and n0 = 10.
Big-Oh Example
Example: the function n^2 is not O(n):
n^2 <= cn
n <= c
The above inequality cannot be satisfied, since c must be a constant.
n^2 is O(n^2).
More Big-Oh Examples
7n - 2 is O(n):
need c > 0 and n0 >= 1 such that 7n - 2 <= cn for n >= n0; this is true for c = 7 and n0 = 1.

3n^3 + 20n^2 + 5 is O(n^3):
need c > 0 and n0 >= 1 such that 3n^3 + 20n^2 + 5 <= cn^3 for n >= n0; this is true for c = 4 and n0 = 21.

3 log n + 5 is O(log n):
need c > 0 and n0 >= 1 such that 3 log n + 5 <= c log n for n >= n0; this is true for c = 8 and n0 = 2.
Big-Oh and Growth Rate
The big-Oh notation gives an upper bound on the growth rate of a function.
The statement "f(n) is O(g(n))" means that the growth rate of f(n) is no more than the growth rate of g(n).
Useful for finding the worst case of an algorithm.
Exercise: Prove that f(n) = 5n^2 + 2n + 1 is O(n^2).
Write an efficient program to determine whether a given number is prime or not.
AKS algorithm
Agrawal, Kayal and Saxena from IIT Kanpur came up with a deterministic polynomial-time algorithm for primality testing (polynomial in the number of digits, i.e., in log n).
"PRIMES is in P", Annals of Mathematics, 160(2): 781-793, 2004.
Infosys Mathematics Prize, 2008.
Fulkerson Prize for the paper "PRIMES is in P", 2006.
Gödel Prize for the paper "PRIMES is in P", 2006.
Several other awards and prizes.
Programming Contest Sites
http://icpc.baylor.edu/public/worldMap/World-Finals-2014 (ACM ICPC)
http://www.codechef.com (Online Contest)
http://www.topcoder.com
http://www.spoj.com
http://www.interviewstreet.com
Find the output of the following code
int n = 32;
int steps = 0;
for (int i = 1; i < n; i = i * 2)
    steps++;
A and B are both O(log n); the base doesn't matter.
Suppose f(n) = log_10(n) and g(n) = log_2(n). In general, let
    x_m = log_m(n) = f(n)  and  x_k = log_k(n) = g(n),
so that m^(x_m) = n and k^(x_k) = n. Then m^(x_m) = k^(x_k), and taking log base m of both sides:
    x_m = x_k * log_m(k)
Hence f(n) = c * g(n), where c = log_m(k) is a constant independent of n, so
    f(n) = c * g(n) means f(n) and g(n) are both O(log n).
Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that for all n >= n0 we have 0 <= c*g(n) <= f(n)}
g(n) is an asymptotic lower bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
Exercise: Prove that f(n) = 5n^2 + 2n + 1 is Ω(n^2).
Example
n = Ω(log n). Choose c and n0: for c = 1 and n0 = 16,
c * log n <= n for all n >= 16.
Recall Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that for all n >= n0, we have 0 <= c*g(n) <= f(n)}.
Omega
Omega gives us a LOWER BOUND on a
function.
Big-Oh says, "Your algorithm is at least this
good."
Omega says, "Your algorithm is at least this
bad."
Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0 such that for all n >= n0 we have 0 <= c1*g(n) <= f(n) <= c2*g(n)}
g(n) is an asymptotically tight bound for f(n).
Intuitively: the set of all functions that have the same rate of growth as g(n).
Technically, f(n) ∈ Θ(g(n)); older usage writes f(n) = Θ(g(n)). I'll accept either.
f(n) and g(n) are assumed nonnegative for large n.
Example
Is 3n^3 = Θ(n^4)? How about 2^(2n) = Θ(2^n)?
(In both cases the answer is no: 3n^3 grows strictly slower than n^4, and 2^(2n) = 4^n grows strictly faster than 2^n.)
Recall Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0 such that for all n >= n0, 0 <= c1*g(n) <= f(n) <= c2*g(n)}.
Examples
3n^2 + 17:
Ω(1), Ω(n), Ω(n^2): lower bounds
O(n^2), O(n^3), ...: upper bounds
Θ(n^2): exact bound
Relations Between O, Ω, and Θ
o-notation
For a given function g(n), we define the set little-o:
o(g(n)) = {f(n) : for every c > 0 there exists n0 > 0 such that for all n >= n0, we have 0 <= f(n) < c*g(n)}
f(n) becomes insignificant relative to g(n) as n approaches infinity:
lim (n -> infinity) [f(n) / g(n)] = 0
g(n) is an upper bound for f(n) that is not asymptotically tight.
Observe the difference in this definition from the previous ones. Why? (Here the inequality must hold for every positive constant c, not just for some c.)