8/13/2019 Algorithm Analysis Big Oh

Dale Roberts, Lecturer
Department of Computer and Information Science,
School of Science, IUPUI
E-mail: [email protected]

CSCI 240
Analysis of Algorithms
Big-Oh
Asymptotic Analysis
Ignoring constants in T(n); analyzing T(n) as n "gets large".

Example: T(n) = 3n^3 + 12n^2 + 4n log2 n + 2

As n grows larger, n^3 grows MUCH larger than n^2, n log2 n, and n,
so it dominates.
The running time grows roughly "on the order of n^3".
Notationally, T(n) = O(n^3).
The big-oh (O) Notation
3 major notations:

O(g(n)), Big-Oh of g of n: the Asymptotic Upper Bound.
Ω(g(n)), Big-Omega of g of n: the Asymptotic Lower Bound.
Θ(g(n)), Big-Theta of g of n: the Asymptotic Tight Bound.
Big-Oh Defined
The O symbol was introduced in 1927 to indicate the relative growth of
two functions based on their asymptotic behavior; it is now used to
classify functions and families of functions.

T(n) = O(f(n)) if there are constants c and n0 such that T(n) <= c*f(n)
when n >= n0.

c*f(n) is an upper bound for T(n).
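The definition can be checked numerically for any particular choice of T, f, c, and n0. A minimal sketch (the function names and the sample T(n) = 3n^2 + 10 are illustrative choices, not from the slides):

```python
def witnesses_big_oh(T, f, c, n0, n_max=10_000):
    """Check T(n) <= c*f(n) for every n with n0 <= n <= n_max."""
    return all(T(n) <= c * f(n) for n in range(n0, n_max + 1))

T = lambda n: 3 * n**2 + 10   # example running-time function
f = lambda n: n**2            # candidate bound

# c = 4, n0 = 4 works: 3n^2 + 10 <= 4n^2 exactly when n^2 >= 10.
print(witnesses_big_oh(T, f, c=4, n0=4))   # True
# c = 3 can never work, since 3n^2 + 10 > 3n^2 for all n.
print(witnesses_big_oh(T, f, c=3, n0=4))   # False
```

Of course, a finite scan only falsifies a claimed pair (c, n0); the definition itself quantifies over all n >= n0.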
Big-Oh
Describes an upper bound for the running time of an algorithm.

Upper bounds for Insertion Sort running times (time complexity):
worst case: O(n^2)   T(n) = c1*n^2 + c2*n + c3
best case:  O(n)     T(n) = c1*n + c2
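These two bounds can be observed by counting comparisons. A sketch (not the slides' code) that instruments insertion sort and feeds it a sorted input (best case) and a reversed input (worst case):

```python
def insertion_sort(a):
    """Sort list a in place; return the number of key comparisons made."""
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # compare key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 100
best = insertion_sort(list(range(n)))          # already sorted
worst = insertion_sort(list(range(n, 0, -1)))  # reverse sorted
print(best)   # n - 1 = 99 comparisons: linear, O(n)
print(worst)  # n(n-1)/2 = 4950 comparisons: quadratic, O(n^2)
```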
Big-O Notation
We say Insertion Sort's run time is O(n^2).
Properly, we should say its run time is in O(n^2).
Read O as "Big-Oh" (you'll also hear it called "order").

In general, a function f(n) is O(g(n)) if there exist positive constants
c and n0 such that f(n) <= c*g(n) for all n >= n0.

e.g., if f(n) = 1000n and g(n) = n^2, take n0 = 1000 and c = 1; then
f(n) <= 1*g(n) whenever n >= n0, and we say that f(n) = O(g(n)).

The O notation indicates "bounded above by a constant multiple of g(n)."
Big-Oh Properties
Fastest-growing function dominates a sum:
  O(f(n) + g(n)) is O(max{f(n), g(n)})

Product of upper bounds is an upper bound for the product:
  if f is O(g) and h is O(r), then fh is O(gr)

"f is O(g)" is transitive:
  if f is O(g) and g is O(h), then f is O(h)

Hierarchy of functions:
  O(1), O(log n), O(n^(1/2)), O(n log n), O(n^2), O(2^n), O(n!)
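The hierarchy can be spot-checked numerically: evaluating each representative function at a single moderately large n (n = 64 here, an illustrative choice) already shows the ordering, although a single data point illustrates rather than proves the asymptotic claim.

```python
import math

# Representative functions from the hierarchy, evaluated at n = 64.
n = 64
hierarchy = [
    1,                   # O(1)
    math.log2(n),        # O(log n)      -> 6
    math.sqrt(n),        # O(n^(1/2))    -> 8
    n * math.log2(n),    # O(n log n)    -> 384
    n**2,                # O(n^2)        -> 4096
    2**n,                # O(2^n)
    math.factorial(n),   # O(n!)
]
print(hierarchy == sorted(hierarchy))  # True: each entry exceeds the last
```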
Some Big-Ohs are not reasonable
Polynomial Time algorithms
An algorithm is said to be polynomial if it is O(n^c), c > 1.
Polynomial algorithms are said to be reasonable:
they solve problems in reasonable times!
Coefficients, constants, and low-order terms are ignored:
e.g., if f(n) = 2n^2, then f(n) = O(n^2).

Exponential Time algorithms
An algorithm is said to be exponential if it is O(r^n), r > 1.
Exponential algorithms are said to be unreasonable.
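The gap between reasonable and unreasonable shows up quickly even for a high-degree polynomial against a small-base exponential. An illustrative calculation (the functions n^10 and 2^n are my choices, not from the slides):

```python
# Find the last problem size at which the degree-10 polynomial n^10
# still beats the exponential 2^n; beyond that point the exponential
# dominates forever.
last_poly_win = max(n for n in range(2, 200) if n**10 >= 2**n)
print(last_poly_win)      # the polynomial's final win
print(last_poly_win + 1)  # from here on, 2^n > n^10
```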
Can we justify Big O notation?
Big O notation is a huge simplification; can we justify it?
It only makes sense for large problem sizes.
For sufficiently large problem sizes, the highest-order term swamps all the rest!

Consider R = x^2 + 3x + 5 as x varies:

  x        x^2        3x     5    R
  0        0          0      5    5
  10       100        30     5    135
  100      10,000     300    5    10,305
  1,000    1,000,000  3,000  5    1,003,005
  10,000                          100,030,005
  100,000                         10,000,300,005
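The table above is easy to recompute, which also makes the point explicit: the x^2 term's share of R approaches 100% as x grows.

```python
def R(x):
    """The slide's example polynomial R = x^2 + 3x + 5."""
    return x**2 + 3 * x + 5

for x in (0, 10, 100, 1_000, 10_000, 100_000):
    share = x**2 / R(x) if x else 0.0  # fraction of R supplied by x^2
    print(f"x = {x:>7,}  R = {R(x):>14,}  x^2 share = {share:.4%}")
```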
Classifying Algorithms based on Big-Oh
A function f(n) is said to be of at most logarithmic growth if f(n) = O(log n).
A function f(n) is said to be of at most quadratic growth if f(n) = O(n^2).
A function f(n) is said to be of at most polynomial growth if f(n) = O(n^k), for some natural number k > 1.
A function f(n) is said to be of at most exponential growth if there is a constant c, such that f(n) = O(c^n), and c > 1.
A function f(n) is said to be of at most factorial growth if f(n) = O(n!).
A function f(n) is said to have constant running time if the size of the input n has no effect on the running time of the algorithm (e.g., assignment of a value to a variable). The equation for this algorithm is f(n) = c.

Other logarithmic classifications: f(n) = O(n log n) and f(n) = O(log log n).
Rules for Calculating Big-Oh
Base of logs is ignored:
  log_a n = O(log_b n)

Powers inside logs are ignored:
  log(n^2) = O(log n)

Bases and powers in exponents are NOT ignored:
  3^n is not O(2^n)
  a^(n^2) is not O(a^n)

If T(x) is a polynomial of degree n, then T(x) = O(x^n).
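The first rule follows from the change-of-base identity log_a n = log_b n / log_b a: switching base only multiplies by a constant, which Big-Oh absorbs. A quick numeric illustration (base choices 2 and 10 are arbitrary):

```python
import math

# The ratio log_2(n) / log_10(n) is the same constant, log_2(10),
# for every n -- so the base is irrelevant inside Big-Oh.
ratios = [math.log2(n) / math.log10(n) for n in (10, 10**3, 10**6, 10**9)]
print(all(math.isclose(r, math.log2(10)) for r in ratios))  # True

# Likewise log(n^2) = 2 * log(n): another constant factor Big-Oh absorbs.
print(math.isclose(math.log(50**2), 2 * math.log(50)))      # True
```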
Big-Oh Examples
1. 2n^3 + 3n^2 + n = 2n^3 + 3n^2 + O(n)
                   = 2n^3 + O(n^2 + n)
                   = 2n^3 + O(n^2)
                   = O(n^3) = O(n^4)

2. 2n^3 + 3n^2 + n = 2n^3 + 3n^2 + O(n)
                   = 2n^3 + O(n^2 + n)
                   = 2n^3 + O(n^2)
                   = O(n^3)
Big-Oh Examples (cont.)
3. Suppose a program P is O(n^3), and a program Q is O(3^n), and that
currently both can solve problems of size 50 in 1 hour. If the programs
are run on another system that executes exactly 729 times as fast as the
original system, what size problems will they be able to solve?
Big-Oh Examples (cont)
For P: n^3 = 50^3 * 729, so n = 50 * 729^(1/3) = 50 * 9 = 450.
For Q: 3^n = 3^50 * 729, so n = log_3(729 * 3^50) = log_3(729) + 50 = 6 + 50 = 56.

Improvement: problem size increased by 9 times for the n^3 algorithm, but
only a slight improvement in problem size (+6) for the exponential algorithm.
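The arithmetic on this slide can be verified directly; note how neatly 729 was chosen (729 = 9^3 = 3^6, so both the cube root and the base-3 log come out as integers):

```python
import math

# P is O(n^3): a 729x speedup scales the solvable size by 729^(1/3) = 9.
new_p = round(50 * 729 ** (1 / 3))
# Q is O(3^n): a 729x speedup only adds log_3(729) = 6 to the solvable size.
new_q = 50 + round(math.log(729, 3))
print(new_p, new_q)  # 450 56
```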
Acknowledgements
Philadelphia University, Jordan
Nilagupta, Pradondet