Analysis of Algorithms (pt 2) (Chapter 4) COMP53 Oct 3, 2007
Best, Worst and Average Case
• For a particular problem size n, we can find:
– Best case: the input that can be solved the fastest
– Worst case: the input that will take the longest
– Average case: average time for all inputs of the same size
[Chart: running time vs. input size (1000 to 4000), showing best-case, average-case, and worst-case curves]
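As a concrete illustration (not from the slides), a hypothetical linear search shows how best and worst cases arise for inputs of the same size:

```python
def linear_search(data, target):
    """Scan left to right; return the number of comparisons performed."""
    comparisons = 0
    for value in data:
        comparisons += 1
        if value == target:
            break
    return comparisons

data = list(range(1000))
best = linear_search(data, 0)     # best case: target is the first element
worst = linear_search(data, 999)  # worst case: target is the last element
```

The same input size (n = 1000) yields anywhere from 1 to 1000 comparisons depending on where the target sits.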
Methods of Analysis
• Experimental Studies: Run experiments on implementations of algorithms and record actual time or operation counts.
• Theoretical Analysis: Determine time or operation counts from mathematical analysis of the algorithm; this doesn't require an implementation (or even a computer).
Theoretical Analysis
• Uses a high-level description of the algorithm instead of an implementation
• Characterizes running time as a function of the input size, n
• Takes into account all possible inputs
• Allows us to evaluate the speed of an algorithm independent of the hardware/software environment
Seven Important Functions
• Seven functions that often appear in algorithm analysis:
– Constant: 1
– Logarithmic: log n
– Linear: n
– N-Log-N: n log n
– Quadratic: n^2
– Cubic: n^3
– Exponential: 2^n
• In a log-log chart, the slope of the line corresponds to the growth rate of the function
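To see how differently these functions grow, a small Python sketch (not from the slides) evaluates all seven at a single input size:

```python
import math

# Evaluate the seven growth functions at n = 16 (illustrative values only).
def growth_values(n):
    return {
        "constant": 1,
        "logarithmic": math.log2(n),
        "linear": n,
        "n-log-n": n * math.log2(n),
        "quadratic": n ** 2,
        "cubic": n ** 3,
        "exponential": 2 ** n,
    }

values = growth_values(16)
```

Even at n = 16 the spread is dramatic: the logarithm is 4 while the exponential is already 65,536.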
Growth Functions
Reasonable Time
• Assume a machine performing 10^6 operations/second
• Clearly, any algorithm requiring more than 10^14 operations is impractical

Minute     Hour      Day       Month     Year
10^8 ops   10^9 ops  10^11 ops 10^12 ops 10^13 ops
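The table's budgets follow directly from the assumed rate; a quick Python check (the month/year lengths are my assumptions: 30 and 365 days):

```python
# Operation budgets at the slide's assumed rate of 10^6 operations/second.
RATE = 10 ** 6  # operations per second

SECONDS = {
    "minute": 60,
    "hour": 3_600,
    "day": 86_400,
    "month": 30 * 86_400,   # assumed 30-day month
    "year": 365 * 86_400,   # assumed 365-day year
}

budgets = {period: RATE * s for period, s in SECONDS.items()}
# budgets["day"] is 8.64 * 10^10, i.e. roughly the table's 10^11 ops per day
```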
Growth Functions
[Table: which growth functions finish in less than a day, less than a month, or more than a year]
Big-Oh Notation
f(n) is O(g(n)) if there are positive constants c and n0 such that
f(n) ≤ c·g(n) for n ≥ n0
Example: 2n + 10 ∈ O(n)
2n + 10 ≤ cn for c = 3 and n ≥ n0 = 10
[Chart: log-log plot of n, 2n + 10, and 3n]
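The witness constants in this example can be verified numerically; a minimal Python sketch (not from the slides):

```python
# Check the Big-Oh witness from the example: 2n + 10 <= 3n once n >= 10.
c, n0 = 3, 10

# The bound holds for every n from n0 up (sampled over a large range).
holds_from_n0 = all(2 * n + 10 <= c * n for n in range(n0, 100_000))

# And it fails just below n0: at n = 9, 2*9 + 10 = 28 > 3*9 = 27.
fails_below = 2 * (n0 - 1) + 10 > c * (n0 - 1)
```

Big-Oh only requires the inequality to hold eventually, which is why the choice of n0 matters.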
Big-Oh Example
Example: n^2 is not O(n)
There is no constant c such that n^2 ≤ cn for all n ≥ n0, since n^2/n = n → ∞ as n → ∞
[Chart: log-log plot of n, 10n, 100n, and n^2]
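The argument can be made concrete: whatever constant c is proposed, n = c + 1 already violates the bound. A small Python sketch (not from the slides):

```python
# For any fixed c, n^2 eventually exceeds c * n: take any n > c.
def witness_exceeds(c):
    n = c + 1          # once n > c, n * n > c * n
    return n * n > c * n

# No matter how large c is, a violating n exists.
all_exceed = all(witness_exceeds(c) for c in (1, 10, 100, 10 ** 6))
```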
Comparing Growth Rates
• f(n) ∈ O(g(n)) means that the growth rate of f(n) is no more than the growth rate of g(n)
• g(n) is an asymptotic upper bound on f(n)
Asymptotic Algorithm Analysis
• The asymptotic analysis of an algorithm determines the running time in big-Oh notation
• To perform the asymptotic analysis:
– We find the worst-case number of basic operations as a function of the input size
– We express this function with big-Oh notation
• Example:
– We determine that algorithm arrayMax executes at most 8n − 2 primitive operations
– We say that algorithm arrayMax “runs in O(n) time”
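A minimal Python sketch of an arrayMax-style scan (not from the slides; the slide's 8n − 2 figure counts all primitive operations, while this version counts only element comparisons):

```python
def array_max(data):
    """Single left-to-right scan for the maximum, counting comparisons."""
    current_max = data[0]
    comparisons = 0
    for value in data[1:]:
        comparisons += 1          # one element comparison per iteration
        if value > current_max:
            current_max = value
    return current_max, comparisons  # always n - 1 comparisons

maximum, comps = array_max([3, 1, 4, 1, 5, 9, 2, 6])
```

Whatever the exact counting convention, the operation count is a linear function of n, hence O(n).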
Example: Prefix Averages
• The i-th prefix average of an array X is the average of the first (i + 1) elements of X:
A[i] = (X[0] + X[1] + … + X[i])/(i + 1)
• This has applications to financial analysis
[Chart: example arrays X and A over indices 1 to 7]
Prefix Averages: Algorithm 1
Algorithm prefixAverages1(X, n)
  Input: array X of n integers
  Output: array A of prefix averages of X
  A ← new array of n integers
  for i ← 0 to n − 1 do
    s ← X[0]
    for j ← 1 to i do
      s ← s + X[j]
    A[i] ← s / (i + 1)
  return A
basic operation: addition
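A direct Python transcription of the pseudocode (a sketch, not from the slides; float division is used so the averages are exact):

```python
def prefix_averages1(X):
    """Quadratic-time prefix averages: re-sum the whole prefix for every i."""
    n = len(X)
    A = [0.0] * n
    for i in range(n):
        s = X[0]
        for j in range(1, i + 1):
            s += X[j]             # inner loop does i additions
        A[i] = s / (i + 1)
    return A
```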
Algorithm 1 Analysis
• Number of additions is 1 + 2 + … + (n − 1)
• The sum of the first n − 1 integers is (n − 1)n/2
• Algorithm prefixAverages1 ∈ O(n^2)
for i ← 0 to n − 1 do
  for j ← 1 to i do
    s ← s + X[j]
Prefix Averages (Algorithm 2)
Algorithm prefixAverages2(X, n)
  Input: array X of n integers
  Output: array A of prefix averages of X
  A ← new array of n integers
  s ← 0
  for i ← 0 to n − 1 do
    s ← s + X[i]
    A[i] ← s / (i + 1)
  return A
basic operation: addition
Algorithm 2 Analysis
• Number of additions is 1 + 1 + … + 1 = n
• Algorithm prefixAverages2 ∈ O(n)
for i ← 0 to n − 1 do
  s ← s + X[i]
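The linear-time version in Python (a sketch, not from the slides), carrying the running sum across iterations instead of recomputing it:

```python
def prefix_averages2(X):
    """Linear-time prefix averages: one addition per element, n in total."""
    A = [0.0] * len(X)
    s = 0
    for i, x in enumerate(X):
        s += x                # running sum replaces the inner loop
        A[i] = s / (i + 1)
    return A
```

Both algorithms compute the same output; only the operation count differs, which is exactly what the asymptotic analysis captures.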
Relatives of Big-Oh
• big-Omega: f(n) ∈ Ω(g(n)) if f(n) ≥ c·g(n) for n ≥ n0
• big-Theta: f(n) ∈ Θ(g(n)) if c′·g(n) ≤ f(n) ≤ c″·g(n) for n ≥ n0
Intuition for Asymptotic Notation
• Big-Oh: f(n) ∈ O(g(n)) means f(n) is asymptotically less than or equal to g(n)
• Big-Omega: f(n) ∈ Ω(g(n)) means f(n) is asymptotically greater than or equal to g(n)
• Big-Theta: f(n) ∈ Θ(g(n)) means f(n) is asymptotically equal to g(n)
Example Uses of Ω and Θ
5n^2 is Ω(n^2)
5n^2 is Ω(n)
5n^2 is Θ(n^2)