7/27/2019 2 Basics of Algorithm Analysis
Chapter 2
Basics of Algorithm Analysis
Computational Tractability
"As soon as an Analytic Engine exists, it will necessarily guide the future course of the science. Whenever any result is sought by its aid, the question will arise - By what course of calculation can these results be arrived at by the machine in the shortest time?"
- Charles Babbage
Polynomial-Time
Brute force. For many non-trivial problems, there is a natural brute-force search algorithm that checks every possible solution.
Typically takes 2^N time or worse for inputs of size N.
Ex: n! for stable matching with n men and n women. Unacceptable in practice.
Polynomial-Time
Desirable scaling property. When the input size doubles, the algorithm should only slow down by some constant factor C.
There exist constants c > 0 and d > 0 such that on every input of size N, the running time is bounded by c·N^d steps.
Def. An algorithm is poly-time if the above scaling property holds.
(If the running time is c·N^d, then doubling the input gives c·(2N)^d = 2^d · c·N^d, so choose C = 2^d.)
Worst-Case Analysis
Average-case running time. Obtain bound on running time of algorithm on random input as a function of input size N.
Hard (or impossible) to accurately model real instances by random distributions.
An algorithm tuned for a certain distribution may perform poorly on other inputs.
Worst-Case Polynomial-Time
Def. An algorithm is efficient if its running time is polynomial.
Justification: It really works in practice!
Although 6.02 × 10^23 · N^20 is technically poly-time, it would be useless in practice.
In practice, the poly-time algorithms that people develop almost always have low constants and low exponents.
Breaking through the exponential barrier of brute force typically exposes some crucial structure of the problem.
Worst-Case Polynomial-Time
Exceptions.
Some poly-time algorithms do have high constants and/or exponents, and are useless in practice.
Some exponential-time (or worse) algorithms are widely used because the worst-case instances seem to be rare.
Ex: simplex method, Unix grep.
Why It Matters
Asymptotic Order of Growth
Upper bounds. f(n) is O(g(n)) if there exist positive constants c and n0 such that for all n ≥ n0 we have f(n) ≤ c·g(n).
Asymptotic Order of Growth
Lower bounds. f(n) is Ω(g(n)) if there exist positive constants c and n0 such that for all n ≥ n0 we have f(n) ≥ c·g(n).
Asymptotic Order of Growth
Tight bounds. f(n) is Θ(g(n)) if there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
Asymptotic Order of Growth
f(n) is Θ(g(n)) if f(n) is both O(g(n)) and Ω(g(n)).
Ex: T(n) = 32n^2 + 17n + 32.
T(n) is O(n^2), O(n^3), Ω(n^2), Ω(n), and Θ(n^2).
T(n) is not O(n), Ω(n^3), Θ(n), or Θ(n^3).
Asymptotic Order of Growth
Not asymptotically tight upper bound: f(n) is o(g(n)) if for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0.
Asymptotic Order of Growth
Not asymptotically tight lower bound: f(n) is ω(g(n)) if for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0.
Notation
Slight abuse of notation: f(n) = O(g(n)).
Asymmetric:
f(n) = 5n^3; g(n) = 3n^2
f(n) = O(n^3) = g(n)
but f(n) ≠ g(n).
Better notation: f(n) ∈ O(g(n)).
Properties
Transitivity:
f(n) = Θ(g(n)) and g(n) = Θ(h(n)) => f(n) = Θ(h(n)).
Same for O, Ω, o, and ω.
Reflexivity: f(n) = Θ(f(n)). Same for O and Ω.
Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
Properties
Transpose symmetry:
f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
f(n) = o(g(n)) if and only if g(n) = ω(f(n)).
Additivity: If f = Θ(h) and g = Θ(h), then f + g = Θ(h).
Same for O, Ω, o, and ω.
Asymptotic Bounds for Some Common Functions
Polynomials. a0 + a1·n + … + ad·n^d is Θ(n^d) if ad > 0.
Polynomial time. Running time is O(n^d) for some constant d independent of the input size n.
Logarithms. O(log_a n) = O(log_b n) for any constants a, b > 1, since log_a n = log_b n / log_b a.
Logarithms. For every x > 0, log n = O(n^x).
Exponentials. For every r > 1 and every d > 0, n^d = O(r^n).
A Survey of Common Running Times
Linear Time: O(n)
Linear time. Running time is at most a constant factor times the size of the input.
Computing the maximum. Compute the maximum of n numbers a1, …, an.
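A minimal sketch of the linear scan (function name is my own): keep a running best and make one comparison per remaining element.

```python
def compute_max(a):
    """Return the maximum of a non-empty list in O(n) time."""
    best = a[0]
    for x in a[1:]:   # one comparison per element
        if x > best:
            best = x
    return best
```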
Linear Time: O(n)
Merge. Combine two sorted lists A = a1, a2, …, an with B = b1, b2, …, bn into a sorted whole.
Linear Time: O(n)
Claim. Merging two lists of size n takes O(n) time.
Pf. After each comparison, the length of the output list increases by 1.
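The merge step can be sketched as follows (a minimal version; the function name is my own). Each loop iteration performs one comparison and appends one element, matching the proof's counting argument.

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list in O(n) time."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:      # one comparison per appended element
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])         # at most one of these is non-empty
    out.extend(b[j:])
    return out
```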
O(n log n) Time
O(n log n) time. Arises in divide-and-conquer algorithms.
Sorting. Mergesort and heapsort are sorting algorithms that perform O(n log n) comparisons.
Largest empty interval. Given n time-stamps x1, …, xn on which copies of a file arrive at a server, what is the largest interval of time when no copies of the file arrive?
O(n log n) solution. Sort the time-stamps. Scan the sorted list in order, identifying the maximum gap between successive time-stamps.
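The sort-then-scan solution can be sketched as (function name is my own):

```python
def largest_empty_interval(timestamps):
    """Sort the time-stamps (O(n log n)), then scan once for the
    maximum gap between successive time-stamps (O(n))."""
    xs = sorted(timestamps)
    return max(xs[i + 1] - xs[i] for i in range(len(xs) - 1))
```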
Quadratic Time: O(n2)
Quadratic time. Enumerate all pairs of elements.
Closest pair of points. Given a list of n points in the plane (x1, y1), …, (xn, yn), find the pair that is closest.
Quadratic Time: O(n2)
O(n^2) solution. Try all pairs of points.
Remark. Ω(n^2) seems inevitable, but this is just an illusion.
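The try-all-pairs approach, sketched minimally (function name is my own). The two nested loops perform one distance computation per pair, for O(n^2) total.

```python
from math import hypot, inf

def closest_pair(points):
    """Brute force: compute the distance of every pair, O(n^2)."""
    best, pair = inf, None
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            (x1, y1), (x2, y2) = points[i], points[j]
            d = hypot(x2 - x1, y2 - y1)
            if d < best:
                best, pair = d, (points[i], points[j])
    return pair, best
```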
Cubic Time: O(n3)
Cubic time. Enumerate all triples of elements.
Set disjointness. Given n sets S1, …, Sn, each of which is a subset of 1, 2, …, n, is there some pair of these which are disjoint?
Cubic Time: O(n3)
O(n^3) solution. For each pair of sets, determine if they are disjoint.
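A sketch of the pairwise check (function name is my own; assuming O(1) membership tests, each of the O(n^2) pairs costs O(n), giving O(n^3) overall):

```python
def find_disjoint_pair(sets):
    """Given sets S1..Sn over {1..n}, return the indices of some
    disjoint pair, or None. O(n^3) with O(1) membership tests."""
    for i in range(len(sets)):
        for j in range(i + 1, len(sets)):
            if all(x not in sets[j] for x in sets[i]):
                return i, j
    return None
```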
Polynomial Time: O(nk) Time
Independent set of size k. Given a graph, are there k nodes such that no two are joined by an edge? (k is a constant.)
O(n^k) solution. Enumerate all subsets of k nodes.
Polynomial Time: O(nk) Time
Checking whether S is an independent set takes O(k^2) time.
Polynomial Time: O(nk) Time
Number of k-element subsets is at most n^k / k!, so the total running time is O(k^2 · n^k / k!) = O(n^k).
This is poly-time for k = 17, but not practical.
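The enumerate-and-check approach can be sketched as follows (function name and edge representation are my own). Each candidate subset is checked with an O(k^2) pass over its node pairs.

```python
from itertools import combinations

def has_independent_set(nodes, edges, k):
    """Enumerate all k-node subsets; check each in O(k^2).
    Total: O(k^2 * n^k / k!) = O(n^k) for constant k."""
    edge_set = {frozenset(e) for e in edges}
    for subset in combinations(nodes, k):
        if all(frozenset((u, v)) not in edge_set
               for u, v in combinations(subset, 2)):
            return True
    return False
```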
Exponential Time
Independent set. Given a graph, what is the maximum size of an independent set?
O(n^2 2^n) solution. Enumerate all subsets.
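Enumerating all subsets can be sketched as follows (function name and the largest-first ordering are my own choices). Trying subsets from largest to smallest lets the search stop at the first independent set found; in the worst case all 2^n subsets are checked at O(n^2) each.

```python
from itertools import combinations

def max_independent_set(nodes, edges):
    """Enumerate all 2^n subsets, largest first; check each
    in O(n^2). Total: O(n^2 * 2^n)."""
    edge_set = {frozenset(e) for e in edges}
    for size in range(len(nodes), 0, -1):
        for subset in combinations(nodes, size):
            if all(frozenset(p) not in edge_set
                   for p in combinations(subset, 2)):
                return list(subset)
    return []
```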