Network Coding: A New Direction in Combinatorial Optimization - Nick Harvey

Transcript
Page 1:

Network Coding: A New Direction in Combinatorial Optimization

Nick Harvey

Page 2:

Collaborators

David Karger, Robert Kleinberg, April Rasala Lehman, Kazuo Murota, Kamal Jain, Micah Adler (UMass)

Page 3:

Transportation Problems

Max Flow

Page 4:

Transportation Problems

Min Cut

Page 5:

Communication Problems
“A problem of inherent interest in the planning of large-scale communication, distribution and transportation networks also arises with the current rate structure for Bell System leased-line services.”
- Robert Prim, 1957

Spanning Tree

Steiner Tree

Facility Location

Steiner Forest

Steiner Network

Multicommodity Buy-at-Bulk

Motivation for network design comes largely from communication networks.

Page 6:

Send items from s1 to t1 and from s2 to t2.
Problem: no disjoint paths; there is a bottleneck edge.
What is the capacity of a network?
[Figure: network with sources s1, s2 at the top, sinks t2, t1 at the bottom, and a single bottleneck edge.]

Page 7:

An Information Network
If we are sending information, we can do better: send the XOR b1⊕b2 on the bottleneck edge.
[Figure: the same network carrying bits b1, b2 from s1, s2, with b1⊕b2 on the bottleneck edge and on the edges into t2, t1.]
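
To make the butterfly trick concrete, here is a minimal Python sketch (my illustration, not part of the talk): the bottleneck edge carries b1⊕b2, and each sink recovers its bit by XORing that with the bit it hears on its direct side edge.

def butterfly_transmit(b1: int, b2: int) -> tuple:
    """Simulate the butterfly: each edge carries one bit, and the single
    bottleneck edge carries the XOR of the two source bits."""
    bottleneck = b1 ^ b2              # sent on the shared bottleneck edge
    decoded_at_t1 = bottleneck ^ b2   # t1 also hears b2 on the side edge s2 -> t1
    decoded_at_t2 = bottleneck ^ b1   # t2 also hears b1 on the side edge s1 -> t2
    return decoded_at_t1, decoded_at_t2

# Both sinks decode their own bit for every input pair.
assert all(butterfly_transmit(b1, b2) == (b1, b2)
           for b1 in (0, 1) for b2 in (0, 1))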

Page 8:

Moral of Butterfly

Transportation Network Capacity ≠ Information Network Capacity

Page 9:

Information Theory: deep analysis of simple channels (noise, interference, etc.); little understanding of network structures.
Combinatorial Optimization: deep understanding of transportation problems on complex structures; does not address information flow.
Network Coding: combine ideas from both fields.
Goal: understanding network capacity.

Page 10:

Definition: Instance
Graph G (directed or undirected); capacity c_e on each edge e; k commodities, each with a source s_i, a set of sinks T_i, and a demand d_i.
Typically: all capacities c_e = 1, all demands d_i = 1.
[Figure: butterfly instance with sources s1, s2 and sinks t2, t1.]
Technicality: always assume G is directed; each undirected edge is replaced with a directed gadget (shown in the figure).

Page 11:

Definition: Solution
An alphabet Σ(e) for the messages on each edge e, and a function f_e for each edge, such that:
Causality: edge (u,v) sends only information previously received at u.
Correctness: each sink t_i can decode the data from source s_i.
[Figure: the butterfly carrying b1, b2, and b1⊕b2, from which the sinks decode.]

Page 12:

Multicast

Page 13:

Multicast
Graph is a DAG; 1 source, k sinks. The source has r messages in an alphabet Σ; each sink wants all messages.
[Figure: source holding messages m1, m2, …, mr; sinks below.]
Thm [ACLY00]: A network coding solution exists iff the connectivity from the source to each sink is at least r.

Page 14:

Multicast Example
[Figure: source s with messages m1, m2; sinks t1, t2.]

Page 15:

Linear Network Codes
Treat the alphabet as a finite field; each node outputs linear combinations of its inputs.
Thm [LYC03]: Linear codes are sufficient for multicast.
[Figure: a node receiving A and B and sending A+B on both outgoing edges.]

Page 16:

Multicast Code Construction
Thm [HKMK03]: Random linear codes work (over a large enough field).
Thm [JS…03]: Deterministic algorithm to construct codes.
Thm [HKM05]: Deterministic algorithm to construct codes (general algebraic approach).

Page 17:

Random Coding Solution
Randomly choose the coding coefficients. Each sink then receives linear combinations of the source messages. If the connectivity is at least r, the linear combinations have full rank, so the sink can decode.
Without coding, the problem is Steiner Tree Packing (hard!).
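
A small Python sketch of the random-coding recipe (my illustration on the butterfly multicast, not the general construction from the theorems above): coefficients are drawn from a large prime field, each sink observes two linear combinations of the two messages, and decoding inverts a 2×2 coefficient matrix, which is nonsingular with high probability over the random choice.

import random

P = 2**31 - 1                              # a large prime; work in GF(P)

def encode_butterfly(m1, m2):
    """Random linear code on the butterfly: the bottleneck edge carries a
    random combination a*m1 + b*m2; each sink also sees one message
    directly on its side edge."""
    a, b = random.randrange(P), random.randrange(P)
    mixed = (a * m1 + b * m2) % P
    obs_t1 = [(0, 1, m2), (a, b, mixed)]   # rows: (coeff of m1, coeff of m2, value)
    obs_t2 = [(1, 0, m1), (a, b, mixed)]
    return obs_t1, obs_t2

def decode(obs):
    """Invert the 2x2 coefficient matrix over GF(P); it is nonsingular
    with probability 1 - O(1/P)."""
    (a11, a12, y1), (a21, a22, y2) = obs
    det = (a11 * a22 - a12 * a21) % P
    inv = pow(det, P - 2, P)               # Fermat inverse
    m1 = ((a22 * y1 - a12 * y2) * inv) % P
    m2 = ((a11 * y2 - a21 * y1) * inv) % P
    return m1, m2

msgs = (1234, 56789)
for obs in encode_butterfly(*msgs):
    assert decode(obs) == msgs             # both sinks recover both messages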

Page 18:

Our Algorithm
Derandomization of the [HKMK] algorithm. Technique: max-rank completion of mixed matrices.
Mixed matrix: a matrix containing both numbers and variables.
Completion: a choice of values for the variables that maximizes the rank.
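
To illustrate the terminology, here is a toy brute-force search (mine; it is not the deterministic completion algorithm of the talk): entries given as strings are treated as variables, every assignment over a small field is tried, and the one maximizing the rank is reported.

from itertools import product

P = 5  # work over the small field GF(5)

def rank_mod_p(rows, p=P):
    """Rank of an integer matrix over GF(p), by Gaussian elimination."""
    m = [[x % p for x in row] for row in rows]
    n_rows, n_cols = len(m), len(m[0])
    rank = 0
    for col in range(n_cols):
        pivot = next((r for r in range(rank, n_rows) if m[r][col]), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        inv = pow(m[rank][col], p - 2, p)          # inverse of the pivot
        m[rank] = [(x * inv) % p for x in m[rank]]
        for r in range(n_rows):
            if r != rank and m[r][col]:
                f = m[r][col]
                m[r] = [(a - f * b) % p for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# A mixed matrix: plain integers are fixed entries, strings are variables.
mixed = [[1, 'x', 0],
         ['y', 1, 1],
         [1, 0, 'z']]
variables = sorted({e for row in mixed for e in row if isinstance(e, str)})

best_rank, best = 0, None
for values in product(range(P), repeat=len(variables)):
    assign = dict(zip(variables, values))
    numeric = [[assign.get(e, e) for e in row] for row in mixed]
    r = rank_mod_p(numeric)
    if r > best_rank:
        best_rank, best = r, assign

print(best_rank, best)    # a max-rank completion (rank 3 is achievable here)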

Page 19:

k-Pairs Problem (aka “Multiple Unicast Sessions”)

Page 20:

k-pairs problem
Network coding when each commodity has one sink; analogous to multicommodity flow.
Goal: compute the maximum concurrent rate. This is an open question.
[Figure: butterfly instance with sources s1, s2 and sinks t2, t1.]

Page 21:

Rate
Each edge has its own alphabet Σ(e) of messages; edge e carries log|Σ(e)| bits, which must respect its capacity. Source i has alphabet Σ(s_i).
Rate = min over commodities i of log|Σ(s_i)|.
NCR = sup { rate of coding solutions }.
Observation: If there is a fractional flow with rational coefficients achieving rate r, there is a network coding solution achieving rate r.

Page 22:

Network coding rate can be much larger than flow rate!
Butterfly graph: network coding rate (NCR) = 1, flow rate = ½.
Thm [HKL’04, LL’04]: there exist graphs G(V,E) where NCR = Ω( flow rate ∙ |V| ).
Thm [HKL’05]: there exist graphs G(V,E) where NCR = Ω( flow rate ∙ |E| ).
Directed k-pairs. [Figure: butterfly with sources s1, s2 and sinks t2, t1.]

Page 23:

NCR / Flow Gap
G(1): [Figure: the butterfly with sources s1, s2 and sinks t1, t2.]
Equivalent to: [Figure: the same terminals connected directly, with edge capacity 1 on the network coding side and edge capacity ½ on the flow side.]
Network Coding: NCR = 1. Flow: flow rate = ½.

Page 24:

NCR / Flow Gap
G(2): start with two copies of G(1). [Figure: sources s1–s4 on top, sinks t1–t4 on the bottom.]

Page 25:

NCR / Flow Gap
G(2): replace the middle edges with a copy of G(1). [Figure: sources s1–s4 and sinks t1–t4.]

Page 26:

NCR / Flow Gap
G(2): [Figure: sources s1–s4 and sinks t1–t4, with a copy of G(1) in the middle.]
NCR = 1, Flow rate = ¼.

Page 27:

NCR / Flow Gap
G(n): built around a copy of G(n-1).
# commodities = 2^n, |V| = O(2^n), |E| = O(2^n); NCR = 1, Flow rate = 2^(-n).
[Figure: sources s1, s2, …, s_{2^n} on top and sinks t1, t2, …, t_{2^n} on the bottom, with G(n-1) in the middle.]
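
The numbers on the last few slides can be summarized in one display (my compilation; the recurrence records that each level of the construction halves the achievable flow rate while coding keeps rate 1):

\begin{align*}
\mathrm{flow}\bigl(G^{(1)}\bigr) &= \tfrac12, &
\mathrm{flow}\bigl(G^{(n)}\bigr) &= \tfrac12\,\mathrm{flow}\bigl(G^{(n-1)}\bigr) = 2^{-n},\\
\mathrm{NCR}\bigl(G^{(n)}\bigr) &= 1, &
\frac{\mathrm{NCR}\bigl(G^{(n)}\bigr)}{\mathrm{flow}\bigl(G^{(n)}\bigr)} &= 2^{\,n} = \Omega\bigl(|E|\bigr).
\end{align*}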

Page 28:

Optimality

The graph G(n) proves: Thm [HKL’05]: there exist graphs G(V,E) where NCR = Ω( flow rate ∙ |E| ).
G(n) is optimal: Thm [HKL’05]: for every graph G(V,E), NCR / flow rate = O( min {|V|, |E|, k} ).

Page 29:

Network flow vs. information flow
Multicommodity Flow:
- Efficient algorithms for computing maximum concurrent (fractional) flow.
- Connected with metric embeddings via LP duality.
- Approximate max-flow min-cut theorems.
Network Coding:
- Computing the max concurrent network coding rate may be anything from undecidable to decidable in poly-time.
- No adequate duality theory.
- No cut-based parameter is known to give a sublinear approximation in digraphs.
- No known undirected instance where network coding rate ≠ max flow! (The undirected k-pairs conjecture.)

Page 30:

Why not obviously decidable?
How large should the alphabet size be? Thm [LL05]: There exist networks where a max-rate solution requires alphabet size 2^n.
Moreover, rate does not increase monotonically with alphabet size! There is no such thing as a “large enough” alphabet.

Page 31:

Approximate max-flow / min-cut?

The value of the sparsest cut is:
- an O(log n)-approximation to max-flow in undirected graphs [AR’98, LLR’95, LR’99];
- an O(√n)-approximation to max-flow in directed graphs [CKR’01, G’03, HR’05];
- not even a valid upper bound on the network coding rate in directed graphs!
[Figure: the butterfly with sources s1, s2, sinks t2, t1, and bottleneck edge e.]
{e} has capacity 1 and separates 2 commodities, i.e. its sparsity is ½. Yet the network coding rate is 1.

Page 32:

Approximate max-flow / min-cut?
The value of the sparsest cut induced by a vertex partition is a valid upper bound, but it can exceed the network coding rate by a factor of Ω(n).
We next present a cut parameter which may be a better approximation…
[Figure: commodities (s_i, t_i) and (s_j, t_j) on either side of a vertex partition.]

Page 33:

Informational Dominance
Definition: A dominates e if, for every network coding solution, the messages sent on the edges of A uniquely determine the message sent on e.
Given A and e, how hard is it to determine whether A dominates e? Is it even decidable?
Theorem [HKL’05]: There is a combinatorial characterization of informational dominance. Also, there is an algorithm to compute whether A dominates e in time O(k²m).

Page 34:

Informational Dominance
Def: A dominates B if the information in A determines the information in B in every network coding solution.
[Figure: the butterfly with sources s1, s2 and sinks t2, t1; here A does not dominate B.]

Page 35:

Informational Dominance
Def: A dominates B if the information in A determines the information in B in every network coding solution.
[Figure: the butterfly with sources s1, s2 and sinks t2, t1; here A dominates B.]
Sufficient condition: if no path from any source reaches B after removing A, then A dominates B. (Not a necessary condition.)

Page 36:

Informational Dominance Example
[Figure: an instance with sources s1, s2 and sinks t1, t2.]
“Obviously” flow rate = NCR = 1. How to prove it? Markovicity? No two edges disconnect t1 and t2 from both sources!

Page 37:

Informational Dominance Example
[Figure: the same instance with a cut A around the sinks.]
Our characterization implies that A dominates {t1, t2}, hence H(A) ≥ H(t1, t2).

Page 38:

Informational Meagerness
Def: Edge set A informationally isolates commodity set P if A ∪ P dominates P.
iM(G) = min over pairs (A, P), with P informationally isolated by A, of (capacity of the edges in A) / (demand of the commodities in P).
Claim: network coding rate ≤ iM(G).
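
The same definition written as a formula (my transcription of the slide's fraction):

\begin{align*}
iM(G) \;=\; \min_{\substack{A \subseteq E,\; P \subseteq \{1,\dots,k\} \\ A \text{ informationally isolates } P}}
\frac{\sum_{e \in A} c_e}{\sum_{i \in P} d_i},
\qquad
\text{network coding rate} \;\le\; iM(G).
\end{align*}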

Page 39:

Approximate max-flow / min-cut?
Informational meagerness is no better than an Ω(log n)-approximation to the network coding rate, due to a family of instances called the iterated split butterfly.

Page 40:

Approximate max-flow / min-cut?
Informational meagerness is no better than an Ω(log n)-approximation to the network coding rate, due to a family of instances called the iterated split butterfly.
On the other hand, we don’t even know whether it is an o(n)-approximation in general.
And we don’t know whether there is a polynomial-time algorithm to compute an o(n)-approximation to the network coding rate in directed graphs.

Page 41:

Sparsity Summary
Directed graphs: Flow Rate ≤ Sparsity < NCR ≤ iM(G), the strict inequality holding in some graphs.
Undirected graphs: Flow Rate ≤ NCR ≤ Sparsity (an easy consequence of informational dominance); the gap can be Ω(log n) when G is an expander.

Page 42:

Undirected k-Pairs Conjecture

Flow Rate ≤ NCR ≤ Sparsity.
Is Flow Rate = NCR? That is the undirected k-pairs conjecture.
Is NCR < Sparsity for some instance? Unknown until this work.

Page 43:

The Okamura-Seymour Graph

[Figure: the Okamura-Seymour instance with terminals s1, t1, s2, t2, s3, t3, s4, t4, and an example cut.]
Every edge cut has enough capacity to carry the combined demand of all commodities separated by the cut.

Page 44:

Okamura-Seymour Max-Flow

[Figure: the Okamura-Seymour instance.]
Flow Rate = 3/4.
Each s_i is 2 hops from t_i. At flow rate r, each commodity consumes 2r units of bandwidth in a graph with only 6 units of capacity.
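
The flow-rate bound is a one-line calculation from the figures on this slide (4 commodities, each s_i at distance 2 from t_i, and 6 units of capacity in total):

\begin{align*}
4 \cdot 2r \;\le\; 6 \quad\Longrightarrow\quad r \;\le\; \tfrac{3}{4}.
\end{align*}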

Page 45:

The trouble with information flow…
If an edge combines messages from multiple sources, which commodities get charged for “consuming bandwidth”?
We present a way around this obstacle and bound NCR by 3/4.
[Figure: the Okamura-Seymour instance; at flow rate r, each commodity consumes at least 2r units of bandwidth in a graph with only 6 units of capacity.]

Page 46:

Okamura-Seymour Proof
Thm [AHJKL’05]: flow rate = NCR = 3/4.
We will prove:
Thm [HKL’05]: NCR ≤ 6/7 < Sparsity. The proof uses properties of entropy.
Monotonicity: A ⊆ B ⇒ H(A) ≤ H(B). Submodularity: H(A) + H(B) ≥ H(A∪B) + H(A∩B).
Lemma (Cut Bound): For a cut A ⊆ E, H(A) ≥ H(A, sources separated by A).

Page 47:

[Figure: the Okamura-Seymour instance with cut A.]
H(A) ≥ H(A, s1, s2, s4) (Cut Bound)

Page 48:

[Figure: the Okamura-Seymour instance with cut B.]
H(B) ≥ H(B, s1, s2, s4) (Cut Bound)

Page 49:

Add the inequalities: H(A) + H(B) ≥ H(A, s1, s2, s4) + H(B, s1, s2, s4).
Apply submodularity: H(A) + H(B) ≥ H(A∪B, s1, s2, s4) + H(s1, s2, s4).
Note: A∪B separates s3, so by the Cut Bound H(A∪B, s1, s2, s4) ≥ H(s1, s2, s3, s4).
Conclude: H(A) + H(B) ≥ H(s1, s2, s3, s4) + H(s1, s2, s4): 6 edges carry the rate of 7 sources, so rate ≤ 6/7.
[Figure: cuts A and B in the Okamura-Seymour instance.]
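
For reference, the whole chain compiled into one display (my compilation of the three steps above; the cuts A and B together contain 6 unit-capacity edges, and the 4 sources are independent with entropy at least the rate):

\begin{align*}
H(A) + H(B) &\ge H(A, s_1, s_2, s_4) + H(B, s_1, s_2, s_4) && \text{(Cut Bound, twice)}\\
            &\ge H(A \cup B, s_1, s_2, s_4) + H(s_1, s_2, s_4) && \text{(submodularity; } A \cap B = \emptyset\text{)}\\
            &\ge H(s_1, s_2, s_3, s_4) + H(s_1, s_2, s_4) && \text{(Cut Bound for } A \cup B\text{)}\\
            &\ge 7 \cdot \mathrm{rate}, && \text{(independent sources, } H(s_i) \ge \mathrm{rate}\text{)}\\
H(A) + H(B) &\le 6, && \text{(six unit-capacity edges)}\\
\text{so } \mathrm{rate} &\le \tfrac{6}{7}.
\end{align*}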

Page 50:

Rate ¾ for Okamura-Seymour
[Figure: first step of a pictorial entropy argument on the instance with terminals s1–s4 and t1–t4.]

Page 51:

Rate ¾ for Okamura-Seymour
[Figure: the pictorial entropy argument, continued.]

Page 52:

Rate ¾ for Okamura-Seymour
[Figure: the pictorial entropy argument, continued.]

Page 53:

Rate ¾ for Okamura-Seymour
[Figure: the pictorial entropy argument, continued.]

Page 54:

Rate ¾ for Okamura-Seymour
[Figure: the pictorial entropy argument, continued.]

Page 55:

Rate ¾ for Okamura-Seymour
[Figure: final step of the pictorial entropy argument.]
Combining the steps: 3 H(source) + 6 H(undirected edge) ≥ 11 H(source), hence 6 H(undirected edge) ≥ 8 H(source), hence ¾ ≥ RATE.
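
Reading the slide's shorthand with H(source) ≥ rate for each unit-demand commodity and H(undirected edge) ≤ 1 for each unit-capacity edge (my interpretation), the final arithmetic is:

\begin{align*}
3\,H(\mathrm{src}) + 6\,H(\mathrm{edge}) \;\ge\; 11\,H(\mathrm{src})
\;\Longrightarrow\;
8\cdot \mathrm{rate} \;\le\; 8\,H(\mathrm{src}) \;\le\; 6\,H(\mathrm{edge}) \;\le\; 6
\;\Longrightarrow\;
\mathrm{rate} \;\le\; \tfrac{3}{4}.
\end{align*}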

Page 56:

Special Bipartite Graphs

[Figure: the bipartite Okamura-Seymour instance.]
This proof generalizes to show that max-flow = NCR for every instance which is:
- Bipartite.
- Every source is 2 hops away from its sink.
- The dual of the flow LP is optimized by assigning length 1 to all edges.

Page 57:

The k-pairs conjecture and I/O complexity

In the I/O complexity model [AV’88], one has:
- A large, slow external memory consisting of pages, each containing p records.
- A fast internal memory that holds O(1) pages (for concreteness, say 2).
- Basic I/O operation: read in two pages from external memory, write out one page.

Page 58:

I/O Complexity of Matrix Transposition

Matrix transposition: Given a p×p matrix of records in row-major order, write it out in column-major order.

Obvious algorithm requires O(p²) ops.

A better algorithm uses O(p log p) ops.
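
Here is a runnable Python simulation of one such O(p log p) algorithm (my own sketch, not necessarily the algorithm the slide has in mind): in round r, pages whose indices differ in bit r exchange the records whose destination page disagrees with the current page in that bit, so after log2(p) rounds page j holds exactly column j.

import math

def io_transpose(matrix):
    """Simulate transposition in the two-page I/O model: page i starts as
    row i of the matrix; after log2(p) exchange rounds, page j holds
    column j. Only whole pages are read and written."""
    p = len(matrix)
    assert p > 0 and p & (p - 1) == 0, "sketch assumes p is a power of two"
    # A record remembers (source row, destination page = its column, value).
    pages = [[(i, j, matrix[i][j]) for j in range(p)] for i in range(p)]
    io_ops = 0
    for r in range(int(math.log2(p))):
        bit = 1 << r
        for a in range(p):
            b = a ^ bit
            if a > b:
                continue
            # Read pages a and b, repartition their records by bit r of the
            # destination page, write both pages back out.
            merged = pages[a] + pages[b]
            pages[a] = [rec for rec in merged if (rec[1] & bit) == (a & bit)]
            pages[b] = [rec for rec in merged if (rec[1] & bit) == (b & bit)]
            io_ops += 2                      # two page writes per pair
    # Order the records inside each page by source row (done in fast memory).
    return [[val for _, _, val in sorted(page)] for page in pages], io_ops

M = [[8 * i + j for j in range(8)] for i in range(8)]
T, ops = io_transpose(M)
assert T == [list(col) for col in zip(*M)]   # 24 page writes for p = 8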

Page 59:

I/O Complexity of Matrix Transposition

[Figure: the same statement as on the previous page, with input pages drawn as sources s1, s2.]

Page 60:

I/O Complexity of Matrix Transposition

[Figure: input pages drawn as sources s1, s2, s3, s4.]

Page 61:

I/O Complexity of Matrix Transposition

[Figure: sources s1–s4 and output pages t1, t3 added.]

Page 62:

I/O Complexity of Matrix Transposition

[Figure: sources s1–s4 and sinks t1, t2, t3, t4; the algorithm’s read/write pattern drawn as a network.]

Page 63:

I/O Complexity of Matrix Transposition

Theorem: (Floyd ’72, AV’88) If a matrix transposition algorithm performs only read and write operations (no bitwise operations on records) then it must perform Ω(p log p) I/O operations.

[Figure: the transposition diagram with sources s1–s4 and sinks t1–t4.]

Page 64:

I/O Complexity of Matrix Transposition

Proof: Let N_ij denote the number of ops in which record (i,j) is written. For all j, Σ_i N_ij ≥ p log p. Hence Σ_ij N_ij ≥ p² log p. Each I/O op writes only p records, so the algorithm performs at least p log p ops. QED.
[Figure: the transposition diagram with sources s1–s4 and sinks t1–t4.]
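
The counting, written out (my transcription of the three steps in the proof):

\begin{align*}
\sum_i N_{ij} \;\ge\; p \log p \ \text{ for every } j
\;\Longrightarrow\;
\sum_{i,j} N_{ij} \;\ge\; p^2 \log p
\;\Longrightarrow\;
\#\mathrm{ops} \;\ge\; \frac{p^{2} \log p}{p} \;=\; p \log p .
\end{align*}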

Page 65:

The k-pairs conjecture and I/O complexity

Definition: An oblivious algorithm is one whose pattern of read/write operations does not depend on the input.

Theorem: If there is an oblivious algorithm for matrix transposition using o(p log p) I/O ops, the undirected k-pairs conjecture is false.

[Figure: the transposition diagram with sources s1–s4 and sinks t1–t4.]

Page 66:

The k-pairs conjecture and I/O complexity

Proof: Represent the algorithm with a diagram as before. Assume WLOG that each node has only two outgoing edges.
[Figure: the diagram, with one I/O operation shown reading pages p1, p2 and writing page q.]

Page 67:

The k-pairs conjecture and I/O complexity

Proof (continued): Make all edges undirected, with capacity p. Create a commodity for each matrix entry.
[Figure: the diagram as before.]

Page 68:

The k-pairs conjecture and I/O complexity

Proof (continued): The algorithm itself is a network code of rate 1. Assuming the k-pairs conjecture, there is a flow of rate 1, so Σ_{i,j} d(s_i, t_j) ≤ p |E(G)|.
Arguing as before, the left-hand side is Ω(p² log p). Hence |E(G)| = Ω(p log p).
[Figure: the diagram as before.]
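
The reduction's counting on one line (my summary; the last implication uses that each I/O operation contributes O(1) edges to the diagram, so a lower bound on |E(G)| is a lower bound on the number of operations):

\begin{align*}
\Omega(p^{2}\log p) \;=\; \sum_{i,j} d(s_i, t_j) \;\le\; p\,|E(G)|
\;\Longrightarrow\;
|E(G)| \;=\; \Omega(p\log p)
\;\Longrightarrow\;
\#\text{I/O ops} \;=\; \Omega(p\log p).
\end{align*}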

Page 69:

Other consequences for complexity

The undirected k-pairs conjecture implies:
- An Ω(p log p) lower bound for matrix transposition in the cell-probe model. [Same proof.]
- An Ω(p² log p) lower bound for the running time of oblivious matrix transposition algorithms on a multi-tape Turing machine. [The I/O model can emulate multi-tape Turing machines with a factor-p speedup.]

Page 70:

Open Problems
Computing the network coding rate in DAGs: Is it recursively decidable? How do you compute an o(n)-factor approximation?
Undirected k-pairs conjecture: Does flow rate = NCR? At least prove an Ω(log n) gap between sparsest cut and network coding rate for some graphs.

Page 71:

Summary
Information ≠ Transportation.
For multicast, NCR = min cut; algorithms to find a solution.
k-pairs: Directed: NCR >> flow rate. Undirected: flow rate = NCR in the O-S graph.
Informational dominance.