Page 1

Linear network code for erasure broadcast channel with feedback

Presented by Kenneth Shum
Joint work with Linyu Huang, Ho Yuet Kwan and Albert Sung

1 Mar 2014

Page 2

Erasure broadcast channel

[Figure: the source node holds data packets P1, P2, …, PN and broadcasts them to users 1, 2, …, K.]

We want to send all source data packets to each user. Each transmitted packet is erased with a certain probability.

Page 3

Erasure broadcast channel with feedback

[Figure: the source node broadcasts to users 1, 2, …, K.]

Users can send acknowledgements back to the source node.

Page 4

Linear Network Code

• The source node broadcasts encoded packets.
• A packet is considered as a vector over a finite field F.
• An encoded packet is obtained by taking a linear combination of the N source packets, with coefficients drawn from F (a minimal sketch is given below).
  – The vector formed by the N coefficients is called the encoding vector of the encoded packet.

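As an aside (not from the slides), the following minimal Python sketch illustrates this encoding step over the binary field GF(2), where a linear combination reduces to XOR-ing the selected source packets; the function name and the example packets are hypothetical.

```python
# Minimal sketch of linear network coding over GF(2) (illustration only).
# Each packet is a list of bits; an encoding vector has one GF(2)
# coefficient per source packet.

def encode(source_packets, encoding_vector):
    """Return the GF(2) linear combination of the source packets
    selected by the encoding vector (XOR of the chosen packets)."""
    length = len(source_packets[0])
    coded = [0] * length
    for coeff, packet in zip(encoding_vector, source_packets):
        if coeff:  # coefficient 1 in GF(2): include this packet
            coded = [c ^ b for c, b in zip(coded, packet)]
    return coded

# Example: N = 4 source packets of 8 bits each.
P = [[1, 0, 1, 1, 0, 0, 1, 0],
     [0, 1, 1, 0, 1, 0, 0, 1],
     [1, 1, 0, 0, 0, 1, 1, 0],
     [0, 0, 0, 1, 1, 1, 0, 1]]
v = [1, 0, 1, 0]          # encoding vector: P1 + P3 over GF(2)
print(encode(P, v))       # the encoded packet; v travels in the packet header
```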

Page 5

Erasure broadcast channel

[Figure: the source node holds data packets P1, P2, …, PN and broadcasts linear combinations of P1, P2, …, PN to users 1, 2, …, K.]

The packet header contains the encoding vector of the encoded packet.

Page 6

The received packets are cached

[Figure: user 1 stores encoding vectors v1, v2, …, user 2 stores v'1, v'2, …, and so on up to user K.]

The source packets can be interpreted as the standard basis e1, e2, …, eN of the vector space F^N.

Each user stores the received packets and the corresponding encoding vectors.

Page 7

Synopsis

• Objectives:
  – Minimize the completion time of each user.
  – Minimize encoding and decoding complexity.
• Decoding complexity can be reduced if the encoding vectors are sparse.
  – Apply some version of Gaussian elimination which exploits sparsity.
• The problem of generating sparse encoding vectors is related to some NP-complete problems.
• Heuristic algorithms and comparison.


Page 8

Complexity Issues in Network Coding

• Deciding whether there exists a linear network code with a prescribed alphabet size is NP-hard.
  – Lehman and Lehman, Complexity classification of network information flow problems, SODA, 2004.
• Minimizing the number of encoding nodes is NP-hard.
  – Langberg, Sprintson and Bruck, The encoding complexity of network coding, Trans. IT, 2006.
  – Langberg and Sprintson, On the hardness of approximating the network coding capacity, Trans. IT, 2011.
• For the noiseless broadcast channel with a binary alphabet, minimizing the number of packet transmissions in the index coding problem is NP-hard.
  – El Rouayheb, Chaudhry and Sprintson, On the minimum number of transmissions in single-hop wireless coding networks, ITW, 2007.


Page 9

Innovative Packet

• An encoded packet is said to be innovative to a user if the corresponding encoding vector is linearly independent of the encoding vectors received previously (a small test of this is sketched below).
• If an encoded packet is innovative to all users, then we say that it is innovative.
• It is known that innovative packets always exist if the finite field size is larger than or equal to the number of users.
  – Keller, Drinea and Fragouli, Online broadcasting with network coding, NetCod, 2008.
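To make the definition concrete, here is a small illustrative Python sketch (my own, not part of the slides) that tests innovativeness over GF(2) by checking whether appending the encoding vector increases the rank of every user's matrix of received encoding vectors; the helper names are hypothetical.

```python
# Sketch: test whether an encoding vector is innovative over GF(2).
# A vector is innovative to user i if it is NOT in the row space of C_i,
# i.e. appending it to C_i increases the rank.

def rank_gf2(rows):
    """Rank of a list of GF(2) row vectors, by Gaussian elimination."""
    rows = [list(r) for r in rows]
    rank = 0
    ncols = len(rows[0]) if rows else 0
    for col in range(ncols):
        pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

def is_innovative(v, user_matrices):
    """True iff v lies outside the row space of every user's matrix."""
    return all(rank_gf2(C + [v]) > rank_gf2(C) for C in user_matrices)

# Example with N = 3 source packets: user 1 has received e1, user 2 has e2.
C1 = [[1, 0, 0]]
C2 = [[0, 1, 0]]
print(is_innovative([1, 1, 0], [C1, C2]))  # True: outside both row spaces
print(is_innovative([1, 0, 0], [C1, C2]))  # False: user 1 already has e1
```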

Page 10

Notation: Encoding matrix

[Figure: user i is associated with an encoding matrix C_i, for i = 1, 2, …, K.]

The rows of matrix C_i are the encoding vectors of the packets received by user i.

The source packets are interpreted as the standard basis e1, e2, …, eN of the vector space F^N.

Page 11

The set of all innovative encoding vectors

Given C_i for all i's, the set of all innovative encoding vectors is defined as

V = { v in F^N : v is not in the row space of C_i, for i = 1, 2, …, K }.

Page 12

Hamming weight and sparsity

Given an encoding vector v, the support of v is defined as

supp(v) = { i : v_i ≠ 0 }.

The Hamming weight of v is defined as the cardinality of supp(v). A vector v with Hamming weight w is said to be w-sparse.

Page 13

SPARSITY Problem

Considering both the sparsity and the innovativeness of an encoding vector, we formulate the following problem:

Problem: SPARSITY
Instance: K matrices C_1, …, C_K over GF(q), each having N columns; a positive integer n.
Question: Is there a vector v in V with Hamming weight less than or equal to n?

Page 14

Example: Let q = 2, K = 2, N = 4 and n = 2, and consider two matrices C_1 and C_2 over GF(2) (not reproduced in the transcript). There are three vectors in V with Hamming weight less than or equal to n = 2.


Page 15

Theorem. SPARSITY is NP-complete.

Now define the optimization version of SPARSITY as follows:
Question: Find a vector v in V with minimum Hamming weight.

It can be shown that the optimization version of SPARSITY is NP-hard.

However, for fixed K and q, it can be solved by brute-force methods (a sketch is given below).

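As an illustration of the brute-force idea (my sketch, not the algorithm on the slides), the following Python snippet enumerates candidate encoding vectors over GF(2) in order of increasing Hamming weight and returns the first one that is innovative to every user. The search is exponential in N, and all names are hypothetical.

```python
# Brute-force sketch: find a minimum-weight innovative vector over GF(2).
from itertools import combinations

def rank_gf2(rows):
    """Rank of GF(2) row vectors via Gaussian elimination."""
    rows = [list(r) for r in rows]
    rank = 0
    ncols = len(rows[0]) if rows else 0
    for col in range(ncols):
        piv = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

def sparsest_innovative(user_matrices, N):
    """Try supports of size 1, 2, ... and return the first innovative vector."""
    for w in range(1, N + 1):
        for support in combinations(range(N), w):
            v = [1 if i in support else 0 for i in range(N)]
            if all(rank_gf2(C + [v]) > rank_gf2(C) for C in user_matrices):
                return v          # innovative and of minimum weight w
    return None                   # no innovative vector exists

# Example: N = 4, two users.
C1 = [[1, 0, 0, 0], [0, 1, 0, 0]]        # user 1 has e1, e2
C2 = [[0, 0, 1, 0], [0, 0, 0, 1]]        # user 2 has e3, e4
print(sparsest_innovative([C1, C2], 4))  # [1, 0, 1, 0]: minimum weight 2
```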

Page 16

Orthogonal complement

Let V_i be the row space of C_i, and denote the orthogonal complement of V_i by V_i^⊥.

Let Z_i be an (N − rank(C_i)) × N matrix whose rows form a basis of V_i^⊥. Z_i can be obtained from the Reduced Row Echelon Form (RREF) of C_i.

Page 17

To check whether an encoding vector is innovative, we use the following fact.

Theorem. Given Z_1, …, Z_K, an encoding vector v belongs to V if and only if v Z_i^T ≠ 0 for all i's.
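A small illustrative Python sketch (restricted to GF(2), with my own routine names) of how a basis Z of the orthogonal complement can be obtained from the RREF of C, and how the test v Z^T ≠ 0 can then be applied:

```python
# Sketch over GF(2): basis of the orthogonal complement of rowspace(C),
# and the innovativeness test v * Z^T != 0.

def null_space_gf2(C, N):
    """Rows of the returned matrix Z form a basis of rowspace(C)^perp."""
    # Bring C to reduced row echelon form over GF(2).
    R = [list(r) for r in C]
    pivots, rank = [], 0
    for col in range(N):
        piv = next((i for i in range(rank, len(R)) if R[i][col]), None)
        if piv is None:
            continue
        R[rank], R[piv] = R[piv], R[rank]
        for i in range(len(R)):
            if i != rank and R[i][col]:
                R[i] = [a ^ b for a, b in zip(R[i], R[rank])]
        pivots.append(col)
        rank += 1
    # One basis vector of the complement per free (non-pivot) column.
    free = [c for c in range(N) if c not in pivots]
    Z = []
    for f in free:
        z = [0] * N
        z[f] = 1
        for r, p in enumerate(pivots):
            z[p] = R[r][f]        # over GF(2), -x = x
        Z.append(z)
    return Z

def innovative_to_user(v, Z):
    """True iff v * Z^T is non-zero, i.e. v is outside rowspace(C)."""
    return any(sum(a & b for a, b in zip(v, z)) % 2 for z in Z)

C1 = [[1, 0, 0, 0], [0, 1, 0, 0]]            # user 1 knows e1, e2
Z1 = null_space_gf2(C1, 4)                   # [[0,0,1,0], [0,0,0,1]]
print(innovative_to_user([1, 0, 1, 0], Z1))  # True
print(innovative_to_user([1, 1, 0, 0], Z1))  # False: already in rowspace(C1)
```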

Page 18

Minimizing the Hamming Weight

Given all C_k's, we obtain the corresponding matrices Z_k by RREF. Let z_{k,i} denote the i-th row of Z_k. Define

b_k = z_{k,1} ∨ z_{k,2} ∨ … ,

where ∨ denotes the logical-OR operator applied component-wise to vectors, with each non-zero component being treated as a “1”.


Page 19

Example: Let q = 3, K = 3, N = 4, and let the orthogonal complements of C_1, C_2, C_3 be given by the row spaces of three matrices Z_1, Z_2, Z_3. The corresponding vectors b_k, for k = 1, 2, 3, are computed as above. (The matrices and vectors shown on the slide are not reproduced in this transcript.)

Page 20

Define B as the K × N matrix whose k-th row is b_k. Note that B is a binary matrix and has no zero rows.

Given a subset S of column indices of B, let B_S be the submatrix of B whose columns are chosen according to S.

Page 21

Lemma 3. Let S be an index set and B_S the corresponding submatrix of B. There exists an encoding vector v in V with support inside S (i.e., v_j = 0 for j ∉ S) if and only if B_S has no zero rows. (A sketch of this check is given below.)

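Here is a small illustrative Python sketch (mine, not the slides') that builds the binary matrix B from given Z_k matrices by component-wise OR and checks the condition of Lemma 3 for a candidate support S; the example matrices are hypothetical.

```python
# Sketch: build B from the Z_k matrices and test the Lemma 3 condition.

def or_of_rows(Z):
    """Component-wise logical OR of the rows of Z (non-zero entries -> 1)."""
    ncols = len(Z[0])
    return [1 if any(row[j] for row in Z) else 0 for j in range(ncols)]

def support_is_feasible(B, S):
    """Lemma 3 check: the submatrix of B restricted to columns S
    has no zero rows, i.e. every user is 'hit' inside S."""
    return all(any(row[j] for j in S) for row in B)

# Example matching K = 3 users and N = 4 packets (hypothetical Z_k's).
Z1 = [[0, 0, 1, 0], [0, 0, 0, 1]]
Z2 = [[1, 0, 0, 0]]
Z3 = [[0, 1, 0, 0]]
B = [or_of_rows(Z) for Z in (Z1, Z2, Z3)]  # [[0,0,1,1],[1,0,0,0],[0,1,0,0]]
print(support_is_feasible(B, {0, 1, 2}))   # True: every row has a 1 in S
print(support_is_feasible(B, {2, 3}))      # False: rows 2 and 3 have no 1 in S
```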

Page 22

Example (cont’d)

[The slide shows the 3 × 4 matrix B, with rows labelled first, second and third user.]

Choose a 3 × w submatrix of B with minimal w, such that the submatrix has no zero rows.

We may choose the first two columns; we can then find an encoding vector with two non-zero components.

Page 23

The NP-completeness of SPARSITY can be shown by reducing HITTING SET to SPARSITY.

Problem: HITTING SET
Instance: A finite set U, a collection T of subsets of U, and an integer m.
Question: Is there a subset W of U with cardinality at most m, such that for each set T_j in T we have W ∩ T_j ≠ ∅?

Page 24

Example (cont’d)

[The slide shows the matrix B again, with rows labelled first, second and third user, together with the corresponding hitting-set instance.]

Choose a 3 × w submatrix of B with minimal w, such that the submatrix has no zero rows.

The minimal hitting sets are: {1,2}, {1,3}, {2,3}, {2,4}, {3,4}.

Page 25

Optimal Hitting method

• Solve the hitting set problem optimally by reducing it to binary integer programming.
  – Minimum sparsity at each iteration is guaranteed.
• After the support of the encoding vector is determined, find the coefficients which make the vector innovative. (A brute-force sketch of the support-selection step follows below.)

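The slides solve this step by binary integer programming; as a rough stand-in, the following Python sketch finds a minimum-size feasible support by exhaustive search over supports of increasing size. It is exact but exponential, and the function name and example are hypothetical.

```python
# Exhaustive-search stand-in for the optimal hitting step (illustration only).
from itertools import combinations

def min_hitting_support(B):
    """Smallest set S of column indices such that every row of the
    binary matrix B has at least one 1 in the columns of S."""
    n_cols = len(B[0])
    for w in range(1, n_cols + 1):
        for S in combinations(range(n_cols), w):
            if all(any(row[j] for j in S) for row in B):
                return set(S)          # minimum-size feasible support
    return None                        # impossible if B has no zero rows

# Example: 3 users, 4 packets (hypothetical B).
B = [[0, 0, 1, 1],
     [1, 0, 0, 0],
     [0, 1, 0, 0]]
print(min_hitting_support(B))          # {0, 1, 2}: a weight-3 support
```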

Page 26: Linear network code for erasure broadcast channel with feedback Presented by Kenneth Shum Joint work with Linyu Huang, Ho Yuet Kwan and Albert Sung 1Mar.

Greedy Hitting method

• Solve the hitting set problem heuristically by a greedy method.
  – Sequentially pick an element which hits the largest number of sets.
  – Minimum sparsity is not guaranteed.
• After the support of the encoding vector is determined, find the coefficients which make the vector innovative. (A greedy sketch follows below.)

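A minimal Python sketch of the greedy variant (my own, under the assumption that ties are broken arbitrarily): repeatedly pick the column of B that hits the largest number of still-uncovered rows.

```python
# Greedy stand-in for the hitting-set step (heuristic, not always minimum).

def greedy_hitting_support(B):
    """Greedily choose columns of the binary matrix B until every row
    has a 1 in at least one chosen column."""
    n_cols = len(B[0])
    uncovered = set(range(len(B)))     # rows (users) not yet hit
    S = set()
    while uncovered:
        # Pick the column that hits the most uncovered rows.
        best = max(range(n_cols),
                   key=lambda j: sum(1 for i in uncovered if B[i][j]))
        newly = {i for i in uncovered if B[i][best]}
        if not newly:                  # B has a zero row: no support exists
            return None
        S.add(best)
        uncovered -= newly
    return S

B = [[0, 0, 1, 1],
     [1, 0, 0, 0],
     [0, 1, 0, 0]]
print(greedy_hitting_support(B))       # {0, 1, 2}; size may exceed the optimum
```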

Page 27

Existing encoding schemes (I)

• Random linear network codes.
  – Encoding:
    • Phase 1: The source node first broadcasts each packet.
    • Phase 2: The source node sends encoded packets with randomly generated coefficients.
  – Decode by Gaussian elimination.
  – No feedback is required.


Page 28

Existing encoding schemes (II)

• Chunked code
  – An extension of random linear network coding.
  – Divide the source packets into chunks; each chunk contains c packets.
  – Apply random linear network coding to each chunk (see the sketch below).
  – The resulting encoding vectors are c-sparse.
  – Feedback is not required.

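To illustrate chunk-wise encoding (a sketch under my own assumptions, over GF(2)), each encoded packet mixes only the packets of one chunk, so its encoding vector has at most c non-zero entries; the function name and example are hypothetical.

```python
# Sketch of chunked coding over GF(2): random coefficients within one chunk.
import random

def encode_from_chunk(source_packets, chunk_index, c):
    """Return (encoding_vector, coded_packet) mixing only packets of the
    chosen chunk, so the encoding vector is at most c-sparse."""
    N = len(source_packets)
    start, end = chunk_index * c, min((chunk_index + 1) * c, N)
    coeffs = [0] * N
    while not any(coeffs[start:end]):          # avoid the all-zero vector
        for j in range(start, end):
            coeffs[j] = random.randint(0, 1)   # random GF(2) coefficient
    coded = [0] * len(source_packets[0])
    for j in range(start, end):
        if coeffs[j]:
            coded = [a ^ b for a, b in zip(coded, source_packets[j])]
    return coeffs, coded

# Example: N = 6 packets of 4 bits each, chunk size c = 3.
P = [[1,0,0,0], [0,1,0,0], [0,0,1,0], [0,0,0,1], [1,1,0,0], [0,0,1,1]]
vec, pkt = encode_from_chunk(P, chunk_index=1, c=3)
print(vec, pkt)   # non-zero coefficients appear only in positions 3..5
```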

Page 29

Existing encoding schemes (III)

• Instantly decodable network code
  – Encoding:
    • Phase 1: The source packets are first broadcast once.
    • Phase 2: Find a subset of users such that, by transmitting one encoded packet, each of them can decode a source packet.
  – Decoding: A user in the target set can decode one packet immediately if the encoded packet is received successfully.
  – Feedback is required.

Page 30

Existing encoding schemes (IV)

• LT code
  – Uses the robust soliton degree distribution in encoding.
  – No feedback is required.


Page 31

Comparison of complexity


Scheme                             Encoding          Decoding
LT code                            O(N)              O(N^2)
Random linear network code         O(N)              O(N^3)
Chunked code                       O(c)              O(c^2 N)
Instantly decodable network code   O(K^3 N^2)        O(min(K,N) N)
Optimal hitting method             O(1.238^(N+K))    O(min(K,N) N^2)
Greedy hitting method              O(K^2 N^2)        O(min(K,N) N^2)

Page 32

Completion time vs. number of users (perfect feedback)

Page 33

Binary alphabet


Page 34

Completion time vs. number of users (lossy feedback)

Page 35

Decoding time vs no. of users


Page 36

Encoding time vs no. of users


Page 37

Hamming weight vs no. of users


Page 38

Conclusion

• We investigate the generation of the sparsest innovative encoding vectors, which is proven to be NP-hard.
• A systematic way to generate the sparsest innovative encoding vectors is given.
• There is a tradeoff between encoding complexity, decoding complexity, and completion time.
