Cooperative Techniques in Networks
Abdellatif Zaidi
Université Paris-Est Marne-la-Vallée, France
abdellatif.zaidi@univ-mlv.fr
Spring School on 5G Communications, Hammamet, March 2014
Abdellatif Zaidi Cooperative Techniques in Networks 1 / 90
Tutorial Goals
Review and discuss some important aspects of cooperative communications in networks
Provide intuition for basic concepts
Connect with recent results
Encourage research activity
Contribute to networking practice
Recurring Themes and Take-Aways
1 Many approaches, benefits, and challenges to utilizing relaying and cooperative communications in networks.
2 Relaying also includes, or should also include, multihop (store-and-forward routing) and network coding.
3 The capacity of relay systems is difficult to analyze, but the theory is surprisingly flexible and diverse.
4 Although generalizations to networks with many nodes are in general not easy, there are schemes which scale appropriately.
5 In networks with interference, relaying can help, not only by adding power/energy spatially, but also by allowing distributed interference cancellation; in general this boosts network capacity.
6 To cope with network interference efficiently, classic relaying schemes as such do not in general suffice, and need to be combined carefully with other appropriate techniques.
Caveats
1 I assume familiarity with: entropy, mutual information, capacity of discrete memoryless and additive white Gaussian noise channels, rate-distortion theory
2 References are not provided in the main slides; only selected references are given at the end. These, along with the references therein, point to many other recent papers on relaying and cooperative communications in networks.
3 Time constraints limit the scope to less than originally planned.
4 I may go fast. Please feel free to stop me whenever you want.
Tutorial Outline
Part 1: Basics on Cooperation
Part 2: Cooperation in Presence of Interference
Part 3: Interaction and Computation
Basics on Cooperation
Part I : Basics on Cooperation
Outline: Part 1
1 Introduction and Models
2 Protocols
Amplify-and-Forward (AF)
Decode-and-Forward (DF)
Compress-and-Forward (CF)
3 Information Rates
Basics on Cooperation Introduction and Models
Wireless Network
A communication network has devices and channels
Network purpose: enable message exchange between nodes
Main features of wireless networks:
- Fading: electromagnetic scattering, absorption, node mobility, humidity
- Broadcasting: nodes overhear wireless transmissions (creates interference)
[Figure: N-node network; node k holds message Wk and input/output pair (Xk, Yk); channel p(y1, …, yN | x1, …, xN).]
Fast and Slow Fading
Space-time models for huv:
- Deterministic: electromagnetic wave propagation equations
- Random: {huv,i} for i = 1, …, n is a realization of an integer-time stochastic process {Huv,i} (the random model admits uncertainty and is simpler)
Marginal distributions of the Huv,i:
- Assume the Huv,i, i = 1, …, n, have the same marginal distribution Huv during a given communication session
- No fading: Huv is a known constant
- Rayleigh fading: Huv is complex, Gaussian, 0-mean, unit variance
Temporal correlation: two extremes
- Fast fading: the huv,i are independent realizations of Huv
- Slow fading: Huv,i = Huv for all i
Discrete Memoryless Network Model
There are M source messages Wm, m = 1, 2, …, M.
Message Wm is estimated as Ŵm(u) at certain nodes u.
Source model:
- Messages are statistically independent
- Sources are not bursty
Device model:
- Node u has one input variable Xu and one output variable Yu
- Causality: Xu,i = fu,i(local messages, Yu^(i−1)), i = 1, …, n
- Cost constraint example: E[f(Xu,1, …, Xu,n)] ≤ Pu
Discrete Memoryless Network Model (Cont.)
Channel model:
- A network clock governs operations: node u transmits Xu,i between clock tick (i−1) and tick i, and receives Yu,i at tick i
- Memoryless: the Yu,i are generated by the Xu,i, all u, via the channel
P_{Y1 Y2 Y3 …|X1 X2 X3 …}(·)
Capacity region: closure of the set of all (R1, R2, …, RM) for which
Pr[ ∪_{m,u} { Ŵm(u) ≠ Wm } ]
can be made close to zero for large n
Node Coding/Broadcasting
Traditional approach:
- Channels treated as point-to-point links
- Data packets traverse paths (sequences of nodes)
Other possibilities:
Broadcasting: nodes overhear wireless transmissions
Node coding: nodes process
- reliable message or packet bits (network coding)
- reliable or unreliable symbols (relaying/cooperation)
These concepts already appear in 3-node networks
Basics on Cooperation Protocols
Building Block
[Figure: 3-node relay channel; Node 1 (source, message W), Node 2 (relay), Node 3 (destination, estimate Ŵ).]
Direct transmission from Node 1 to Node 3
Multihop transmission through Node 2
Amplify-and-Forward (AF)
Decode-and-Forward (DF)
Compress-and-Forward (CF)
Capacity still unknown
Multiaccess problem: from Nodes 1 and 2 to Node 3
Broadcast problem: from Node 1 to Nodes 2 and 3
Feedback problem: output at Node 2 is a form of feedback
Superposition coding, binning, feedback techniques, etc.
Direct Transmission
[Figure: encoder W → X1^n; relay input fixed, X2^n = b^n; channel P_{Y3|X1,X2}(y3|x1, x2); decoder Y3^n → Ŵ.]
Relay does not participate, i.e., X2,i = b (often 0) for all i
A standard random coding argument shows that rates
R < Rdir := max over P_{X1|X2}(·|b) and b of I(X1; Y3 | X2 = b)
are achievable
Rdir is in fact the capacity if the relay channel is reversely degraded, i.e.,
P_{Y2 Y3|X1 X2}(·) = P_{Y3|X1 X2}(·) P_{Y2|Y3}(·)
(Think: Y2 is a noisy version of Y3)
Multihop Transmission
Node 1 transmits to Node 2, and Node 2 transmits to Node 3
Motivated by a cascade of two channels, i.e.,
P_{Y2^n Y3^n | X1^n X2^n}(·) = P_{Y2^n | X1^n}(·) P_{Y3^n | X2^n}(·)
Question: What should Node 2 transmit ?
Non-regenerative / Amplify-and-Forward (AF): Node 2 sets X2,i = a·Y2,i−k for some k ≥ 1
Compress-and-Forward (CF) / Estimate-and-Forward (EF): Node 2 conveys a quantized version of Y2^n to Node 3
Regenerative / Decode-and-Forward (DF): Node 2 decodes W and re-encodes it into a codeword X2^n
Transmissions usually occur in a pipeline of two (or more) blocks (potentially of varying sizes)
Block Pipeline
[Figure: block pipeline; in block b the source sends x1(wb), the relay observes y2[b] and transmits x2[b] = f2^(b)({y2[i]} for i < b), and the destination observes y3[b].]
B blocks, each of length n channel uses. Message W = (w1, w2, …, wB)
Block 1 fills the pipeline; block B empties the pipeline
Blocks 2 through B can blend broadcast and multiaccess
Extremes:
- n = 1, B large: like an intersymbol interference (ISI) channel
- n large, B = 2: no interblock interference, half-duplex
- n, B large: effective rate (B−1)nR/(nB) = ((B−1)/B)·R → R
Memory within and among input blocks is allowed through f2^(i)({y2[j]} for j < i)
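The effective-rate arithmetic above can be checked with a tiny sketch (the function name is mine):

```python
def effective_rate(R, B):
    """End-to-end rate of a B-block pipeline carrying B-1 message blocks.

    Only B-1 of the B length-n blocks carry new information, so the
    effective rate is (B-1)*n*R / (n*B) = ((B-1)/B)*R.
    """
    return (B - 1) / B * R

# B = 2 (half-duplex style) halves the rate; the overhead vanishes as B grows.
print(effective_rate(1.0, 2))    # 0.5
print(effective_rate(1.0, 100))  # 0.99
```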
Basics on Cooperation Protocols : Amplify-and-Forward (AF)
Amplify-and-Forward (AF)
[Figure: AF block pipeline; in each block the relay forwards its previously received block, x2[i] = y2[i−1].]
Choose f2^(i)(·) to be a linear function:
Discrete channels: often x2[i] := f2^(i)({y2[j]} for j < i) = y2[i−1]; requires the output alphabet Y2 to be contained in the input alphabet X2
Continuous channels: often x2[i] := a·y2[i−1], with the gain a chosen subject to a power constraint
Multihop AF, B = 2
[Figure: two-block multihop AF; the source sends x1(w) in Block 1, the relay forwards y2[1] in Block 2, and the destination uses y3[2].]
If Node 3 decodes using only Block 2, R is achievable if
R < Rmaf := max over P_{X1,1|X2,1}(·|b), a, b of (1/2) I(X1,1; Y3,2 | X2,1 = b, X1,2 = a)
where I(X1,1; Y3,2 | X2,1 = b, X1,2 = a) is computed for the effective channel P_{Y3,2|X1,1 X1,2 X2,1}(·|·, a, b) in which X2,2 = Y2,1
Also known as non-regenerative repeating
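For the AWGN case with unit channel gains, this two-hop AF cascade has a well-known effective SNR; a sketch (the function name and the closed form under these assumptions are mine), with the relay gain set by its power constraint, a² = P2/(P1 + N):

```python
import math

def multihop_af_rate(snr12, snr23):
    """Rate of two-block multihop AF over AWGN links with unit gains.

    The relay sends a*y2[1] with a^2 = P2/(P1 + N); the cascade then
    behaves like a single link with SNR snr12*snr23/(snr12 + snr23 + 1).
    The factor 1/2 accounts for the two-block pipeline.
    """
    snr_eff = snr12 * snr23 / (snr12 + snr23 + 1)
    return 0.5 * math.log2(1 + snr_eff)
```

The cascade SNR is below both hop SNRs: amplifying also amplifies the relay's noise.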
Diversity AF, B = 2
[Figure: two-block diversity AF; the destination combines the direct observation y3[1] with the relayed observation y3[2].]
If Node 3 decodes using both Blocks 1 and 2, then R is achievable for
R < Rdaf := max over P_{X1,1|X2,1}(·|b), a, b of (1/2) I(X1,1; Y3,1, Y3,2 | X2,1 = b, X1,2 = a)
Similar to repetition coding, except that X2,2 = Y2,1 is a corrupted version of X1,1
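Under the same unit-gain AWGN assumptions as above, the two observations carry independent noise, so their effective SNRs add under maximum-ratio combining; a sketch (names and closed form are mine, for Gaussian inputs):

```python
import math

def diversity_af_rate(snr13, snr12, snr23):
    """Two-block diversity AF: MRC-combine the direct block-1 observation
    (SNR snr13) with the amplified relay path of block 2, whose effective
    SNR is snr12*snr23/(snr12 + snr23 + 1)."""
    snr_relayed = snr12 * snr23 / (snr12 + snr23 + 1)
    return 0.5 * math.log2(1 + snr13 + snr_relayed)
```

Setting snr13 = 0 recovers the multihop AF rate, so listening to Block 1 can only help.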
Non-Orthogonal AF (NAF), B = 2
[Figure: two-block non-orthogonal AF; the source sends new information x1(w2) in Block 2 while the relay forwards y2[1].]
If Node 1 sends new information in Block 2, and Node 3 decodes using both Blocks 1 and 2, then R is achievable for
R < Rndaf := max (1/2) I(X1,1, X1,2; Y3,1, Y3,2 | X2,1 = b)
with the max over P_{X1,1|X2,1}(·|b) P_{X1,2|X2,1}(·|b) and b
A combination of DAF and direct transmission from Node 1 to Node 3
Intersymbol Interference AF (IAF)
A pipeline with n = 1 and B → ∞ creates an effective intersymbol interference (ISI) channel
Input memory is important in this case (water-filling)
Additional improvements come with bursty AF, mainly at low SNR
AF Summary
Schemes discussed so far are all special cases of Block Pipeline AF.
Generally Rmaf < Rdaf < Rndaf < Riaf, but coding and decoding grow increasingly complex
MAF, DAF, and NDAF with B = 2 are useful for half-duplex systems
IAF with B → ∞ is useful for full-duplex systems
Basics on Cooperation Protocols : Decode-and-Forward (DF)
Decode-and-Forward (DF)
[Figure: DF block pipeline; in block b the source sends x1(wb−1, wb), the relay decodes wb−1 from y2[b−1] and sends x2(wb−1), and the destination observes y3[b].]
f2^(i)(·): in block i, the relay uses y2[i−1] to decode message ŵi−1, and re-encodes it into x2(ŵi−1)
Joint-typicality decoding: look for ŵi−1 s.t. (x1(ŵi−2, ŵi−1), y2[i−1]) is jointly typical given x2(ŵi−2)
Multi-user codebooks designed jointly:
- x1[i] := x1(wi−1, wi) and x2[i] := x2(wi−1) both depend upon wi−1
- x1[i] := x1(wi−1, wi) also depends upon wi
- Joint distribution P_{X2}(·) P_{X1|X2}(·)
Multihop DF, B = 2
[Figure: two-block multihop DF; the source sends x1(w) in Block 1, the relay decodes and sends x2(ŵ) in Block 2.]
Let q be the fractional length of Block 1 (Block 2 has fractional length q̄ := 1 − q)
At the end of Block 1, Node 2 decodes message w reliably if
R < q I(X1; Y2 | X2 = b)
and then ŵ = w with high probability
If Node 3 decodes using only Block 2, R is achievable if
R < q̄ I(X2; Y3 | X1 = a)
Multihop DF (MDF), B = 2
[Figure: two-block multihop DF (as above).]
Thus, a rate R is achievable if
R < Rmdf := max over 0 ≤ q ≤ 1 of min{ q · max over P_{X1|X2}(·|b), b of I(X1; Y2 | X2 = b), q̄ · max over P_{X2|X1}(·|a), a of I(X2; Y3 | X1 = a) }
Routing: time-sharing between direct transmission from Node 1 to Node 2 and direct transmission from Node 2 to Node 3
Also known as regenerative repeating
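The optimal time split has a closed form: equalizing the two hops, q·C12 = (1−q)·C23, gives q* = C23/(C12 + C23) and the harmonic combination below (a sketch; function names are mine):

```python
def multihop_df_rate(c12, c23):
    """R_mdf = max over q of min(q*c12, (1-q)*c23).

    The max is attained when both terms are equal, q* = c23/(c12 + c23),
    giving R = c12*c23/(c12 + c23).
    """
    return c12 * c23 / (c12 + c23)

def multihop_df_rate_grid(c12, c23, steps=100_000):
    # Brute-force check of the closed form over a grid of time splits q.
    return max(min(q / steps * c12, (1 - q / steps) * c23)
               for q in range(steps + 1))
```

Note R_mdf is strictly below the capacity of the weaker hop: time-sharing always pays a multiplexing loss.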
Diversity DF (DDF), B = 2
[Figure: two-block diversity DF; the destination combines y3[1] and y3[2].]
If Node 3 decodes using both Blocks 1 and 2, then R is achievable for
R < Rddf := max min{ q I(X1,1; Y2,1 | X2,1 = b), q I(X1,1; Y3,1 | X2,1 = b) + q̄ I(X2,2; Y3,2 | X1,2 = a) }
with the max over P_{X1,1|X2,1}(·|b) P_{X2,2|X1,2}(·|a), a, b, and 0 ≤ q ≤ 1
Non-Orthogonal DDF (NDDF), B = 2
[Figure: two-block non-orthogonal DDF; the source sends new information x1(w2) in Block 2 while the relay sends x2(ŵ1).]
If Node 1 sends new information in Block 2, and Node 3 decodes using both Blocks 1 and 2, then R is achievable for
R < Rnddf,2 := max min{ q I(X1,1; Y2,1 | X2,1 = b), q I(X1,1; Y3,1 | X2,1 = b) + q̄ I(X1,2, X2,2; Y3,2) }
with the max over P_{X1,1|X2,1}(·|b) P_{X1,2 X2,2}(·), b, and 0 ≤ q ≤ 1
Non-Orthogonal DDF (NDDF), B → ∞
[Figure: DF block pipeline (as above).]
All blocks of length n
Three encoding and decoding algorithms, which differ in complexity and delay requirements:
- Regular encoding, sliding-window decoding: R < I(X2; Y3) + I(X1; Y3 | X2)
- Regular encoding, backward decoding: R < I(X1, X2; Y3)
- Irregular encoding, successive decoding: binning at the encoder (Cover, El Gamal)
Non-Orthogonal DDF (NDDF), B → ∞
[Figure: DF block pipeline (as above).]
All blocks of length n
Rate R is achievable if
R < Rnddf,∞ := max min{ I(X1; Y2 | X2), I(X1, X2; Y3) }
with the max over P_{X1 X2}(·)
Rnddf,∞ is in fact the capacity if the relay channel is physically degraded, i.e.,
P_{Y2 Y3|X1 X2}(·) = P_{Y2|X1 X2}(·) P_{Y3|Y2 X2}(·)
DF Summary
Schemes discussed so far are all special cases of Block Pipeline DF.
Generally Rmdf < Rddf < Rnddf,2 < Rnddf,∞, but coding and decoding grow increasingly complex
MDF, DDF, and NDDF with B = 2 are useful for half-duplex systems
NDDF with B → ∞ is useful for full-duplex systems
Basics on Cooperation Protocols : Compress-and-Forward (CF)
Multihop CF, B = 2
[Figure: two-block multihop CF; the source sends x1(w) in Block 1, the relay sends the compression index, x2(s1), in Block 2.]
s1 is the index of the quantization codeword ŷ2[s1] chosen for y2[1]
Basic idea: in block i, the relay quantizes (scalar or vector) y2[i−1] and communicates it to the destination.
Details:
- Fix distributions P_{X1|X2}(·|b) and P_{Ŷ2|Y2 X2}, to be optimized later
- Generate 2^{nR̂} quantization codewords ŷ2[s] independently and i.i.d. according to the marginal P_{Ŷ2|X2}(ŷ2|b), s = 1, 2, …, 2^{nR̂}
Multihop CF
Details (cont.)
Upon receiving y2[1], the relay quantizes it by finding a jointly typical ŷ2[1] in the quantization codebook. This is likely for n large if R̂ > I(Y2; Ŷ2 | X2 = b)
The destination first utilizes y3[2] to decode the compression index s1. This can be done reliably if
R̂ < max over P_{X2|X1}(·|a), a of I(X2; Y3 | X1 = a)
Then, the destination utilizes ŷ2[s1] to decode message w. This can be done reliably if
R < Rmcf := max I(X1; Ŷ2 | X2 = b)
with the max over P_{X1|X2}(·|b), P_{Ŷ2|Y2 X2}(·|·, b), a and b such that I(Y2; Ŷ2 | X2 = b) < max over P_{X2|X1}(·|a) of I(X2; Y3 | X1 = a)
CF Wyner-Ziv Compression
[Figure: relay channel; source input X1^n (message W), relay (Y2^n : X2^n), destination output Y3^n (estimate Ŵ).]
Send B − 1 independent messages over B blocks (each of length n)
At the end of block i, the relay chooses a description ŷ2^n[i] of y2^n[i]
Since the receiver has side information y3^n[i] about y2^n[i], we use Wyner-Ziv coding to reduce the rate necessary to send ŷ2^n[i]:
R̂ > I(Y2; Ŷ2 | X2, Y3) = I(Y2; Ŷ2 | X2) − I(Y3; Ŷ2 | X2)
The bin index is sent to the receiver in block i+1 via x2^n[i+1]
CF Wyner-Ziv Compression (Cont.)
[Figure: relay channel (as above).]
At the end of block i+1, the receiver first decodes x2^n[i+1], from which it finds ŷ2^n[i]:
R̂ < I(X2; Y3)
It then finds the unique ŵi s.t. (x1^n(ŵi), x2^n[i], ŷ2^n[i], y3^n[i]) are jointly typical:
R < I(X1; Ŷ2, Y3 | X2)
Compress-and-Forward rate:
RCF = max over P_{X1} P_{X2} P_{Ŷ2|Y2,X2} of I(X1; Ŷ2, Y3 | X2)
subject to I(Y2; Ŷ2 | X2, Y3) ≤ I(X2; Y3)
Summary
What we covered in this section:
Summarized basic elements of relay channels, including direct, multihop, broadcast, and multiaccess transmission
Introduced the mechanics of various kinds of relay processing, including amplify-and-forward, decode-and-forward, and compress-and-forward
Basics on Cooperation Information Rates
Information Rates
The purpose of this section is to refine the above analysis, study numerical examples, and develop insight based on rate.
[Figure: a cut (S, S^c) partitioning the network nodes.]
The capacity region C is usually difficult to compute. A useful outer bound on C is the cut-set bound.
Let S ⊆ N and let S^c be the complement of S in N. A cut separating Wm from one of its estimates Ŵm(u) is a pair (S, S^c) where Wm is connected (immediately) to a node in S but not in S^c, and where u ∈ S^c.
Basics on Cooperation Information Rates : Cut-set Bound
Cut-Set Bound
Let XS = {Xu : u ∈ S}. Consider any choice of encoders, and compute
P_{XN YN}(a, b) = [ (1/n) Σ_{i=1}^n P_{XN,i}(a) ] P_{YN|XN}(b|a)
where P_{XN,i}(·) is the marginal input distribution at time i.
Let M(S) be the set of messages separated from one of their sinks by the cut (S, S^c).
Cut bound: any (R1, R2, …, RM) ∈ C satisfies
Σ over m ∈ M(S) of Rm ≤ I(XS; Y_{S^c} | X_{S^c})
Cut-set bound for fixed P_{XN}(·): intersection over all S of the regions (R1, …, RM) satisfying the above bounds
Cut-set bound: union over P_{XN}(·) of all such regions
Cut-Set Bound Examples
Point-to-point channel: C ≤ max over P_X(·) of I(X; Y)
Relay channel:
C ≤ max over P_{X1 X2}(·) of min{ I(X1; Y2, Y3 | X2), I(X1, X2; Y3) }
The cut-set bound is usually loose, e.g., for two-way channels, broadcast channels, relay channels, etc.
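For the AWGN relay channel, the two cuts above can be evaluated numerically with jointly Gaussian inputs of correlation ρ (a sketch; the function name and the Gaussian evaluation are mine, with gains g_uv folding in path loss, e.g. g12 = |H12|²/d12):

```python
import math

def gaussian_relay_cutset(p1, p2, g12, g13, g23, n, steps=2000):
    """Cut-set bound for the full-duplex AWGN relay channel.

    Broadcast cut {1} vs {2,3}: log2(1 + (g12+g13)*p1*(1-rho^2)/n)
    MAC cut {1,2} vs {3}:       log2(1 + (g13*p1 + g23*p2
                                         + 2*rho*sqrt(g13*g23*p1*p2))/n)
    The bound is the max over rho in [0, 1] of the min of the two cuts.
    """
    best = 0.0
    for k in range(steps + 1):
        rho = k / steps
        bc = math.log2(1 + (g12 + g13) * p1 * (1 - rho ** 2) / n)
        mac = math.log2(1 + (g13 * p1 + g23 * p2
                             + 2 * rho * math.sqrt(g13 * g23 * p1 * p2)) / n)
        best = max(best, min(bc, mac))
    return best
```

With the relay links switched off (g12 = g23 = 0) the bound collapses to the direct-link capacity, as it should.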
[Figure: multiple-access cut (X1, X2 → Y3) and broadcast cut (X1 → (Y2 : X2), Y3).]
Basics on Cooperation Information Rates : Wireless Geometry
Wireless Geometry
The relay is a full-duplex device
Powers and noise:
- E[Xu²] ≤ Pu, u = 1, 2
- Zi, i = 2, 3, independent Gaussian, E[Zi²] = N
Source and destination are kept fixed; the relay moves on the circle. The model is:
Y2 = (H12 / |d|^{α/2}) X1 + Z2
Y3 = H13 X1 + (H23 / |1 − d|^{α/2}) X2 + Z3
[Figure: Node 1 (source) and Node 3 (destination) at distance 1; Node 2 (relay) at distance d from the source and 1 − d from the destination.]
To compare rates, we will consider two settings:
- No fading: Huv is a known constant; CSIR + CSIT
- Fast uniform-phase fading: Huv = e^{jθuv} where θuv is uniform in [0, 2π); CSIR, no CSIT
Basics on Cooperation Information Rates : CF
CF Rates for AWGN Channels
Recall the compress-and-forward lower bound in the DM case:
C ≥ max over P_{X1} P_{X2} P_{Ŷ2|Y2,X2} of I(X1; Ŷ2, Y3 | X2)
subject to I(Y2; Ŷ2 | X2, Y3) ≤ I(X2; Y3)
For AWGN channels, a natural choice is
Ŷ2 = Y2 + Ẑ2
where Ẑ2 is Gaussian with variance N̂2.
X1 and X2 are chosen as independent, Gaussian, and with variances P1 and P2, respectively.
The smallest possible N̂2 is obtained when
I(Y2; Ŷ2 | X2, Y3) = I(X2; Y3)
which gives
N̂2 = N ( P1(|H12|²/d12 + |H13|²/d13) + N ) / ( P2 |H23|²/d23 )
CF Rates for AWGN Channels
For full-duplex relays,
R = I(X1; Ŷ2, Y3 | X2) = log2( 1 + P1|H12|²/(d12(N + N̂2)) + P1|H13|²/(d13 N) ) bits/use
Comments:
As SNR23 := |H23|²P2/(d23 N) → ∞, we have N̂2 → 0 and Ŷ2 → Y2, and R becomes
R = max over P_{X1 X2}(·) of I(X1; Y2, Y3 | X2)
This is a cut rate, so CF is optimal as SNR23 → ∞.
Important insight: use CF when the relay is near the destination, and not AF or DF (see the rate figure).
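A numerical sketch of these two formulas (the function name is mine; gains g_uv stand for |Huv|²/d_uv):

```python
import math

def cf_rate(p1, p2, g12, g13, g23, n):
    """Full-duplex Gaussian CF rate with independent Gaussian inputs and
    quantization test channel Yhat2 = Y2 + Zhat2 of variance n_hat."""
    n_hat = n * (p1 * (g12 + g13) + n) / (p2 * g23)
    return math.log2(1 + p1 * g12 / (n + n_hat) + p1 * g13 / n)
```

As the relay-destination SNR grows (p2 → ∞), n_hat → 0 and the rate approaches the broadcast-cut rate log2(1 + p1(g12 + g13)/n), illustrating the "relay near the destination" insight.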
Basics on Cooperation Information Rates : DF
DF Rates for AWGN Channels
Recall the DF block structure where x2(wb−1) is generated by P_{X2}, and x1(wb−1, wb) by P_{X1|X2}(·|x2,i(wb−1)) for all i.
After block b, the DF relay decodes wb at rate
R < I(X1; Y2 | X2)
Backward decoding: decode wb after block b+1 by using y3[b'], b' = B, B−1, …, 1, at rate
R < I(X1, X2; Y3)
Sliding-window decoding: decode wb after block b+1 by using y3[b] and y3[b+1], b = 1, 2, …, B−1, at rate
R < I(X1; Y3 | X2) + I(X2; Y3) = I(X1, X2; Y3)
where the first information term is due to y3[b], and I(X2; Y3) is due to y3[b+1] (treating x1(wb, wb+1) as interference).
DF rate:
R = max over P_{X1 X2} of min{ I(X1; Y2 | X2), I(X1, X2; Y3) }
DF Rates for AWGN Channels
For AWGN channels, choose Gaussian P_{X1 X2} with E[|X1|²] = P1, E[|X2|²] = P2, and ρ = E[X1 X2*]/√(P1 P2).
The DF rate is
R = max over ρ of min{ log2( 1 + P1|H12|²(1 − |ρ|²)/(d12 N) ),
log2( 1 + P1|H13|²/(d13 N) + P2|H23|²/(d23 N) + 2|ρ| √(P1 P2) Re{H13 H23*}/(√(d13 d23) N) ) }
Comments:
As SNR12 := |H12|²P1/(d12 N) → ∞, the optimal ρ → 1 and R becomes
R = max over P_{X1 X2} of I(X1, X2; Y3)
This is a cut rate, so DF is optimal as SNR12 → ∞.
Important insight: use DF when the relay is near the source (see the rate figure).
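The maximization over ρ is easy to do numerically (a sketch; the function name is mine; real positive gains g_uv = |Huv|²/d_uv are assumed, so the coherent term reduces to 2ρ√(g13 g23 P1 P2)):

```python
import math

def df_rate(p1, p2, g12, g13, g23, n, steps=2000):
    """Full-duplex Gaussian DF rate, maximized over the source-relay
    correlation rho in [0, 1]."""
    best = 0.0
    for k in range(steps + 1):
        rho = k / steps
        relay_dec = math.log2(1 + g12 * p1 * (1 - rho ** 2) / n)
        mac = math.log2(1 + (g13 * p1 + g23 * p2
                             + 2 * rho * math.sqrt(g13 * g23 * p1 * p2)) / n)
        best = max(best, min(relay_dec, mac))
    return best
```

As g12 grows, the relay-decoding constraint disappears and the rate approaches the MAC-cut rate log2(1 + (√(g13 p1) + √(g23 p2))²/n) at ρ → 1, matching the "relay near the source" insight.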
Basics on Cooperation Information Rates : AF
AF Rates for AWGN Channels
AF processing: set
X2,i = a Y2,i−1 = a( (H12,i−1/√d12) X1,i−1 + Z2,i−1 )
where a is chosen to satisfy the relay power constraint.
Destination output:
Y3,i = (H13,i/√d13) X1,i + a (H23,i/√d23)( (H12,i−1/√d12) X1,i−1 + Z2,i−1 ) + Z3,i
AF effectively converts the channel into a unit-memory intersymbol interference channel. The transmitter should thus perform a water-filling optimization of the spectrum of X1^n.
It turns out the relay should not always transmit with maximum power.
More generally:
X2,i = a⃗ · [Y2,i−1, Y2,i−2, …, Y2,i−D]^T
which amounts to filtering-and-forwarding.
Rates For AWGN Channels, No Fading
P1/N = 10 dB, P2/N = 20 dB, α = 1, Huv = 1
[Figure: rate (bits/use) vs. relay position d ∈ [0, 1]; curves for the cut-set bound, the DF rate, the CF rate, and relay off.]
Rates For Fast Uniform Phase Fading, CSIR and No CSIT
For CF, choose Ŷ2 = Y2 e^{−jθ12} + Ẑ2, where Ẑ2 is Gaussian with zero mean and variance N̂2.
Straightforward algebra leads to the same CF rate as for AWGN relay channels.
For DF, straightforward algebra leads to
R = max over ρ of min{ log2( 1 + P1(1 − |ρ|²)/(d12 N) ),
E[ log2( 1 + P1/(d13 N) + P2/(d23 N) + 2|ρ| e^{j(θ12−θ13)} √(P1 P2)/(√(d13 d23) N) ) ] }
By Jensen's inequality and E[e^{j(θ12−θ13)}] = 0, the best ρ is zero!
Intuition: without phase knowledge, source and relay transmissions cannot combine coherently.
Important insight: coherent combining requires CSIT at either the source or the relay node, and seems unrealistic in mobile environments.
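The Jensen step can be checked deterministically by averaging over a uniform grid of phases (a small sketch; the parameters a and b are illustrative, standing for the sum of SNR terms and the coherent-combining term):

```python
import math

K = 100_000
thetas = [2 * math.pi * k / K for k in range(K)]
a, b = 2.0, 1.0  # illustrative: a = sum of SNR terms, b = coherent term

# A uniform phase averages to zero ...
avg_cos = sum(math.cos(t) for t in thetas) / K
# ... so by concavity of the log, the average rate sits below the
# incoherent (rho = 0) rate log2(1 + a).
avg_rate = sum(math.log2(1 + a + b * math.cos(t)) for t in thetas) / K
```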
Rates For Fast Uniform Phase Fading Channels, CSIR and No CSIT
P1/N = 10 dB, P2/N = 20 dB, α = 2, Huv = e^{jθuv}
[Figure: rate (bits/use) vs. relay position d ∈ [0, 1]; curves for the cut-set bound, the DF rate, the CF rate, and relay off.]
Summary
What we covered in Part I:
1 Summarized basic elements of relay channels, including direct, multihop, broadcast, and multiaccess transmission
2 Introduced the mechanics of various kinds of relay processing, including amplify-and-forward, decode-and-forward, and compress-and-forward
3 Reviewed a cut-set bound
4 Reviewed information theory for cooperative protocols, including AF, CF, DF.
5 Examples of rate gains for a wireless relay channel
6 Gave insight on protocol choice based on geometry and constraints
Cooperation in Presence of Interference
Part II: Cooperation in Presence of Interference
Goal
The purpose of this section is to give a high-level overview of some issues that arise in interference relay networks.
Show, through examples, that classic relaying techniques in general do not suffice as such in such networks, and need to be combined appropriately with more advanced techniques.
Discuss what roles relays can play in such networks, in addition to adding power, reducing path loss, combating fading, and increasing coverage.
Outline: Part 2
1 CF Generalization to Networks / Noisy Network Coding
2 Relaying in Presence of Additive Outside Interference
- Interference Not Known
- Interference Known Only to Relay
- Interference Known Only to Source
3 Generalisation: Channels with States
Cooperation in Presence of Interference Noisy Network Coding
Wyner-Ziv Compression for Two Receivers or More
[Figure: one relay, two receivers Y3 and Y4; the Wyner-Ziv binning rate is R̂ = I(Y2; Ŷ2 | X2, Y4) or R̂ = I(Y2; Ŷ2 | X2, Y3), depending on which receiver's side information is targeted.]
CF is a good candidate for relaying in networks with no CSIT, such as mobile environments.
In networks, which side information should be taken into account?
- Multiple-description coding is not optimal, and is complex for many users!
The binning rate (and so the overall rate) depends on the considered side information
More generally, Wyner-Ziv type compressions require the quantized version ŷ2^n[i] to be tailored for a specific receiver
The problem is even more complex in networks with more than two receivers!
Noisy Network Coding
[Figure: N-node network; node k holds message Wk and input/output pair (Xk, Yk); channel p(y1, …, yN | x1, …, xN).]
Alternate compression: noisy network coding (Kim, El Gamal 2011)
Standard compression, i.e., no binning!
Every message is transmitted b times
The receiver decodes using all blocks, without explicitly decoding the compression indices
Noisy Network Coding: Outline of Proof
The source node sends the same message b times; relays use compress-and-forward; decoders use simultaneous decoding
No binning; the compression indices are not required to be decoded correctly!
For simplicity, consider the proof for the relay channel
The relay uses independently generated compression codebooks:
Bj = { ŷ2^n(lj | lj−1) : lj, lj−1 ∈ [1 : 2^{nR̂2}] }, j ∈ [1 : b]
lj−1 is the compression index of Ŷ2^n(j−1), sent by the relay in block j
The senders use independently generated transmission codebooks:
Cj = { (x1^n(j, m), x2^n(lj−1)) : m ∈ [1 : 2^{nbR}], lj−1 ∈ [1 : 2^{nR̂2}] }, j ∈ [1 : b]
Encoding: the sender transmits x1^n(j, m) in block j ∈ [1 : b]. Upon receiving y2^n(j) and knowing x2^n(lj−1), the relay finds a jointly typical ŷ2^n(lj | lj−1), and sends x2^n(lj) in block j+1.
Example: Interference Channel with Intermittent Feedback
[Figure: two-user DM-IC W_{Y1,Y2,Y3,Y4|X1,X2,S}; Encoders 1 and 2 send X1^n, X2^n for messages W1, W2; Decoders 1 and 2 observe Y3^n, Y4^n; the feedback signals Y1^n, Y2^n are governed by states S1, S2.]
Feedback is provided only intermittently: at time i,
- Feedback event with probability p1 on the Dec. 1 → Enc. 1 link, and p2 on the Dec. 2 → Enc. 2 link: Pr{Y1[i] = Y3[i]} = p1 and Pr{Y2[i] = Y4[i]} = p2
- Erasure event with probability 1 − p1 on Dec. 1 → Enc. 1, and 1 − p2 on Dec. 2 → Enc. 2: Pr{Y1[i] = ∅} = 1 − p1 and Pr{Y2[i] = ∅} = 1 − p2
One can model this type of feedback using a memoryless state pair (S1, S2), with S1 ~ Bern(p1), S2 ~ Bern(p2), and (S1, S2) ~ p_{S1,S2}(s1, s2)
The capacity region is unknown in general, even without feedback!
Classic partial-DF schemes are inefficient here, because of the intermittence
Example: IC with Intermittent Feedback (Cont.)
Key idea: each transmitter compresses, à la noisy network coding, its output feedback and sends it to both receivers.
Optimal for the linear deterministic IC model
Y3[i] = H11 X1[i] + H12 X2[i]
Y4[i] = H22 X2[i] + H21 X1[i]
Recovers known results on the linear deterministic IC (Tse et al.) if p1 = p2 = 1
More generally, optimal for the Costa-El Gamal injective deterministic IC model
- Details in [Zaidi, "Interference channels with generalized and intermittent feedback," IEEE Trans. IT, 2014]
Cooperation in Presence of Interference Interference Amplification
RC with Unknown Outside Interferer
Node 4 is an unknown interferer
E[X4²] = Q, E[Zi²] = 1 for i = 2, 3, E[Xi²] = 1 for i = 1, 2
[Figure: relay channel (Nodes 1, 2, 3; message W, estimate Ŵ) with an outside interferer, Node 4.]
We focus on the shown geometry. Node 4 interferes on both links, to the relay and to the destination. The model is
Y2 = H12 X1 + H42 X4 + Z2
Y3 = H13 X1 + H23 X2 + H43 X4 + Z3
Previous schemes may all perform poorly if the transmit power at Node 4 is too large (strong interferer).
Amplifying the Interference
Important insight:
The interference X4 is different from noise (it has structure!)
Treat the interference as desired information, and amplify it instead of combating it!
The destination first decodes the interference, cancels its effect, and then decodes message W interference-free
Rationale:
Consider the following IC, with powers and noise variances set to unity for simplicity.
SNR1 := |g11|², SNR2 := |g22|², INR1 := |g21|², INR2 := |g12|²
[Figure: two-user IC; direct gains g11, g22 and cross gains g21, g12; messages W1, W2 and estimates Ŵ1, Ŵ2.]
Strong interference regime: INR1 ≥ SNR1 and INR2 ≥ SNR2. Decoding the interference is optimal in the strong interference regime.
The relay steers the network into the strong interference regime, in which the interference can be decoded and its effect canceled out!
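In the strong-interference regime each receiver can decode both messages, and the capacity region is known to be the intersection of the two MAC regions (Sato; Han-Kobayashi); the binding constraints are then the two MAC sum-rate cuts. A sketch of the resulting sum rate (the function name and this packaging are mine; the region itself is not stated on the slide):

```python
import math

def strong_ic_sum_rate(snr1, snr2, inr1, inr2):
    """Sum rate of the 2-user Gaussian IC when interference is strong
    enough that each receiver decodes both messages: the min of the two
    MAC sum-rate constraints, log2(1 + SNR_k + INR_k)."""
    return min(math.log2(1 + snr1 + inr1), math.log2(1 + snr2 + inr2))
```

For a symmetric network with SNR = 1 and INR = 2, the sum rate is log2(4) = 2 bits/use: decoding-then-cancelling the interference costs nothing here.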
Cooperation in Presence of Interference Binning
Partially Known Interferer
To gain intuition, consider the Gaussian model
Y2 = X1 + S + Z2
Y3 = X1 +X2 + S + Z3
E[S2] = Q, E[X2i ] = Pi, i = 1, 2E[Z2i ] = Ni, i = 2, 3
1
2
3 WW
4
The interference S can be known to all or only a subset of the nodes
Node k, k = 1, 2, 3, knows the interference from Node 4
- strictly causally: if, at time i, it knows S^{i−1} ≜ (S1, …, S_{i−1})
- causally: if, at time i, it knows S^i ≜ (S1, …, S_{i−1}, Si)
- non-causally: if, at time i, it knows S^n ≜ (S1, …, S_{i−1}, Si, …, Sn)
In all cases, the interference can be learned, e.g., through relaying or by means of cognition.
In general, asymmetric models are more difficult to solve!
Cooperation in Presence of Interference Binning
Collaborative Binning Against Interference
Recall Costa's Dirty Paper Coding for a point-to-point AWGN channel
- Input-output relation: Y = X + S + Z
  E[X²] ≤ P, E[S²] = Q, E[Z²] = N; S known non-causally to Tx, but not to Rx
- Optimal precoding: Tx sends X = U − αS, with α = P/(P + N)
  Capacity: C = I(U; Y) − I(U; S) = ½ log2(1 + P/N)
  Intuition: Y = U + (1 − α)S + Z, with 1 − α → 0 as P/N → ∞
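A two-line numeric check of Costa's result, with hypothetical powers P, N, Q (illustrative values only): DPC achieves the interference-free capacity, no matter how strong S is, while treating S as noise does not.

```python
import math

# Hypothetical powers: signal P, noise N, interference Q (linear scale)
P, N, Q = 2.0, 1.0, 5.0

alpha = P / (P + N)                          # Costa's inflation factor
C_dpc = 0.5 * math.log2(1 + P / N)           # DPC: interference-free capacity
C_naive = 0.5 * math.log2(1 + P / (N + Q))   # treating S as additional noise
print(alpha, C_dpc, C_naive)
```

Note that C_dpc does not depend on Q at all, which is the whole point of dirty paper coding.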
Symmetric case: S is known to both the source and the relay (non-causally)
X2 = (1 − ζ)(U1 − α1 S), ζ = √(P1′/P^(1))
X1 = ζ(U1 − α1 S) + (U2 − α2 S) + X1′, X1′ ~ N(0, P1′)
with P^(1) = (√P1 + √P2)², P^(2) = P1′, and
αk = P^(k) / (P^(1) + P^(2) + P1′ + N2), k = 1, 2.
This choice eliminates completely the effect of the interference S! (optimal if the channel is physically degraded)
Cooperation in Presence of Interference Cognitive Relay
Interference Known Only at Relay
Case of no interference, with DF
The source knows the relay input X2^n. The source can therefore jointly design X1^n through P_{X1,X2}(x1, x2). This ensures some coherence gain, as in multi-antenna transmission.
R < I(X1, X2;Y3)
Case of interference known only at relay, with DF
Issue: coherent transmission is difficult to obtain. The relay should exploit the known S^n:
X_{2,i} = φ_{2,i}(Y2^{i−1}, S^n)
The source does not know S^n, and therefore does not know X_{2,i}
Cooperation in Presence of Interference Cognitive Relay
Coding
Complete interference mitigation is impossible, due to the asymmetry
Main idea:
decompose the relay input as
X2 = U1 + X̂2
X̂2: zero-mean Gaussian with variance θP2, θ ∈ [0, 1]
U1: zero-mean Gaussian with variance (1−θ)P2
U1 is independent of S and correlated with X1; X̂2 is correlated with S and independent of X1
E[U1X1] = ρ12 √((1−θ)P1P2), E[X̂2S] = ρ2s √(θP2Q)
X̂2 is generated using a Generalized DPC (ρ2s ≤ 0):
U2 = X̂2 + αS
Cooperation in Presence of Interference Cognitive Relay
Information Rate
Theorem
The capacity of the general Gaussian RC with interference known non-causally only at the relay is lower-bounded by
R_G^in = max min{ ½ log(1 + (P1 + (1−θ)P2 + 2ρ12 √((1−θ)P1P2)) / (θP2 + Q + N3 + 2ρ2s √(θP2Q))) + ½ log(1 + θP2(1 − ρ2s²)/N3),
½ log(1 + P1(1 − ρ12²)/N2) }
where the maximization is over parameters θ ∈ [0, 1], ρ12 ∈ [0, 1], and ρ2s ∈ [−1, 0].
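To get a feel for this bound, a small brute-force grid search can evaluate it numerically; the function name and the parameter values below are illustrative choices, not from the slides:

```python
import math
from itertools import product

def lower_bound(P1, P2, Q, N2, N3, grid=25):
    # Grid over theta in [0,1], rho12 in [0,1], rho2s in [-1,0]
    ts = [i / (grid - 1) for i in range(grid)]
    best = 0.0
    for th, r12, m in product(ts, ts, ts):
        r2s = -m
        num1 = P1 + (1 - th) * P2 + 2 * r12 * math.sqrt((1 - th) * P1 * P2)
        den1 = th * P2 + Q + N3 + 2 * r2s * math.sqrt(th * P2 * Q)  # always > N3
        t1 = 0.5 * math.log2(1 + num1 / den1)
        t2 = 0.5 * math.log2(1 + th * P2 * (1 - r2s ** 2) / N3)
        t3 = 0.5 * math.log2(1 + P1 * (1 - r12 ** 2) / N2)
        best = max(best, min(t1 + t2, t3))
    return best

print(lower_bound(P1=10.0, P2=10.0, Q=10.0, N2=1.0, N3=10.0))
```

The first min-term is the rate decodable at the destination (coherent gain in the numerator, residual relay signal plus interference in the denominator, partially cleaned by ρ2s < 0); the second is the source-to-relay decoding constraint.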
Cooperation in Presence of Interference Cognitive Relay
Upper Bounding Technique
Source → Relay:
R < I(X1; Y2, Y3, S | X2) = I(X1; Y2, Y3 | S, X2)
(Source, Relay) → Destination:
R < I(X1, X2; Y3 | S) − I(X1; S | Y3)
The term I(X1; S | Y3) is the rate loss due to not knowing the interference at the source as well
Has a connection with the MAC with asymmetric CSI
But with a different proof technique
Cooperation in Presence of Interference Cognitive Relay
How Tight is the Lower Bound?
Proposition
For the physically degraded Gaussian RC, we have:
1) If P1, P2, Q, N2, N3 satisfy
N2 ≤ max_{ρ ∈ [−1,0]} P1N3 (P2 + Q + N3 + 2ρ√(P2Q)) / [ P1N3 + P2(1 − ρ²)(P1 + P2 + Q + N3 + 2ρ√(P2Q)) ],
then the channel capacity is given by
C_G^D = ½ log(1 + P1/N2).
2) If the maximum in the upper bound is attained at the boundary ρ12² + ρ2s² = 1, then the lower bound is tight.
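The condition in part 1) is easy to check numerically; the helper name and parameter values below are illustrative, not from the slides:

```python
import math

def n2_threshold(P1, P2, Q, N3, grid=2001):
    # Max over rho in [-1, 0] of the right-hand side of the condition
    best = 0.0
    for i in range(grid):
        rho = -i / (grid - 1)
        a = P2 + Q + N3 + 2 * rho * math.sqrt(P2 * Q)   # always positive
        best = max(best, P1 * N3 * a / (P1 * N3 + P2 * (1 - rho ** 2) * (P1 + a)))
    return best

P1, P2, Q, N2, N3 = 10.0, 10.0, 10.0, 1.0, 10.0   # hypothetical values
thr = n2_threshold(P1, P2, Q, N3)
if N2 <= thr:  # condition of part 1) holds: capacity is the interference-free rate
    C = 0.5 * math.log2(1 + P1 / N2)
    print(C)
```

For these values the source-to-relay link is good enough (N2 below the threshold), so the relay-aided DPC scheme attains the clean source-to-relay rate ½ log(1 + P1/N2).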
Cooperation in Presence of Interference Cognitive Relay
How Tight is the Lower Bound? (cont.)
Bounds for GRC also meet in some extreme cases:
Arbitrarily strong interference, i.e., Q → ∞:
C = min{ ½ log(1 + P1/N2), ½ log(1 + P2/N3) } (degraded Gaussian RC)
C = ½ log(1 + P2/N3), if P2/N3 ≤ P1/N2 (general Gaussian RC)
Zero power at the relay, i.e., P2 = 0:
C = ½ log(1 + P1/(Q + N3)) (degraded Gaussian RC)
C = ½ log(1 + P1/(Q + N3)), if P1/(Q + N3) ≤ P1/N2 (general Gaussian RC)
No interference, i.e., Q = 0 (degraded Gaussian RC)
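A quick numeric sanity check of two of these extreme cases, with hypothetical parameter values:

```python
import math

P1, P2, Q, N2, N3 = 10.0, 10.0, 10.0, 1.0, 10.0   # hypothetical values

# Q -> infinity (degraded Gaussian RC): DF-like min of the two hops
C_strongQ = min(0.5 * math.log2(1 + P1 / N2), 0.5 * math.log2(1 + P2 / N3))

# P2 = 0: point-to-point link, with S acting as extra noise of power Q
C_noRelay = 0.5 * math.log2(1 + P1 / (Q + N3))
print(C_strongQ, C_noRelay)
```

With these numbers the relay-to-destination hop is the bottleneck under arbitrarily strong interference, and removing the relay entirely is worse still since S then acts as uncancelled noise.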
Cooperation in Presence of Interference Cognitive Relay
The Deaf Helper Problem
What if the relay cannot hear the source?
Finite interference:
- subtract as much as possible from S; not optimal in general
Arbitrarily strong interference:
- construct a "dammed" codeword X2 (independent of X1) by DPC, X2 = U2 − αS
- at the destination: decode U2 first, and then X1
- this cleans up the channel for X1 if
I(U2; Y3) − I(U2; S) > 0.
Transmission at
I(X1; Y3 | S) = ½ log2(1 + P1/N3)
Cooperation in Presence of Interference Cognitive Relay
Example : Degraded Gaussian RC
P1 = P2 = Q = N3 = 10 dB
[Figure: rate vs. P1/N2 [dB] for the lower bound, the upper bound, and the trivial lower/upper bounds; a second panel shows the optimizing value of ρ12² + ρ2s²]
Cooperation in Presence of Interference Cognitive Source
RC with Interference Known Only at Source
Model
Y2 = X1 + S + Z2
Y3 = X1 + X2 + S + Z3
E[S²] = Q, E[Xi²] = Pi, i = 1, 2, E[Zi²] = Ni, i = 2, 3
[Figure: relay channel with source (node 1), relay (node 2), destination (node 3), and an interfering node 4]
The interference S is known non-causally to only the source
Cooperation in Presence of Interference Cognitive Source
Coding
Two different techniques:
1 Lower bound 1: reveal the interference to the relay
interference exploitation (binning) is performed also at relay
(share message and interference)
2 Lower bound 2: reveal to the relay just what the relay would send had it known the interference
interference exploitation (binning) is performed only at source
(share X = (V, S), suitable for oblivious relaying)
Cooperation in Presence of Interference Cognitive Source
Lower Bound 2
If the interference were also known at the relay:
Beginning of block i:
- Source sends x1[i] := x1(w_{i−1}, w_i) | v(w_{i−1}, j*_V), u(w_{i−1}, w_i, j*_U), s[i]
- Relay sends x[i] | v(w_{i−1}, j*_V), s[i]
In our case (interference known at only the source):
- The source knows w_{i−1} and s[i], and so x[i]
- The source transmits x[i] to the relay, ahead of time, in block i−1
- The relay estimates x[i] from y2[i−1], and sends x2[i] ≈ x[i] in block i
Cooperation in Presence of Interference Cognitive Source
Lower Bound 2
Outline:
Beginning of block i:
- Source looks for u(w_i, j*_i) such that (u(w_i, j*_i), s[i]) ∈ T_ε^n
- Source looks for u(w_{i+1}, j*_{i+1}) such that (u(w_{i+1}, j*_{i+1}), s[i+1]) ∈ T_ε^n, and then computes x[i+1] | u(w_{i+1}, j*_{i+1}), s[i+1]
- Source quantizes x[i+1] into x̂[m_i]
[Figure: codebook structure combining Marton's coding and superposition coding; the pair (w_i, m_i) is carried by u(w_i, j_{Ui}) and u_R(m_i, j_{Ri}), superposed on v(w_{i−1})]
Cooperation in Presence of Interference Cognitive Source
Lower Bound 2: Marton's Coding
Theorem
The capacity of the DM RC with interference known only at the source is lower-bounded by
R_lo = max [ I(U; Y3) − I(U; S) ]
subject to the constraint
I(X; X̂) < I(U_R; Y2) − I(U_R; S) − I(U_R; U | S),
where the maximization is over all joint measures on S × U × U_R × X1 × X2 × X × X̂ × Y2 × Y3 of the form
P_{S,U,U_R,X1,X2,X,X̂,Y2,Y3} = Q_S P_{U,U_R|S} P_{X1|U,U_R,S} P_{X|U,S} P_{X̂|X} 1_{X2 = X̂} W_{Y2,Y3|X1,X2,S}.
Cooperation in Presence of Interference Cognitive Source
Lower Bound 2 (Gaussian Case)
Test channel:
X̂ = aX + X′, a := 1 − D/P2, X′ ~ N(0, D(1 − D/P2)), 0 ≤ D ≤ P2
X ~ N(0, P2), with E[XX′] = E[XS] = E[X′S] = 0
X1R ~ N(0, (1−θ)P1), with E[X1R S] = 0, 0 ≤ θ ≤ 1
U = (√(θP1/P2) + √((P2 − D)/P2)) X + αS
U_R = X1R + α_R (S + √(θP1)/(√(θP1) + √(P2 − D)) · X),
with
α = (√(θP1) + √(P2 − D))² / [ (√(θP1) + √(P2 − D))² + N3 + D + (1−θ)P1 ]
α_R = (1−θ)P1 / ((1−θ)P1 + N2).
Cooperation in Presence of Interference Cognitive Source
Lower Bound
Theorem
The capacity of the Gaussian RC with interference known only at the source is lower-bounded by
R_G^lo = max ½ log(1 + (√(θP1) + √(P2 − D))² / (N3 + D + (1−θ)P1)),
where
D := P2N2 / (N2 + (1−θ)P1)
and the maximization is over θ ∈ [0, 1].
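Since the bound involves a single scalar parameter θ, it can be evaluated with a one-dimensional grid; the function name and parameter values below are illustrative choices:

```python
import math

def lb_cognitive_source(P1, P2, N2, N3, grid=1001):
    # Grid over theta in [0, 1]; D is the quantization distortion at the relay
    best = 0.0
    for i in range(grid):
        th = i / (grid - 1)
        D = P2 * N2 / (N2 + (1 - th) * P1)
        num = (math.sqrt(th * P1) + math.sqrt(P2 - D)) ** 2
        best = max(best, 0.5 * math.log2(1 + num / (N3 + D + (1 - th) * P1)))
    return best

print(lb_cognitive_source(P1=10.0, P2=100.0, N2=1.0, N3=10.0))
```

The trade-off is visible in the formula: a larger θ devotes more source power to the coherent part, but leaves less power (1−θ)P1 for describing the relay's intended input, which inflates the distortion D.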
Cooperation in Presence of Interference Cognitive Source
How Tight are the Bounds? (Cont'd)
Bounds for the Gaussian RC meet in some extreme cases:
Arbitrarily small noise at the relay, i.e., N2 → 0:
C_G = ½ log(1 + (√P1 + √P2)²/N3) − o(1),
where o(1) → 0 as N2 → 0.
Arbitrarily strong noise at the relay, i.e., N2 → ∞:
R_G^up = ½ log(1 + P1/N3)
R_G^lo = ½ log(1 + P1/(N3 + P2)).
If P1 → ∞, the bounds meet asymptotically in the power at the source if P2 ≤ P1, yielding
C_G-orth = ½ log(1 + P1/N3) + o(1)
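The N2 → ∞ gap is easy to quantify numerically; the parameter values below are hypothetical:

```python
import math

P1, P2, N3 = 10.0, 10.0, 10.0   # hypothetical powers and destination noise

# N2 -> infinity: the relay observes pure noise and learns nothing useful
R_up = 0.5 * math.log2(1 + P1 / N3)          # interference-free point-to-point rate
R_lo = 0.5 * math.log2(1 + P1 / (N3 + P2))   # relay's transmission acts as extra noise
print(R_up, R_lo)
```

The gap between the two reflects the cost of an uninformed relay whose signal the destination cannot exploit.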
Cooperation in Presence of Interference Cognitive Source
Example
P1 = Q = N3 = 10 dB, P2 = 20 dB
[Figure: achievable rates vs. P1/N2 [dB]: lower bounds (Theorems 4 and 5), upper bounds (Theorem 3 and the cut-set bound), and the trivial lower bound]
Cooperation in Presence of Interference General Framework
RC with States
More generally, S may represent any information about the channel (fading, activity, ...)
[Figure: relay channel W_{Y2,Y3|X1,X2,S} in two configurations: the state sequence S^n available non-causally at the source (left) and at the relay (right)]
Thorough results, as well as the strictly-causal CSI case, in:
- Zaidi et al., "Bounds on the Capacity of the Relay Channel with Noncausal State at Source," IEEE Trans. on Inf. Theory, vol. 59, no. 5, May 2013, pp. 2639-2672.
- Zaidi et al., "Cooperative Relaying with State Available Non-Causally at the Relay," IEEE Trans. Inf. Theory, vol. 56, no. 5, pp. 2272-2298, May 2010.
- Zaidi et al., "Capacity Region of Cooperative Multiple Access Channel with States," IEEE Trans. on Inf. Theory, vol. 59, no. 10, 2013, pp. 6153-6174.
Cooperation in Presence of Interference General Framework
MAC with Delayed CSI
[Figure: MAC W_{Y|X1,X2,S} with two encoders and one decoder; the state S^{i−1} is available strictly causally at both encoders]
Both encoders send Wc; Encoder 1 also sends W1
Both encoders know the states only strictly causally:
φ_{1,i}: W_c × W_1 × S^{i−1} → X_1, i = 1, …, n
φ_{2,i}: W_c × S^{i−1} → X_2, i = 1, …, n
Decoder: ψ: Y^n → W_c × W_1
Cooperation in Presence of Interference General Framework
Main Results
1 CSI given with delay to only the transmitters increases the capacity region
- Zaidi et al., Cooperative MAC with states, IT 2013
2 Gains are obtained through a Block-Markov coding scheme in which the encoders jointly compress the CSI of the last block, using an appropriate compression scheme
3 Reminiscent of quantizing-and-transmitting noise in a non-degraded BC example with common noise at the receivers, by Dueck (cf. "Partial feedback for two-way and BC", Inf. Contr., 1980)
- MAT scheme (Maddah-Ali, Tse): the interference plays the role of S here. In block i, the transmitter sends a linear function of (S[i−1], X[i−1]). This can be seen as a compressed version V = f(X, S) ~ P_{V|X,S}
- Lapidoth et al., MAC with causal and strictly causal CSI, IT 2013
- Li et al., MAC with states known strictly causally, IT 2013
Cooperation in Presence of Interference General Framework
Capacity Region in Some Special Cases
Let D_MAC^sym be the class of discrete memoryless two-user cooperative MACs in which the channel state S, assumed to be revealed strictly causally to both encoders, can be obtained as a deterministic function of the channel inputs X1 and X2 and the channel output Y, as
S = f(X1, X2, Y).
Theorem
For any MAC in the class D_MAC^sym defined above, the capacity region C_s-c is given by the set of all rate pairs (Rc, R1) satisfying
R1 ≤ I(X1; Y | X2, S)
Rc + R1 ≤ I(X1, X2; Y)
for some measure P_{S,X1,X2,Y} = Q_S P_{X1,X2} W_{Y|S,X1,X2}.
Example: model Y = X1 + X2 + S. Capacity: Rc + R1 ≤ ½ log(1 + (√P1 + √P2)²/Q).
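For this example channel, a two-line computation (with hypothetical powers) shows the coherent-input sum rate and the gain over independent inputs:

```python
import math

P1, P2, Q = 10.0, 10.0, 10.0   # hypothetical powers; S ~ N(0, Q) acts as the noise

# Coherent (fully correlated) inputs maximize I(X1, X2; Y)
C_sum = 0.5 * math.log2(1 + (math.sqrt(P1) + math.sqrt(P2)) ** 2 / Q)

# Independent inputs, for comparison
R_indep = 0.5 * math.log2(1 + (P1 + P2) / Q)
print(C_sum, R_indep)
```

With equal powers, the coherent gain doubles the effective transmit power, (√P1 + √P2)² = 2(P1 + P2).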
Cooperation in Presence of Interference General Framework
Delayed CSI Does Not Always Help!
Proposition
Delayed CSI at the encoders does not increase the sum capacity:
max_{(Rc, R1) ∈ C_s-c} (Rc + R1) = max_{p(x1, x2)} I(X1, X2; Y).
Proposition
Delayed CSI at only the encoder that sends both messages does not increase the capacity region of the cooperative MAC.
Interaction and Computation
Part III: Interaction and Computation
Interaction and Computation
Outline: Part 3
1 Interaction for Source Reproduction
2 Interaction for Function Computation
Interaction and Computation
Setup
[Figure: terminals A (observing X) and B (observing Y) exchange messages M1, M2, …, Mt; A outputs fA(X, Y), B outputs fB(X, Y)]
Under what conditions is interaction useful?
How useful is interaction?
What is the best way to interact?
Interaction and Computation
Interaction for Lossless Source Reproduction
Discrete memoryless multi-source (X1, Y1), …, (Xn, Yn) i.i.d. ~ p_{X,Y}(x, y)
Goal: reproduce X = (X1, X2, …, Xn) at B with probability → 1 as n → ∞
[Figure: A, observing (X1, …, Xn), sends a single message M1 at rate R1 = H(X|Y) to B, which observes (Y1, …, Yn) and outputs (X̂1, …, X̂n)]
One-round Slepian-Wolf coding is optimal
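For a concrete instance (a hypothetical doubly symmetric binary source with crossover probability 0.1), the one-round rate H(X|Y) can be computed directly from the joint pmf:

```python
import math

# Hypothetical doubly symmetric binary source: X ~ Bern(1/2), Y = X XOR Bern(p)
p = 0.1
pxy = {(0, 0): (1 - p) / 2, (0, 1): p / 2,
       (1, 0): p / 2, (1, 1): (1 - p) / 2}

py = {y: sum(v for (_, yy), v in pxy.items() if yy == y) for y in (0, 1)}
H_X_given_Y = -sum(v * math.log2(v / py[y]) for (_, y), v in pxy.items())
print(H_X_given_Y)  # equals the binary entropy h2(0.1), about 0.47 bit/sample
```

So B, already knowing Y, needs only about 0.47 bit per sample from A instead of the full 1 bit of H(X).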
Interaction and Computation
Interaction for Lossy Sources Reproduction
Discrete memoryless multi-source (X1, Y1), …, (Xn, Yn) i.i.d. ~ p_{X,Y}(x, y)
Goal: reproduce X̂ = (X̂1, X̂2, …, X̂n), with E[d(X, X̂)] ≤ D_X as n → ∞
[Figure: A (observing X = (X1, …, Xn)) and B (observing Y = (Y1, …, Yn)) exchange messages M1, …, Mt at rates R1, …, Rt]
Interaction is useful:
R_sum := R1 + R2 + … + Rt ≤ min_{P_{U1|X}} I(U1; X | Y) + min_{P_{U2|Y}} I(U2; Y | X)
Interaction and Computation
Interaction for Function Computation
Discrete memoryless multi-source (X1, Y1), …, (Xn, Yn) i.i.d. ~ p_{X,Y}(x, y)
Goal: reproduce f̂A ≈ fA(X, Y), with E[d(fA(X, Y), f̂A(X, Y))] ≤ DA as n → ∞
[Figure: A (observing (X1, …, Xn)) and B (observing (Y1, …, Yn)) exchange messages M1, …, Mt at rates R1, …, Rt; A outputs f̂A(X, Y), B outputs f̂B(X, Y)]
Interaction is useful
Wrap-Up
Wrap-Up
1 Many approaches, benefits, and challenges to utilizing relaying and cooperative communications in networks.
2 Relaying also includes, or should also include, multihop (store-and-forward routing) and network coding.
3 The capacity of relay systems is difficult to analyze, but the theory is surprisingly flexible and diverse.
4 Although generalizations to networks with many nodes are in general not easy, there are schemes which scale appropriately.
5 In networks with interference, relaying can help, not only by adding power/energy spatially, but also by allowing distributed interference cancellation; in general this boosts network capacity.
6 To cope with network interference efficiently, classic relaying schemes as such in general do not suffice, and need to be combined carefully with other appropriate techniques.
Wrap-Up
Selected References
- T. Cover and A. El Gamal, "Capacity theorems for the relay channel," IEEE Trans. Inf. Theory, vol. 25, Sep. 1979, pp. 572-584.
- G. Kramer, M. Gastpar and P. Gupta, "Cooperative strategies and capacity theorems for relay networks," IEEE Trans. Inf. Theory, vol. 51, Sep. 2005, pp. 3037-3063.
- J. N. Laneman, D. N. C. Tse and G. W. Wornell, "Cooperative diversity in wireless networks: Efficient protocols and outage behavior," IEEE Trans. Inf. Theory, vol. 50, Dec. 2004, pp. 3062-3080.
- A. Zaidi, S. Kotagiri and J. N. Laneman, "Cooperative relaying with state available non-causally at the relay," IEEE Trans. Inf. Theory, vol. 56, no. 5, pp. 2272-2298, May 2010.
- A. Zaidi et al., "Bounds on the capacity of the relay channel with noncausal state at source," IEEE Trans. on Inf. Theory, vol. 59, no. 5, May 2013, pp. 2639-2672.
- A. Zaidi et al., "Capacity region of cooperative multiple access channel with states," IEEE Trans. on Inf. Theory, vol. 59, no. 10, 2013, pp. 6153-6174.
- S.-H. Lim, Y.-H. Kim and A. El Gamal, "Noisy network coding," IEEE Trans. Inf. Theory, vol. 57, May 2011, pp. 3132-3152.
- A. Avestimehr, S. Diggavi and D. Tse, "Wireless network information flow: a deterministic approach," IEEE Trans. Inf. Theory, vol. 57, Apr. 2011, pp. 1872-1905.
- A. El Gamal and Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011.
Basics on Cooperation: Introduction and Models; Protocols (Amplify-and-Forward (AF), Decode-and-Forward (DF), Compress-and-Forward (CF)); Information Rates (Cut-set Bound, Wireless Geometry, CF, DF, AF)
Cooperation in Presence of Interference: Noisy Network Coding; Interference Amplification; Binning; Cognitive Relay; Cognitive Source; General Framework
Interaction and Computation; Wrap-Up