Network Information Theory

Reference:

Elements of Information Theory, 2nd Edition

Slides of EE4560, TUD

Introduction: Network IT Features

  • Multiple senders and/or multiple receivers. E.g., computer networks, satellite networks…

(figure)

  • New elements in the communication problem. E.g., interference, cooperation

  • Problem description:

    Given the senders, the receivers, and the channel transition matrix (that describes the effects of the interference and the noise in the network), can the sources be transmitted over the channel?

  • The general problem is tough and has not yet been solved $\to$ we focus on interesting special cases

Discrete Memoryless Multiple-Access Channel

A discrete memoryless multiple-access channel consists of

  • $m$ input alphabets $\mathcal X_1, \mathcal X_2, \cdots, \mathcal X_m$,
  • an output alphabet $\mathcal Y$, and
  • a probability transition matrix $p(y \mid x_1, x_2, \cdots, x_m)$.

(figure)

For simplicity, we will further consider the case $m = 2$.

Definition 1 (Code for the MA channel):

(figure)

A $\left(\left(2^{nR_1}, 2^{nR_2}\right), n\right)$ code for the multiple-access channel consists of

  • two message sets $\mathcal W_i = \{1, 2, \ldots, 2^{nR_i}\},\ i = 1, 2$,
  • two encoding functions $X_i : \mathcal W_i \to \mathcal X_i^n,\ i = 1, 2$,
  • a decoding function $g : \mathcal Y^n \to \mathcal W_1 \times \mathcal W_2$.

Definition 2 (Average probability of error):

The average probability of error for a $\left(\left(2^{nR_1}, 2^{nR_2}\right), n\right)$ code is
$$P_e^{(n)} = \frac{1}{2^{n(R_1+R_2)}} \sum_{(w_1, w_2)} \Pr\left\{g(Y^n) \neq (w_1, w_2) \mid (w_1, w_2) \text{ sent}\right\}.$$
Definition 3 (Achievable):

A rate pair $(R_1, R_2)$ is said to be achievable if there exists a sequence of $\left(\left(2^{nR_1}, 2^{nR_2}\right), n\right)$ codes with $P_e^{(n)} \to 0$ as $n \to \infty$.

Definition 4 (Capacity region):

The capacity region of the MA channel is the closure of the set of achievable rate pairs $(R_1, R_2)$.

Theorem 1 (MA channel capacity theorem):

The capacity region of the multiple-access channel $\left(\mathcal X_1 \times \mathcal X_2,\ p(y \mid x_1, x_2),\ \mathcal Y\right)$ is the closure of the convex hull of all $(R_1, R_2)$ satisfying
$$\begin{aligned}
R_1 &< I(X_1; Y \mid X_2)\\
R_2 &< I(X_2; Y \mid X_1)\\
R_1 + R_2 &< I(X_1, X_2; Y)
\end{aligned}$$
for some product distribution $p_1(x_1)\,p_2(x_2)$ on $\mathcal X_1 \times \mathcal X_2$.

Achievable region for a particular input distribution (note that the regions for different input distributions have the same shape but different values; the capacity region is the convex hull of the union of these regions, see the Exercise below):

(figure)

  • $I(X_1; Y \mid X_2)$: the mutual information between $X_1$ and $Y$ when $X_2$ is known. Operationally, this is the rate User 1 can achieve when the receiver already knows User 2's codeword.
  • $I(X_2; Y)$: the mutual information between $X_2$ and $Y$ when nothing is known about $X_1$.
  • Onion-peeling: view decoding as a two-stage process.
    • In the first stage, the receiver decodes the second sender, treating the first sender as part of the noise. This decoding has low probability of error if $R_2 < I(X_2; Y)$.
    • Once the second sender has been decoded successfully, its contribution can be subtracted out, and the first sender can be decoded correctly if $R_1 < I(X_1; Y \mid X_2)$.
  • Maximizing $R_1$ when $R_2 = 0$: since $I(X_1; Y \mid X_2) = \sum_{x_2} p(x_2)\, I(X_1; Y \mid X_2 = x_2) \le \max_{x_2} I(X_1; Y \mid X_2 = x_2)$, set $X_2$ equal to the maximizing $x_2$ and choose $p(x_1)$ to maximize the mutual information between $X_1$ and $Y$ given $X_2 = x_2$.
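The three bounds of Theorem 1 are straightforward to evaluate numerically for any discrete MAC and product input distribution. A minimal sketch (function names are mine; as a check it uses the binary erasure MAC $Y = X_1 + X_2$ with uniform inputs, for which the bounds are $1$, $1$, and $1.5$ bits):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf (zero entries are skipped)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mac_bounds(p1, p2, chan):
    """Theorem-1 bounds for a two-user discrete MAC with product input
    distribution p1(x1) p2(x2) and channel chan[x1, x2, y] = p(y | x1, x2).
    Returns (I(X1;Y|X2), I(X2;Y|X1), I(X1,X2;Y)) in bits."""
    joint = p1[:, None, None] * p2[None, :, None] * chan      # p(x1, x2, y)
    H_Y_X1X2 = entropy(joint) - entropy(np.outer(p1, p2))     # H(Y | X1, X2)
    H_Y_X2 = entropy(joint.sum(axis=0)) - entropy(p2)         # H(Y | X2)
    H_Y_X1 = entropy(joint.sum(axis=1)) - entropy(p1)         # H(Y | X1)
    H_Y = entropy(joint.sum(axis=(0, 1)))
    return H_Y_X2 - H_Y_X1X2, H_Y_X1 - H_Y_X1X2, H_Y - H_Y_X1X2

# Check on the binary erasure MAC Y = X1 + X2 with uniform inputs:
chan = np.zeros((2, 2, 3))
for x1 in range(2):
    for x2 in range(2):
        chan[x1, x2, x1 + x2] = 1.0
u = np.array([0.5, 0.5])
print(mac_bounds(u, u, chan))   # → (1.0, 1.0, 1.5)
```

Each conditional entropy is computed as a difference of joint entropies, e.g. $H(Y \mid X_2) = H(X_2, Y) - H(X_2)$, so the same `entropy` helper covers everything.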

Binary Multiplier Channel

(figure)

Binary Erasure MA Channel

(figure)

Note: $R_1 = 1$ requires $P(X_1 = 0) = P(X_1 = 1) = 1/2$. Then $P(Y = 0 \mid X_2 = 0) = P(X_1 = 0) = 1/2$ and $P(Y = 2 \mid X_2 = 1) = P(X_1 = 1) = 1/2$, while $P(Y = 1 \mid X_2 = 0) = P(Y = 1 \mid X_2 = 1) = 1/2$. From the perspective of User 2, the output $Y = 1$ is ambiguous, so User 2 effectively sees a binary erasure channel with erasure probability $1/2$, and hence $R_2 \le 1/2$.
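User 2's effective channel under a uniform $X_1$ can be checked numerically. A small sketch (assuming $Y = X_1 + X_2$, so that $Y = 1$ plays the role of an erasure symbol):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf (zero entries are skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# User 2's effective channel when X1 is uniform:
# p(y | x2) = sum_x1 p(x1) p(y | x1, x2).
# Rows are x2 = 0, 1; columns are y = 0, 1, 2.
p_y_given_x2 = np.array([[0.5, 0.5, 0.0],
                         [0.0, 0.5, 0.5]])
p2 = np.array([0.5, 0.5])                      # uniform X2
p_y = p2 @ p_y_given_x2                        # marginal of Y: [0.25, 0.5, 0.25]
H_Y_given_X2 = sum(p2[i] * entropy(p_y_given_x2[i]) for i in range(2))
I_X2_Y = entropy(p_y) - H_Y_given_X2
print(I_X2_Y)   # → 0.5, so R2 <= 1/2 once R1 = 1
```

The result matches the erasure-channel intuition: a BEC with erasure probability $1/2$ has capacity $1/2$ bit.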

Gaussian MA Channel

(figure)

The received signal at time $i$ is $Y_i = X_{1i} + X_{2i} + Z_i$, where

  • $Z_i$ is a sequence of i.i.d. zero-mean Gaussian random variables with variance $N$,

  • sender $j$ satisfies a power constraint $P_j$:

$$\frac{1}{n} \sum_{i=1}^{n} x_{ji}^2(w_j) \le P_j, \quad w_j \in \{1, 2, \ldots, 2^{nR_j}\},\ j = 1, 2.$$

The capacity region is the convex hull of the set of rate pairs $(R_1, R_2)$ satisfying
$$\begin{aligned}
R_1 &\le I(X_1; Y \mid X_2) &&\le C(P_1/N)\\
R_2 &\le I(X_2; Y \mid X_1) &&\le C(P_2/N)\\
R_1 + R_2 &\le I(X_1, X_2; Y) &&\le C\left((P_1+P_2)/N\right),
\end{aligned}$$
where $C(x) = \frac{1}{2}\log(1+x)$. The maximizing input distribution is $X_j \sim \mathcal N(0, P_j)$.

(figure)

Note: the corner-point term $C\!\left(\frac{P_2}{P_1+N}\right)$ shows that each user's rate is limited not only by the noise but also by the interference from the other sender.
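The corner points of the Gaussian pentagon, and the onion-peeling identity $C(P_1/N) + C\!\left(\frac{P_2}{P_1+N}\right) = C\!\left(\frac{P_1+P_2}{N}\right)$ behind them, can be verified with a short script (the power values are illustrative, not from the text):

```python
import math

def C(x):
    """C(x) = (1/2) log2(1 + x), in bits per transmission."""
    return 0.5 * math.log2(1 + x)

# Illustrative power values:
P1, P2, N = 10.0, 5.0, 1.0

R1_max   = C(P1 / N)               # single-user bound for User 1
R2_max   = C(P2 / N)               # single-user bound for User 2
Rsum_max = C((P1 + P2) / N)        # total-rate bound

# Corner point reached by onion-peeling: decode User 2 first, treating
# User 1 as noise (rate C(P2 / (P1 + N))), then subtract it out.
R2_corner = C(P2 / (P1 + N))
R1_corner = C(P1 / N)
assert abs(R1_corner + R2_corner - Rsum_max) < 1e-12   # corner lies on the sum-rate line
```

The assertion holds for any positive powers, since $\frac{1}{2}\log\frac{P_1+N}{N} + \frac{1}{2}\log\frac{P_1+P_2+N}{P_1+N} = \frac{1}{2}\log\frac{P_1+P_2+N}{N}$.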

Gaussian FDMA Capacity

(figure)

Gaussian Naïve TDMA Capacity

(figure)
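The two schemes can be compared with the standard normalized rate expressions (a sketch following the usual treatment in Cover & Thomas; $\alpha$ is the bandwidth fraction for FDMA and the time fraction for naive TDMA). It checks that FDMA touches the sum-rate bound at $\alpha = P_1/(P_1+P_2)$, while naive TDMA at the same split stays strictly inside the region:

```python
import math

def C(x):
    """C(x) = (1/2) log2(1 + x)."""
    return 0.5 * math.log2(1 + x)

P1, P2, N = 10.0, 5.0, 1.0   # illustrative values

def fdma(alpha):
    """Rate pair when User 1 gets a fraction alpha of the bandwidth
    (noise power scales with bandwidth, hence alpha * N)."""
    return alpha * C(P1 / (alpha * N)), (1 - alpha) * C(P2 / ((1 - alpha) * N))

def naive_tdma(alpha):
    """Rate pair when User 1 transmits a fraction alpha of the time
    at fixed power P1 (no power boosting during its slot)."""
    return alpha * C(P1 / N), (1 - alpha) * C(P2 / N)

# FDMA meets the sum-rate bound exactly when the bandwidth is split
# in proportion to the powers, alpha = P1 / (P1 + P2):
a = P1 / (P1 + P2)
r1, r2 = fdma(a)
assert abs(r1 + r2 - C((P1 + P2) / N)) < 1e-12

# Naive TDMA is strictly inside the capacity region at the same split:
t1, t2 = naive_tdma(a)
assert t1 + t2 < r1 + r2
```

At $\alpha = P_1/(P_1+P_2)$ both FDMA terms reduce to fractions of $C\!\left(\frac{P_1+P_2}{N}\right)$, which is why the sum is exactly the total-rate bound.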

Exercise

(figure)

Find the capacity region for this channel.

(figure)

$$\begin{aligned}
I(X_1; Y \mid X_2) &= I(X_1; Y) = H(X_1) - H(X_1 \mid Y)\\
&= H(X_1) = H(p)\\
I(X_2; Y \mid X_1) &= I(X_2; Y) = H(X_2) - H(X_2 \mid Y)\\
&= H(X_2) - P(Y=0)\,H(X_2 \mid Y=0)\\
&= (1-p)\,H(q_1, q_2, q_3, q_4)\\
I(X_1, X_2; Y) &= H(Y) = H(p, (1-p)q_1, \cdots, (1-p)q_4)\\
&= I(X_2; Y \mid X_1) + I(X_1; Y) = H(p) + (1-p)\,H(q_1, q_2, q_3, q_4)
\end{aligned}$$
These are maximized when $q_1 = \cdots = q_4 = \frac{1}{4}$, for which $H(\frac{1}{4}, \frac{1}{4}, \frac{1}{4}, \frac{1}{4}) = 2$ bits. (Why maximize? The capacity region is the convex hull over all input distributions, so we trace out the farthest boundary.)
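The maximization can be sanity-checked numerically; a sketch assuming the expressions derived above (with uniform $q$, the sum rate becomes $H(p) + 2(1-p)$, whose optimum works out to $\log_2 5$ at $p = 1/5$):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of the pmf p (zero entries are skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# With q1 = ... = q4 = 1/4, H(q1, ..., q4) reaches its maximum log2(4) = 2 bits:
assert abs(H([0.25, 0.25, 0.25, 0.25]) - 2.0) < 1e-12

# Scan the resulting sum rate H(p) + 2(1 - p) over p (a quick boundary
# check, not a proof); the maximum is log2(5) at p = 1/5:
ps = np.linspace(0.001, 0.999, 10001)
sum_rate = -(ps * np.log2(ps) + (1 - ps) * np.log2(1 - ps)) + 2 * (1 - ps)
i = int(np.argmax(sum_rate))
print(ps[i], sum_rate[i])   # ≈ 0.2, ≈ 2.3219
```

Setting the derivative $\log_2\frac{1-p}{p} - 2$ to zero gives $p = 1/5$, and substituting back yields exactly $\log_2 5$, in agreement with the scan.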

(figures)
