SOLUTIONS FOR CHAPTER 1
Q.1.1 DMS with source probabilities: {0.30, 0.25, 0.20, 0.15, 0.10}

Entropy:
$$H(X) = \sum_i p_i \log_2 \frac{1}{p_i} = 0.30 \log_2\frac{1}{0.30} + 0.25 \log_2\frac{1}{0.25} + \dots = 2.228 \text{ bits}$$
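As a quick numerical check, here is a minimal Python sketch of the entropy computation (the probabilities are taken from the problem statement; the variable names are illustrative):

    import math

    # Source probabilities given in Q.1.1
    probs = [0.30, 0.25, 0.20, 0.15, 0.10]

    # H(X) = sum_i p_i * log2(1/p_i)
    H = sum(p * math.log2(1.0 / p) for p in probs)
    print(f"H(X) = {H:.3f} bits")  # prints: H(X) = 2.228 bits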
Q.1.2 Define

$$D(p\|q) = \sum_i p_i \log_2 \frac{p_i}{q_i} \qquad (1)$$

where $p_i$, $q_i$ are probability distributions of a discrete source X. Then

$$-D(p\|q) = \sum_i p_i \log_2 \frac{q_i}{p_i} = \frac{1}{\ln 2}\sum_i p_i \ln \frac{q_i}{p_i} \le \frac{1}{\ln 2}\sum_i p_i \left(\frac{q_i}{p_i} - 1\right) \quad \text{[using the identity } \ln x \le x - 1\text{]}$$
$$= \frac{1}{\ln 2}\sum_i (q_i - p_i) = 0$$

$$\therefore \quad D(p\|q) \ge 0$$

Put $q_i = 1/n$ in (1), where n is the cardinality of the discrete source:

$$D(p\|q) = \sum_i p_i \log_2 p_i + \log_2 n = -H(X) + \log_2 n \ge 0$$
$$\Rightarrow \quad H(X) \le \log_2 n$$

with $H(X) = \log_2 n$ for the uniform probability distribution. Hence proved that the entropy of a discrete source is maximum when the output symbols are equally probable. The quantity $D(p\|q)$ is called the Kullback-Leibler distance.
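The result can be checked numerically. A minimal Python sketch (function and variable names are illustrative, not from the text) computes D(p||q) against the uniform distribution and confirms H(X) <= log2(n):

    import math, random

    def kl_divergence(p, q):
        # D(p||q) = sum_i p_i * log2(p_i / q_i), with 0*log(0) taken as 0
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    n = 5
    raw = [random.random() for _ in range(n)]
    p = [x / sum(raw) for x in raw]   # a random distribution
    q = [1.0 / n] * n                 # the uniform distribution

    D = kl_divergence(p, q)
    H = -sum(pi * math.log2(pi) for pi in p)
    # D(p||q) = log2(n) - H(X), so D >= 0 is exactly H(X) <= log2(n)
    print(f"D(p||q) = {D:.4f} >= 0,  H(X) = {H:.4f} <= log2(n) = {math.log2(n):.4f}")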
Q.1.3 The plots are given below:
[Figure: plots of y = ln(x) and y = x − 1 for 0 ≤ x ≤ 3.5 (y-axis from −3 to 3); the line y = x − 1 lies above ln(x), touching it only at x = 1.]
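The figure can be reproduced with a short matplotlib script (axis ranges taken from the original plot):

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0.05, 3.5, 200)
    plt.plot(x, np.log(x), label="y = ln(x)")
    plt.plot(x, x - 1, label="y = x - 1")
    plt.ylim(-3, 3)
    plt.legend()
    plt.grid(True)
    plt.show()  # y = x - 1 stays above ln(x), touching it only at x = 1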
Q 1.4 Consider two probability distributions: $\{p_0, p_1, \dots, p_{K-1}\}$ and $\{q_0, q_1, \dots, q_{K-1}\}$.

We have

$$\sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} = \frac{1}{\ln 2} \sum_{k=0}^{K-1} p_k \ln \frac{q_k}{p_k}.$$

Using $\ln x \le x - 1$,

$$\sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} \le \frac{1}{\ln 2} \sum_{k=0}^{K-1} p_k \left(\frac{q_k}{p_k} - 1\right) = \frac{1}{\ln 2} \sum_{k=0}^{K-1} (q_k - p_k) = \frac{1}{\ln 2}\left(\sum_{k=0}^{K-1} q_k - \sum_{k=0}^{K-1} p_k\right) = 0$$

Thus,

$$\sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} \le 0. \qquad (1)$$

Now,

$$I(X;Y) = \sum_{i=1}^{n} \sum_{j=1}^{m} P(x_i, y_j) \log_2 \frac{P(x_i, y_j)}{P(x_i)P(y_j)} \qquad (2)$$
From (1) and (2) we can conclude (after basic manipulations) that $I(X;Y) \ge 0$. The equality holds if and only if $P(x_i, y_j) = P(x_i)P(y_j)$, i.e., when the input and output symbols of the channel are statistically independent.
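A small numerical illustration of (2): the Python sketch below evaluates I(X;Y) for a hypothetical 2x2 joint distribution (the numbers are made up for illustration) and confirms it is non-negative:

    import math

    # Hypothetical joint distribution P(x_i, y_j); rows index x, columns index y
    P_xy = [[0.30, 0.10],
            [0.15, 0.45]]

    P_x = [sum(row) for row in P_xy]        # marginal of X
    P_y = [sum(col) for col in zip(*P_xy)]  # marginal of Y

    I = sum(P_xy[i][j] * math.log2(P_xy[i][j] / (P_x[i] * P_y[j]))
            for i in range(2) for j in range(2) if P_xy[i][j] > 0)
    print(f"I(X;Y) = {I:.4f} bits  (>= 0)")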
Q.1.5 Source X has an infinitely large set of outputs with $P(x_i) = 2^{-i}$, $i = 1, 2, 3, \dots$

$$H(X) = \sum_{i=1}^{\infty} p(x_i) \log_2 \frac{1}{p(x_i)} = -\sum_{i=1}^{\infty} 2^{-i} \log_2 2^{-i} = \sum_{i=1}^{\infty} i \cdot 2^{-i} = 2 \text{ bits}$$
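The series converges quickly, as the following short Python check shows (the truncation point is arbitrary):

    # Partial sum of H(X) = sum_{i>=1} i * 2^(-i), which converges to 2 bits
    H = sum(i * 2.0 ** (-i) for i in range(1, 61))
    print(f"H(X) ~= {H:.12f} bits")  # -> 2.000000000000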
Q.1.6 Given: $P(x_i) = p(1-p)^{i-1}$, $i = 1, 2, 3, \dots$

$$H(X) = -\sum_{i=1}^{\infty} p(1-p)^{i-1} \log_2\left[p(1-p)^{i-1}\right]$$
$$= -\sum_{i=1}^{\infty} p(1-p)^{i-1} \left[\log_2 p + (i-1)\log_2(1-p)\right]$$
$$= -p\log_2 p \sum_{i=1}^{\infty}(1-p)^{i-1} - p\log_2(1-p)\sum_{i=1}^{\infty}(i-1)(1-p)^{i-1}$$
$$= -p\log_2 p \cdot \frac{1}{p} - p\log_2(1-p)\cdot\frac{1-p}{p^2}$$
$$= -\log_2 p - \frac{1-p}{p}\log_2(1-p) = \frac{-p\log_2 p - (1-p)\log_2(1-p)}{p} = \frac{1}{p} H(p) \text{ bits,}$$

where $H(p)$ is the binary entropy function.
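A numerical sanity check of the closed form, sketched in Python for an illustrative value of p (any 0 < p < 1 works; the truncation point N is arbitrary):

    import math

    p, N = 0.3, 500   # illustrative parameter and truncation point

    # Truncated direct evaluation of H(X) = -sum_i P(x_i) log2 P(x_i)
    H_direct = -sum(p * (1 - p) ** (i - 1) * math.log2(p * (1 - p) ** (i - 1))
                    for i in range(1, N + 1))

    # Closed form H(p)/p, with H(p) the binary entropy function
    H_p = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    print(f"direct: {H_direct:.6f}, closed form: {H_p / p:.6f}")  # both ~2.9377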
Q 1.7 Hint: Same approach as the previous two problems.
Q 1.8 Yes, it is a uniquely decodable code because each symbol is coded uniquely.
Q 1.9 The relative entropy or Kullback-Leibler distance between two probability mass functions p(x) and q(x) is defined as

$$D(p\|q) = \sum_{x \in X} p(x) \log_2 \frac{p(x)}{q(x)}. \qquad (1.76)$$

(i) Show that $D(p\|q)$ is non-negative.

Solution:

$$-D(p\|q) = -\sum_{x \in X} p(x)\log_2\frac{p(x)}{q(x)} = \sum_{x \in X} p(x)\log_2\frac{q(x)}{p(x)}$$
$$\le \log_2 \sum_{x \in X} p(x)\,\frac{q(x)}{p(x)} \quad \text{(from Jensen's inequality for a concave function: } E[f(X)] \le f(E[X])\text{)}$$
$$= \log_2 \sum_{x \in X} q(x) = \log_2(1) = 0.$$

Thus $-D(p\|q) \le 0$, or $D(p\|q) \ge 0$.
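The Jensen step can be seen numerically in a small Python sketch (the two distributions are made up for illustration):

    import math

    p = [0.5, 0.3, 0.2]
    q = [0.2, 0.5, 0.3]

    # E[log2 W] with W = q(x)/p(x), weighted by p(x): this equals -D(p||q)
    lhs = sum(pi * math.log2(qi / pi) for pi, qi in zip(p, q))
    # log2 E[W] = log2(sum_x q(x)) = log2(1) = 0
    rhs = math.log2(sum(pi * (qi / pi) for pi, qi in zip(p, q)))
    print(f"-D(p||q) = {lhs:.4f} <= {rhs:.4f}")  # e.g. -0.3229 <= 0.0000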