REVISITING SHANNON ENTROPY
We define Shannon entropy for a random variable X, which takes the values
{x1, x2, ..., xn} with probability {p1, p2, ..., pn}, as

H(X) = -Σ_i p_i log2(p_i)

Shannon entropy quantifies the amount of information needed to transmit or
store information.
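As a quick numerical sketch (our own illustration, not part of the notes; the function name is ours), the definition above can be computed directly:

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum_i p_i * log2(p_i), in bits; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example: a distribution over four values.
print(shannon_entropy([1/2, 1/4, 1/8, 1/8]))  # 1.75
```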
ALICE -> xi -> BOB

X = {x1, x2, x3, x4}, where xi occurs with probability {p1, p2, p3, p4}.
How many bits does this need? The answer is she needs H(X) bits. A quick
visual check tells us that naively Alice would have needed 2 bits to encode
the four values x1, x2, x3, and x4. How? 2 bits give her 2^2 = 4
possible values {00, 01, 10, 11}, which she can use for {x1, x2, x3, x4}.
But Alice can get away with using only H(X) bits on average, where
H(X) = -Σ_i p_i log2(p_i), with p_i the probability of x_i occurring.
This is enshrined in Shannon's noiseless coding theorem.
Let us look at the example in class, where {p1, p2, p3, p4} = {1/2, 1/4, 1/8, 1/8}.
Alice encodes x1 (p1 = 1/2) with a single bit 0, x2 (p2 = 1/4) with
two bits 10, and similarly x3 and x4 (p3 = p4 = 1/8) with three bits each, as 110 and 111.
Why does she do this? She needs to make sure Bob can
identify all four xi's. If she chooses 0 and 1 for x1 and x2, she will
not be left with any two-bit message that can distinguish x3 and x4 from x1 and x2.
For example, if she chooses x1 = 0, x2 = 1, x3 = 10, x4 = 11, then it is
possible Bob will confuse x3 with the message x2 x1, since both read "10".
You get the drift.
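To see the ambiguity concretely, here is a small sketch (our own illustration, not from the notes) that counts how many ways a bit string can be split into codewords:

```python
def parses(bits, code):
    """Count the distinct ways `bits` splits into a sequence of codewords."""
    if not bits:
        return 1
    return sum(parses(bits[len(w):], code)
               for w in code.values() if bits.startswith(w))

# The bad choice from the notes: "1" is a prefix of "10" and "11".
BAD = {"x1": "0", "x2": "1", "x3": "10", "x4": "11"}
print(parses("10", BAD))  # 2: either x3, or x2 followed by x1

# Alice's actual code: no codeword is a prefix of another.
GOOD = {"x1": "0", "x2": "10", "x3": "110", "x4": "111"}
print(parses("10", GOOD))  # 1: only x2
```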
x1 -> 0
x2 -> 10
x3 -> 110
x4 -> 111

If the first bit Bob gets is 1, he knows it is not x1; else it is x1. If the
second bit is also 1, he knows it is not x1 or x2; else it is x2. If the
third bit is also 1, he knows it is not x1, x2, or x3, so it is x4; else it
is x3.
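Bob's bit-by-bit decoding can be sketched as follows (a hypothetical illustration; the dictionary and function names are ours):

```python
# Alice's prefix-free code from the notes.
CODE = {"x1": "0", "x2": "10", "x3": "110", "x4": "111"}
DECODE = {v: k for k, v in CODE.items()}

def decode(bits):
    """Read bits left to right; emit a symbol as soon as a codeword matches.

    Because no codeword is a prefix of another, the first match is unambiguous.
    """
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in DECODE:
            out.append(DECODE[buf])
            buf = ""
    return out

print(decode("0101101110"))  # ['x1', 'x2', 'x3', 'x4', 'x1']
```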
So, now what is the average size of her message?

1/2 x 1 + 1/4 x 2 + 1/8 x 3 + 1/8 x 3 = 7/4 < 2

(We can get away with less than 2 bits on average!)
Ah, and H(X) = -(1/2) log2(1/2) - (1/4) log2(1/4) - (1/8) log2(1/8) - (1/8) log2(1/8)
= 7/4, exactly the average message length.
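As a sanity check on the arithmetic above, a short sketch (our own; variable names are assumptions) comparing the average code length with H(X):

```python
import math

# The in-class distribution and Alice's codeword lengths.
probs = {"x1": 1/2, "x2": 1/4, "x3": 1/8, "x4": 1/8}
lengths = {"x1": 1, "x2": 2, "x3": 3, "x4": 3}

avg_len = sum(p * lengths[s] for s, p in probs.items())
entropy = -sum(p * math.log2(p) for p in probs.values())

print(avg_len, entropy)  # 1.75 1.75 -- the code meets the entropy bound exactly
```

The lengths match exactly here because every probability is a power of 1/2; in general the noiseless coding theorem only guarantees an average length within one bit of H(X) per symbol.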