INFORMATION THEORY & CODING
7/30/2022
Module 2 - Channels and Channel Coding
• Discrete memoryless channels. Capacity of discrete memoryless
channels. Binary symmetric channels (BSC), Binary Erasure
channels (BEC). Capacity of BSC and BEC. Channel code. Rate of
channel code. Shannon's channel coding theorem (both
achievability and converse without proof) and operational meaning
of channel capacity.
• Modeling of Additive White Gaussian channels. Continuous-input
channels with average power constraint. Differential entropy.
Differential Entropy of Gaussian random variable. Relation
between differential entropy and entropy. Shannon-Hartley theorem
(with proof – mathematical subtleties regarding power constraint
may be overlooked).
• Inferences from the Shannon-Hartley theorem – spectral efficiency
versus SNR per bit, power-limited and bandwidth-limited regions,
Shannon limit, Ultimate Shannon limit.
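Several quantities in this module have closed forms that can be checked numerically: the BSC capacity C = 1 - H2(p), the BEC capacity C = 1 - ε, the Shannon-Hartley capacity C = B log2(1 + SNR), and the ultimate Shannon limit Eb/N0 = ln 2 ≈ -1.59 dB. A minimal Python sketch (the function names are illustrative, not from the course material):

```python
import math

def binary_entropy(p):
    """H2(p) = -p*log2(p) - (1-p)*log2(1-p), with H2(0) = H2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H2(p) bits/use."""
    return 1.0 - binary_entropy(p)

def bec_capacity(eps):
    """Capacity of a binary erasure channel: C = 1 - eps bits/use."""
    return 1.0 - eps

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley: C = B * log2(1 + S/N) bits/sec."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def ebn0_limit_db(eta):
    """Minimum Eb/N0 (in dB) at spectral efficiency eta = C/B,
    solved from eta = log2(1 + eta * Eb/N0)."""
    return 10.0 * math.log10((2.0 ** eta - 1.0) / eta)

# As eta -> 0, (2**eta - 1)/eta -> ln 2, so ebn0_limit_db approaches
# 10*log10(ln 2) ~= -1.59 dB: the ultimate Shannon limit.
```

Note how `ebn0_limit_db` grows with eta (the bandwidth-limited region needs more power per bit), while small eta sits in the power-limited region near -1.59 dB.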
RATE OF INFORMATION TRANSMISSION OVER A
DISCRETE CHANNEL
• We have the entropy of the input symbols given by
H(X) = - Σ_i p(x_i) log2 p(x_i)  bits/symbol
• Consider a discrete memoryless channel accepting symbols at
the rate of r_s message symbols/sec.
• The average rate at which information is going into the
channel is given by
R = r_s H(X)  bits/sec
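As a numerical illustration of the rate R = r_s H(X) (a sketch; the symbol probabilities and symbol rate below are made up for the example):

```python
import math

def entropy_bits(probs):
    """H(X) = -sum_i p_i * log2(p_i), in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_rate(symbol_rate, probs):
    """Average information rate into the channel: R = r_s * H(X), bits/sec."""
    return symbol_rate * entropy_bits(probs)

# Example: four symbols with probabilities 1/2, 1/4, 1/8, 1/8 give
# H(X) = 1.75 bits/symbol; at r_s = 1000 symbols/sec, R = 1750 bits/sec.
```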
• At the receiver, it is not possible to reconstruct the input
symbol sequence with certainty by operating on the received
sequence.
• This is due to errors introduced when the signals pass through
the channel.
• Some amount of information is lost in the channel due to
noise.
• This information lost in the channel is called the
equivocation H(X/Y). Hence the net amount of information
transferred, which is the mutual information I(X;Y), is given by
I(X;Y) = H(X) - H(X/Y)  bits/symbol
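The relation I(X;Y) = H(X) - H(X/Y) can be checked numerically. For a BSC with equiprobable inputs and crossover probability p, H(X) = 1 bit and the equivocation is H(X/Y) = H2(p). A sketch under those assumptions (the setup is illustrative, not from the slides):

```python
import math

def h2(p):
    """Binary entropy function H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(crossover):
    """I(X;Y) = H(X) - H(X/Y) for a BSC with equiprobable inputs:
    H(X) = 1 bit/symbol, and the equivocation H(X/Y) = H2(p)."""
    h_x = 1.0
    equivocation = h2(crossover)
    return h_x - equivocation

# A noiseless channel (p = 0) loses nothing: I(X;Y) = 1 bit/symbol.
# A useless channel (p = 0.5) loses everything: I(X;Y) = 0.
```

With equiprobable inputs this I(X;Y) equals the BSC capacity 1 - H2(p), tying the equivocation picture back to the capacity results of this module.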