1) Describe the significance of the conditional entropy H(X|Y) of a communication system, where X is the transmitter and Y is the receiver.
2) An event has six possible outcomes with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/32. Determine the entropy of the system.
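As a quick numeric check, the entropy can be evaluated directly (a minimal Python sketch; the probabilities are those stated in the question):

```python
import math

# Six outcome probabilities from the question (they sum to 1).
probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]

# Shannon entropy: H = -sum(p * log2(p)), in bits.
H = -sum(p * math.log2(p) for p in probs)
print(H)  # 1.9375 bits
```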
3) Describe the source coding theorem in detail, write down the advantages and disadvantages of channel coding, and describe data compaction.
4) Describe the properties of entropy and, with an appropriate example, describe the entropy of a binary memoryless source.
5) The five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = [S0, S1, S2, S3, S4]; P[S] = [0.4, 0.2, 0.2, 0.1, 0.1]. Encode the symbols using Huffman coding.
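A minimal Huffman encoder in Python (a sketch, not part of the question; tie-breaking among equal probabilities can change individual codeword lengths, but every valid Huffman code for this source has the same optimal average length of 2.2 bits/symbol):

```python
import heapq
from itertools import count

def huffman(probs):
    """Build a binary Huffman code; returns {symbol: codeword}."""
    tiebreak = count()  # keeps heap entries comparable when probabilities tie
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        # prefix 0 to one subtree's codewords and 1 to the other's
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

source = {"S0": 0.4, "S1": 0.2, "S2": 0.2, "S3": 0.1, "S4": 0.1}
codes = huffman(source)
avg_len = sum(p * len(codes[s]) for s, p in source.items())  # 2.2 bits/symbol
```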
6) Write detailed notes on differential entropy. Deduce the channel capacity theorem and describe the implications of the information capacity theorem.
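The information capacity theorem (Shannon-Hartley) can be illustrated numerically; the 3 kHz / 30 dB figures below are illustrative choices, not values from the question:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Information capacity theorem: C = B * log2(1 + S/N) bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative values: a 3 kHz channel at 30 dB SNR (S/N = 1000).
C = channel_capacity(3000, 1000)  # about 29.9 kbit/s
```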
7) What do you understand by a binary symmetric channel? Deduce the channel capacity formula for a symmetric channel.
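For reference, the resulting BSC capacity formula C = 1 - H(p), with H the binary entropy function, can be evaluated with a short sketch:

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - H(p), in bits per channel use."""
    if p in (0.0, 1.0):
        return 1.0  # deterministic channel: noiseless or exactly inverted
    Hb = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - Hb

C_half = bsc_capacity(0.5)   # 0.0: a fully random channel carries nothing
C_clean = bsc_capacity(0.0)  # 1.0: one error-free bit per channel use
```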
8) Construct a binary optimal code for the symbol probabilities given below using the Huffman procedure, and compute the source entropy, average code length, efficiency, redundancy, and variance: 0.2, 0.18, 0.12, 0.1, 0.1, 0.08, 0.06, 0.06, 0.06, 0.04.
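The entropy, average length, efficiency, and redundancy asked for can be checked numerically. A sketch using the fact that the Huffman average length equals the sum of all merged-node probabilities (variance is omitted here, since individual codeword lengths depend on tie-breaking):

```python
import heapq
import math

p = [0.2, 0.18, 0.12, 0.1, 0.1, 0.08, 0.06, 0.06, 0.06, 0.04]

# Source entropy in bits/symbol.
H = -sum(x * math.log2(x) for x in p)

# Huffman average codeword length: each merge of two subtrees adds one
# bit to every codeword beneath it, so L = sum of merged probabilities.
heap = list(p)
heapq.heapify(heap)
L = 0.0
while len(heap) > 1:
    a, b = heapq.heappop(heap), heapq.heappop(heap)
    L += a + b
    heapq.heappush(heap, a + b)

efficiency = H / L          # close to 1 for a good code
redundancy = 1 - efficiency
```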
9) State and prove the continuous channel capacity theorem.
10) Encode the source given below using the Shannon-Fano and Huffman coding procedures, and compare the results.
X      X1     X2     X3     X4     X5
P(X)   0.3    0.1    0.4    0.08   0.12
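A Shannon-Fano sketch in Python for the comparison (sort by descending probability, then recursively split where the two groups' total probabilities are most nearly equal; for this particular source both Shannon-Fano and Huffman come out at an average length of 2.08 bits/symbol):

```python
def shannon_fano(source):
    """source: dict {symbol: prob}. Returns {symbol: codeword}."""
    def build(items, prefix, codes):
        if len(items) == 1:
            codes[items[0][0]] = prefix or "0"
            return
        total = sum(p for _, p in items)
        # choose the split where the two halves' probabilities are closest
        acc, best_diff, split = 0.0, float("inf"), 1
        for i in range(1, len(items)):
            acc += items[i - 1][1]
            diff = abs(2 * acc - total)
            if diff < best_diff:
                best_diff, split = diff, i
        build(items[:split], prefix + "0", codes)
        build(items[split:], prefix + "1", codes)

    codes = {}
    build(sorted(source.items(), key=lambda t: -t[1]), "", codes)
    return codes

source = {"X1": 0.3, "X2": 0.1, "X3": 0.4, "X4": 0.08, "X5": 0.12}
codes = shannon_fano(source)
avg_len = sum(p * len(codes[s]) for s, p in source.items())  # 2.08 bits
```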
11) a) Encode the source given below using Shannon-Fano coding:
X      X1     X2     X3     X4     X5     X6     X7
P(X)   0.4    0.2    0.12   0.08   0.08   0.08   0.04
b) Describe the Huffman coding algorithm with a suitable example and compare it with other types of coding in detail.