Problem 1) a) What do you mean by information? What are its units? How does it relate to entropy?
b) Suppose we have ten messages with probabilities P(m1) = 0.49, P(m2) = 0.14, P(m3) = 0.14, P(m4) = 0.07, P(m5) = 0.07, P(m6) = 0.04, P(m7) = 0.02, P(m8) = 0.02, P(m9) = 0.005, P(m10) = 0.005. Determine the Shannon-Fano code for this set of messages, and determine the coding efficiency and redundancy.
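A numerical cross-check for part (b) can be sketched as below. It implements the standard Shannon-Fano construction (sort by descending probability, split where the two halves' total probabilities are most nearly equal, recurse), then computes the entropy, mean codeword length, efficiency H/L, and redundancy 1 - H/L:

```python
import math

def shannon_fano(symbols):
    """Shannon-Fano coding: given (name, prob) pairs sorted by descending
    probability, find the split where the two halves' total probabilities
    are most nearly equal, prefix '0'/'1', and recurse into each half."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    run, best_i, best_diff = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        run += symbols[i - 1][1]
        diff = abs(run - (total - run))
        if diff < best_diff:
            best_diff, best_i = diff, i
    codes = {s: "0" + c for s, c in shannon_fano(symbols[:best_i]).items()}
    codes.update({s: "1" + c for s, c in shannon_fano(symbols[best_i:]).items()})
    return codes

messages = [("m1", 0.49), ("m2", 0.14), ("m3", 0.14), ("m4", 0.07),
            ("m5", 0.07), ("m6", 0.04), ("m7", 0.02), ("m8", 0.02),
            ("m9", 0.005), ("m10", 0.005)]
codes = shannon_fano(messages)
H = -sum(p * math.log2(p) for _, p in messages)   # source entropy, bits/message
L = sum(p * len(codes[m]) for m, p in messages)   # mean codeword length, bits
efficiency = H / L
redundancy = 1 - efficiency
for m, _ in messages:
    print(f"{m:>3}: {codes[m]}")
print(f"H = {H:.4f} bits, L = {L:.4f}, "
      f"efficiency = {efficiency:.4f}, redundancy = {redundancy:.4f}")
```

With these probabilities the split yields codeword lengths 1, 3, 3, 4, 4, 4, 5, 6, 7, 7, giving L = 2.34 bits and an efficiency above 99%.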
Problem 2) A transmitter has an alphabet of five letters (x1, x2, x3, x4, x5) and the receiver has an alphabet of four letters (y1, y2, y3, y4). The source probabilities are P(x1) = 0.25, P(x2) = 0.4, P(x3) = 0.15, P(x4) = 0.15, P(x5) = 0.05, and
P(Y|X) =
        y1     y2     y3     y4
x1      1      0      0      0
x2      0.25   0.75   0      0
x3      0      1/3    2/3    0
x4      0      0      1/3    2/3
x5      0      0      1      0
Compute the entropies H(X), H(Y), H(X, Y), H(X|Y), H(Y|X) and the mutual information I(X; Y).
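The required quantities can be cross-checked numerically. The sketch below builds the joint distribution P(x, y) = P(x) P(y|x) from the data above, then derives the conditional entropies and mutual information from the chain-rule identities H(X|Y) = H(X, Y) - H(Y) and I(X; Y) = H(X) + H(Y) - H(X, Y):

```python
import numpy as np

px = np.array([0.25, 0.40, 0.15, 0.15, 0.05])     # P(x), rows x1..x5
pyx = np.array([[1.00, 0.00, 0.00, 0.00],         # P(y|x), cols y1..y4
                [0.25, 0.75, 0.00, 0.00],
                [0.00, 1/3,  2/3,  0.00],
                [0.00, 0.00, 1/3,  2/3 ],
                [0.00, 0.00, 1.00, 0.00]])
pxy = px[:, None] * pyx        # joint distribution P(x, y)
py = pxy.sum(axis=0)           # receiver marginal P(y)

def H(p):
    """Entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X, H_Y, H_XY = H(px), H(py), H(pxy.ravel())
H_X_given_Y = H_XY - H_Y       # equivocation H(X|Y)
H_Y_given_X = H_XY - H_X       # noise entropy H(Y|X)
I_XY = H_X + H_Y - H_XY        # mutual information I(X; Y)
print(f"H(X) = {H_X:.4f}, H(Y) = {H_Y:.4f}, H(X,Y) = {H_XY:.4f}")
print(f"H(X|Y) = {H_X_given_Y:.4f}, H(Y|X) = {H_Y_given_X:.4f}, "
      f"I(X;Y) = {I_XY:.4f}")
```

The receiver marginal comes out as P(y) = (0.35, 0.35, 0.2, 0.1), which is a quick sanity check before computing the entropies.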
Problem 3) a) State and explain the sampling theorem in detail. How is it useful in communication systems?
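As an illustration of part (a), the sketch below samples a sine wave above and below its Nyquist rate and locates the dominant frequency in the sampled spectrum; the 3 Hz tone and the two sampling rates are arbitrary example values:

```python
import numpy as np

f = 3.0                          # example tone frequency, Hz (Nyquist rate 6 Hz)
for fs in (8.0, 4.0):            # one rate above, one below the Nyquist rate
    n = np.arange(64)
    x = np.sin(2 * np.pi * f * n / fs)          # sampled sinusoid
    spectrum = np.abs(np.fft.rfft(x))
    peak_hz = np.argmax(spectrum) * fs / len(n)  # bin index -> frequency in Hz
    print(f"fs = {fs} Hz -> dominant frequency {peak_hz:.2f} Hz")
```

At fs = 8 Hz the tone is recovered at its true 3 Hz; at fs = 4 Hz it aliases down to |3 - 4| = 1 Hz, which is exactly the distortion the sampling theorem warns about.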
b) Describe the Shannon-Hartley theorem for channel capacity. How does the channel capacity change if the bandwidth is increased to infinity? Discuss the performance of orthogonal signalling on the basis of this theorem.
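For part (b), the Shannon-Hartley capacity C = B log2(1 + S/(N0 B)) can be evaluated for growing bandwidth; the ratio S/N0 = 1000 Hz below is an arbitrary example value. The point of the exercise is that as B grows, C does not grow without bound but saturates at (S/N0) log2 e:

```python
import math

S_over_N0 = 1000.0   # example signal power / noise PSD ratio, in Hz

def capacity(B):
    """Shannon-Hartley capacity in bit/s for an AWGN channel of bandwidth B Hz,
    with noise power N = N0 * B."""
    return B * math.log2(1 + S_over_N0 / B)

for B in (100.0, 1e3, 1e4, 1e6, 1e9):
    print(f"B = {B:12.0f} Hz -> C = {capacity(B):8.1f} bit/s")

limit = S_over_N0 * math.log2(math.e)   # B -> infinity limit, ~1.443 * S/N0
print(f"B -> infinity limit: {limit:.1f} bit/s")
```

The printed capacities increase monotonically with B but converge to the limit of roughly 1442.7 bit/s, i.e. extra bandwidth yields diminishing returns once N0 B dominates S.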
Problem 4) Describe the significance of the following:
a) Companding.
b) Prediction filters.
c) Adaptive filters.
d) Equalization.
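For item (a), a small sketch of mu-law companding (the mu = 255 law used in G.711 telephony) shows the idea: the compressor boosts low-level samples before uniform quantization, and the expander inverts the mapping at the receiver:

```python
import math

MU = 255.0   # mu-law parameter used in North American telephony (G.711)

def mu_compress(x):
    """mu-law compressor for x in [-1, 1]: near-logarithmic, so small
    amplitudes are boosted relative to large ones before quantization."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_expand(y):
    """Inverse of mu_compress (the expander at the receiver)."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

for x in (0.01, 0.1, 0.5, 1.0):
    y = mu_compress(x)
    print(f"x = {x:5.2f} -> compressed = {y:.3f} -> expanded = {mu_expand(y):.3f}")
```

A 0.01 input is mapped to roughly 0.23 of full scale, so it occupies far more quantizer steps than it would under uniform quantization, which is exactly why companding improves the SNR of weak speech signals.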