1. It may come as a surprise, but the number of bits needed to store text is much less than that required to store its spoken equivalent. Can you explain the reason for this statement?
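A back-of-envelope calculation suggests why: digitized speech needs thousands of bits per second, while text needs only a few bits per character. The sketch below is purely illustrative; the sampling rate, speaking rate, and character encoding are assumed figures, not values from the problem.

```python
# Illustrative comparison of storage for one minute of speech vs. its transcript.
# Assumptions (not from the text): telephone-quality PCM speech at
# 8000 samples/s x 8 bits/sample, speech at ~120 words/min, ~6 characters
# per word (including spaces), text stored as 8-bit characters.
seconds = 60
speech_bits = 8000 * 8 * seconds      # one minute of PCM speech
chars = 120 * 6                       # ~720 characters spoken per minute
text_bits = chars * 8                 # same minute as plain text
print(speech_bits, text_bits, speech_bits // text_bits)
```

Under these assumptions the speech waveform takes several hundred times as many bits as the text, because the waveform encodes far more than the message: voice quality, intonation, timing, and background sound.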
2. Let a discrete random variable X assume values in the set {x1, x2, …, xn}. Show that the entropy of X satisfies the inequality
H(X) ≤ log n
with equality if, and only if, the probability pi = 1/n for all i.
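As a quick numerical check of the bound (not a proof), the sketch below computes H(X) for a uniform and a non-uniform distribution over n = 4 outcomes, taking the logarithm to base 2 so that entropy is measured in bits:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p_i * log2 p_i), in bits; 0*log 0 taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 4
uniform = [1 / n] * n
skewed = [0.7, 0.1, 0.1, 0.1]

print(entropy(uniform), math.log2(n))  # uniform distribution attains log n
print(entropy(skewed))                 # any non-uniform distribution falls below log n
```

The uniform case attains H(X) = log n exactly, while the skewed case falls strictly below it, consistent with the claimed equality condition pi = 1/n.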