Concept:
If a message source produces symbols xi independently with probability mass function P(X = xi), then the entropy of the source is defined as:
\(H = \mathop \sum \limits_i P\left( {X = {x_i}} \right){\log _2}\frac{1}{{P\left( {X = {x_i}} \right)}}\) bits
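As a quick illustration, here is a minimal Python sketch of this definition (the function name and the example probabilities are illustrative assumptions, not part of the question):

```python
import math

def entropy(pmf):
    """Source entropy in bits/symbol: H = sum_i P(x_i) * log2(1 / P(x_i))."""
    return sum(p * math.log2(1.0 / p) for p in pmf if p > 0)

print(entropy([0.5, 0.5]))  # fair binary source -> 1.0 bit/symbol
```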
The Nyquist sampling rate is defined as twice the maximum signal frequency, i.e.
N.R. = 2 × fmax
Also, the Information rate is defined as:
\(I = \left( {H \times Symbol\;rate} \right)\) bits/sec
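The remaining two definitions can be sketched the same way (the helper names and the example values below are assumptions for illustration only):

```python
def nyquist_rate(f_max_hz):
    """Nyquist sampling rate in Hz: twice the maximum signal frequency."""
    return 2 * f_max_hz

def information_rate(entropy_bits_per_symbol, symbol_rate):
    """Information rate in bits/sec: I = H * symbol rate."""
    return entropy_bits_per_symbol * symbol_rate

print(nyquist_rate(3.5e3))           # -> 7000.0 Hz
print(information_rate(1.0, 7.0e3))  # 1 bit/symbol at 7 k symbols/sec -> 7000.0 bits/sec
```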
Calculation:
The source emits N symbols with probabilities 1/2, 1/4, 1/8, …, 1/2^(N-1), 1/2^(N-1); the last two symbols are equiprobable, so the probabilities sum to 1. The entropy will be:
\(H = \frac{1}{2}{\log _2}2 + \frac{1}{4}{\log _2}4 + \frac{1}{8}{\log _2}8 + \ldots + \frac{1}{{{2^{N - 1}}}}{\log _2}{2^{N - 1}} + \frac{1}{{{2^{N - 1}}}}{\log _2}{2^{N - 1}}\)
\(H = \frac{1}{2} + \frac{2}{4} + \frac{3}{8} + \ldots \frac{{N - 2}}{{{2^{N - 2}}}} + \frac{{N - 1}}{{{2^{N - 1}}}} + \frac{{N - 1}}{{{2^{N - 1}}}}\) ---(1)
Dividing the above equation by 2, we get:
\(\frac{H}{2} = \frac{1}{4} + \frac{2}{8} + \frac{3}{{16}} + \ldots \frac{{N - 2}}{{{2^{N - 1}}}} + \frac{{N - 1}}{{{2^N}}} + \frac{{N - 1}}{{{2^N}}}\) ---(2)
Subtracting equation (2) from (1), we get:
\(\frac{H}{2} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{{16}} + \ldots + \frac{1}{{{2^{N - 1}}}} + \left\{ {\frac{{N - 1}}{{{2^{N - 1}}}} - \frac{{N - 1}}{{{2^N}}} - \frac{{N - 1}}{{{2^N}}}} \right\}\)
\(H = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \ldots + \frac{1}{{{2^{N - 2}}}} + \left\{ {\frac{{N - 1}}{{{2^{N - 2}}}} - \frac{{N - 1}}{{{2^{N - 1}}}} - \frac{{N - 1}}{{{2^{N - 1}}}}} \right\}\)
\( = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \ldots + \frac{1}{{{2^{N - 2}}}}\) (the bracketed term vanishes, since \(\frac{{N - 1}}{{{2^{N - 2}}}} = \frac{{N - 1}}{{{2^{N - 1}}}} + \frac{{N - 1}}{{{2^{N - 1}}}}\))
\(H = \mathop \sum \limits_{n = 0}^{N - 2} \frac{1}{{{2^n}}} = \frac{{1 - {{\left( {\frac{1}{2}} \right)}^{N - 1}}}}{{1 - \frac{1}{2}}}\)
\( = 2 \times \left( {1 - {{\left( {\frac{1}{2}} \right)}^{N - 1}}} \right) = \frac{{{2^{N - 1}} - 1}}{{{2^{N - 2}}}}\)
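As a sanity check, the closed form can be compared numerically against a direct evaluation of the entropy sum; the sketch below uses the symbol probabilities 1/2, 1/4, …, 1/2^(N-1), 1/2^(N-1) stated above:

```python
import math

def entropy_direct(N):
    """Direct sum over the pmf {1/2, 1/4, ..., 1/2^(N-1), 1/2^(N-1)}."""
    pmf = [2.0 ** -n for n in range(1, N)] + [2.0 ** -(N - 1)]
    return sum(p * math.log2(1.0 / p) for p in pmf)

def entropy_closed_form(N):
    """Closed form derived above: H = (2^(N-1) - 1) / 2^(N-2)."""
    return (2 ** (N - 1) - 1) / 2 ** (N - 2)

for N in (3, 5, 8, 12):
    print(N, entropy_direct(N), entropy_closed_form(N))
# Both columns agree; e.g. N = 8 gives 1.984375 bits/symbol.
```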
The maximum frequency of the voice signal = 3.5 kHz
The Nyquist sampling rate will be:
NR = 2 × 3.5 kHz
NR = 7 kHz
Since the signal is sampled at twice the Nyquist rate, the sampling frequency (and hence the symbol rate) will be:
\({f_s} = 2 \times 7 = 14\;k\;samples/sec\)
The information rate will now be:
\(I = \left( {\frac{{{2^{N - 1}} - 1}}{{{2^{N - 2}}}} \times 14} \right)\) kbps
For N = 8, the information rate will be:
\(I = \left( {\frac{{{2^7} - 1}}{{{2^6}}} \times 14} \right) \approx 27.78\;\)kbits/sec
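The same figure can be reproduced numerically (a sketch; 14 × 10³ symbols/sec is the symbol rate obtained above):

```python
H = (2 ** 7 - 1) / 2 ** 6   # entropy for N = 8: 127/64 = 1.984375 bits/symbol
symbol_rate = 14e3          # symbols per second, from the sampling step above
print(H * symbol_rate)      # -> 27781.25 bits/sec, i.e. about 27.78 kbits/sec
```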
Alternate Method:
\(H = \frac{1}{2}{\log _2}2 + \frac{1}{4}{\log _2}4 + \frac{1}{8}{\log _2}8 + \ldots + \frac{1}{{{2^{N - 1}}}}{\log _2}{2^{N - 1}} + \frac{1}{{{2^{N - 1}}}}{\log _2}{2^{N - 1}}\)
\(H = \mathop \sum \nolimits_{n = 1}^{N - 1} \frac{n}{{{2^n}}} + \frac{{N - 1}}{{{2^{N - 1}}}}\) ---(1)
\(\frac{H}{2} = \mathop \sum \nolimits_{n = 1}^{N - 1} \frac{n}{{{2^{n + 1}}}} + \frac{{N - 1}}{{{2^N}}}\) ---(2)
Subtracting (2) from (1), we get:
\(\frac{H}{2} = \left( {\frac{1}{2} + \mathop \sum \limits_{n = 2}^{N - 1} \frac{1}{{{2^n}}}} \right)\;\; + \left\{ {\frac{{N - 1}}{{{2^{N - 1}}}} - \frac{{N - 1}}{{{2^N}}} - \frac{{N - 1}}{{{2^N}}}} \right\}\)
\( = \mathop \sum \limits_{n = 1}^{N - 1} \frac{1}{{{2^n}}}\; + 0\)
\(H = \mathop \sum \limits_{n = 1}^{N - 1} \frac{1}{{{2^{n - 1}}}}\)
\(H = \mathop \sum \limits_{n = 0}^{N - 2} \frac{1}{{{2^n}}} = \frac{{1 - {{\left( {\frac{1}{2}} \right)}^{N - 1}}}}{{1 - \frac{1}{2}}} = 2 \times \left( {1 - {{\left( {\frac{1}{2}} \right)}^{N - 1}}} \right)\)
\(H = \frac{{{2^{N - 1}} - 1}}{{{2^{N - 2}}}}\)
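The key intermediate step above, H/2 = 1/2 + 1/4 + … + 1/2^(N-1), can also be checked numerically (a sketch, using N = 8 as an example):

```python
N = 8
H = (2 ** (N - 1) - 1) / 2 ** (N - 2)             # closed form derived above
half_H_sum = sum(2.0 ** -n for n in range(1, N))  # sum_{n=1}^{N-1} 1/2^n
print(H / 2, half_H_sum)                          # both print 0.9921875
```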