
1. Identify the entropy of the system for an event that has six possible outcomes with probabilities
1/2, 1/4, 1/8, 1/16, 1/32 and 1/32.
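As a check, the entropy can be computed directly. This sketch assumes the sixth outcome has probability 1/32, the value needed for the probabilities to sum to 1:

```python
import math

# Probabilities of the six outcomes; the sixth (1/32) is assumed so they sum to 1.
p = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
assert abs(sum(p) - 1.0) < 1e-12

# Shannon entropy in bits: H = -sum(p_i * log2(p_i))
H = -sum(pi * math.log2(pi) for pi in p)
print(H)  # 1.9375
```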
2. How would you organize the formula to show mutual information?
3. What is mutual information? State its properties.
4. State in your own words Shannon's theorem on the information capacity of a channel.
5. What approach would you use to differentiate lossy source coding from lossless source coding?
6. Can you formulate the source coding theorem?
7. Identify the properties of line coding.

8. How would you describe the channel capacity of a discrete memoryless channel?
9. Analyse the function of vertical redundancy checking.
10. Explain the Shannon-Hartley law of channel capacity.
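The Shannon-Hartley law can be evaluated numerically; the bandwidth and SNR below are illustrative values, not part of the question:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley law: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz telephone channel with 30 dB SNR (linear ratio 1000)
C = channel_capacity(3000, 1000)
print(round(C))  # 29902
```

Note that capacity grows only logarithmically with SNR but linearly with bandwidth.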
11. Distinguish PCM and DPCM.
12. Define entropy. What is the main idea of information rate?
13. Illustrate bandwidth efficiency.
14. Construct BPSK and QPSK waveforms for the input data 11100011.
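Before sketching the waveforms, it helps to map the bits to symbols. A minimal sketch, assuming NRZ-style BPSK (1 -> +1, 0 -> -1) and one common Gray-coded QPSK phase assignment (the carrier drawing itself is left to the reader):

```python
bits = [1, 1, 1, 0, 0, 0, 1, 1]  # the data 11100011

# BPSK: one bit per symbol; 1 -> +1, 0 -> -1 (carrier phase 0 or pi)
bpsk = [1 if b else -1 for b in bits]
print(bpsk)  # [1, 1, 1, -1, -1, -1, 1, 1]

# QPSK: two bits per symbol; one common Gray mapping of bit pairs to phases (degrees)
qpsk_phase = {(0, 0): 45, (0, 1): 135, (1, 1): 225, (1, 0): 315}
pairs = list(zip(bits[0::2], bits[1::2]))
phases = [qpsk_phase[p] for p in pairs]
print(pairs)   # [(1, 1), (1, 0), (0, 0), (1, 1)]
print(phases)  # [225, 315, 45, 225]
```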
15. Report on error control codes and their applications in detail.
16. Show the expression for the channel capacity of a continuous channel. Comment on the trade-off between SNR and capacity.
17. What can you say about slope overload and granular noise?
18. Write and explain Shannon's equation.
1. A source generates five messages m0, m1, m2, m3 and m4 with probabilities 0.55, 0.15, 0.15, 0.10
and 0.05 respectively. The successive messages emitted by the source are statistically independent.
Identify the code words for the messages and the coding efficiency using the Shannon-Fano algorithm.
2. Write a short note on QPSK.
3. a. Write the Huffman code for a discrete memoryless source with probability statistics
{0.1, 0.1, 0.2, 0.2, 0.4}.
b. Describe the concept of Channel coding theorem.
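For part (a), the Huffman codeword lengths can be computed with a merge-based sketch. Tie-breaking may change individual lengths, but the average code length is invariant:

```python
import heapq
import math

def huffman_lengths(probs):
    """Return Huffman codeword lengths for each probability via repeated merging."""
    # heap entries: (prob, tiebreak, list of symbol indices under this node)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # every symbol under the merged node gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

probs = [0.1, 0.1, 0.2, 0.2, 0.4]
lengths = huffman_lengths(probs)
L = sum(p * n for p, n in zip(probs, lengths))   # average length: 2.2 for any tie-break
H = -sum(p * math.log2(p) for p in probs)
print(f"average length = {L:.2f}, efficiency = {H / L:.3f}")
```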
4. A discrete source emits one of five symbols once every millisecond with probabilities 1/2,
1/4, 1/8, 1/16 and 1/16 respectively. Determine the source entropy and information rate.
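A quick numeric check, taking the fifth symbol's probability as 1/16 (the value needed for the distribution to sum to 1):

```python
import math

# Probabilities of the five symbols; the fifth is taken as 1/16 so they sum to 1.
p = [1/2, 1/4, 1/8, 1/16, 1/16]
assert sum(p) == 1.0

H = -sum(pi * math.log2(pi) for pi in p)   # entropy in bits/symbol
r = 1000                                   # one symbol per millisecond -> 1000 symbols/s
R = r * H                                  # information rate in bits/s
print(H, R)  # 1.875 1875.0
```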

5. A DMS X has four symbols x1,x2,x3,x4 with P(x1) = 1/2, P(x2) = 1/4, and P(x3) = P(x4) =
1/8. Construct a Shannon-Fano code for X.
6. Construct ASK and FSK waveforms for a) 10010011 b) 11000101.
7. Prove that H(X) - H(X/Y) = H(Y) - H(Y/X).
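The identity can also be verified numerically for any joint distribution, using the chain rule H(X/Y) = H(X,Y) - H(Y); the joint pmf below is an illustrative example:

```python
import math

# An illustrative joint distribution P(X=x, Y=y); any valid joint pmf works here.
P = [[0.25, 0.25],
     [0.125, 0.375]]

def H(dist):
    """Entropy in bits of a probability list (zero entries contribute nothing)."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

px = [sum(row) for row in P]               # marginal of X
py = [sum(col) for col in zip(*P)]         # marginal of Y
Hxy = H([p for row in P for p in row])     # joint entropy H(X,Y)

# Conditional entropies via the chain rule: H(X/Y) = H(X,Y) - H(Y)
Hx_given_y = Hxy - H(py)
Hy_given_x = Hxy - H(px)

lhs = H(px) - Hx_given_y
rhs = H(py) - Hy_given_x
print(lhs, rhs)   # both sides equal the mutual information I(X;Y)
```

Both sides reduce to H(X) + H(Y) - H(X,Y), which is exactly I(X;Y), so the symmetry of mutual information is what the identity expresses.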
8. Verify the following expression: I(xi, xj) = I(xi) + I(xj) if xi and xj are independent.
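A numeric check with illustrative probabilities: under independence the joint probability factorizes, so the logarithms (self-informations) add.

```python
import math

def I(p):
    """Self-information in bits of an event with probability p."""
    return -math.log2(p)

p_xi, p_xj = 0.25, 0.5       # illustrative probabilities, not from the question
p_joint = p_xi * p_xj        # independence: P(xi, xj) = P(xi) * P(xj)

print(I(p_joint), I(p_xi) + I(p_xj))  # 3.0 3.0
```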
