SYLLABUS
UNIT I INFORMATION THEORY
Information Entropy, Information rate, Classification of codes, Kraft-McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding, Extended Huffman coding - Joint and conditional entropies, Mutual information - Discrete memoryless channels: BSC, BEC - Channel capacity, Shannon limit.
UNIT II SOURCE CODING: TEXT, AUDIO AND SPEECH
Text: Adaptive Huffman coding, Arithmetic coding, LZW algorithm - Audio: Perceptual coding, Masking techniques, Psychoacoustic model, MPEG Audio layers I, II, III, Dolby AC3 - Speech: Channel Vocoder, Linear Predictive Coding.
UNIT III
Image and Video Formats: GIF, TIFF, SIF, CIF, QCIF - Image compression: READ, JPEG - Video compression: Principles, I, B, P frames, Motion estimation, Motion compensation, H.261, MPEG standard.
UNIT IV ERROR CONTROL CODING: BLOCK CODES
Definitions and Principles: Hamming weight, Hamming distance, Minimum distance decoding - Single parity codes, Hamming codes, Repetition codes - Linear block codes, Cyclic codes - Syndrome calculation, Encoder and decoder - CRC.
UNIT V ERROR CONTROL CODING: CONVOLUTIONAL CODES
Convolutional codes: code tree, trellis, state diagram - Encoding - Decoding: sequential search and Viterbi algorithm - Principle of Turbo coding.
TEXT BOOKS:
1. R. Bose, Information Theory, Coding and Cryptography, TMH, 2007.
2. Fred Halsall, Multimedia Communications: Applications, Networks, Protocols and Standards, Pearson Education Asia, 2002.
REFERENCES:
1. K. Sayood, Introduction to Data Compression, 3/e, Elsevier, 2006.
2. S. Gravano, Introduction to Error Control Codes, Oxford University Press, 2007.
3. Amitabha Bhattacharya, Digital Communication, TMH, 2006.
Contents
Information Entropy, Information rate, Classification of codes, Kraft-McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding, Extended Huffman coding - Joint and conditional entropies, Mutual information - Discrete memoryless channels: BSC, BEC - Channel capacity, Shannon limit.
Communication system
Information
Information is closely related to uncertainty or surprise. When the message from the source is already known, there is no surprise and hence no information. When the probability of a message is low, there is more surprise and hence more information. The amount of information is therefore inversely related to the probability of occurrence.
What is information theory?
Information theory enables a communication system to carry information (signals) from sender to receiver over a communication channel. It deals with the mathematical modelling and analysis of a communication system. Its major task is to answer questions about signal compression and transfer rate; those answers are found through entropy and channel capacity.
Before the event X = x_i occurs, there is uncertainty; when X = x_i occurs, there is surprise; after the occurrence of X = x_i, there is a gain in information. The amount of information, I(x_i) = log2(1/p(x_i)), is related to the inverse of the probability of occurrence.
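As a quick check of this definition (a minimal sketch; the symbol probabilities below are invented for illustration), the self-information of an outcome follows directly from its probability:

import math

def self_information(p):
    # I(x) = log2(1/p) = -log2(p), measured in bits
    return -math.log2(p)

# Hypothetical probabilities: a frequent symbol and a rare one
for symbol, p in [("e", 0.5), ("q", 0.05)]:
    print(symbol, round(self_information(p), 2), "bits")

The rare symbol carries more information (about 4.32 bits) than the frequent one (1 bit), matching the inverse relationship above.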
Entropy
Properties of entropy
For a discrete source with K symbols of probabilities p_k, the entropy H(X) = -sum_k p_k log2(p_k) is bounded by
0 <= H(X) <= log2 K
The entropy is maximum (log2 K) for the uniform distribution and minimum (zero) when only one value is possible.
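These bounds can be verified numerically (a minimal sketch; the distributions below are invented for illustration):

import math

def entropy(probs):
    # H(X) = -sum p_k * log2(p_k), with the convention 0 * log2(0) = 0
    return -sum(p * math.log2(p) for p in probs if p > 0)

K = 4
print(entropy([1 / K] * K))                # uniform: log2(4) = 2.0 bits, the maximum
print(entropy([1.0, 0.0, 0.0, 0.0]))       # deterministic: 0.0 bits, the minimum
print(entropy([0.5, 0.25, 0.125, 0.125]))  # in between: 1.75 bits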
Two types of coding:
1) Fixed length code
2) Variable length code (e.g. Morse code)
In Morse code, letters are encoded as dots (.) and dashes (-): a short code word is assigned to a frequently occurring source symbol (such as 'e') and a long code word to a rare source symbol (such as 'q'). An efficient source code should satisfy two conditions:
i. The code words produced by the encoder are in binary form.
ii. The source code should be uniquely decodable.
Data Compaction
Data compaction (lossless data compression) means removing redundant information from the signal prior to transmission.
Basically this is achieved by assigning short descriptions to the most frequent outcomes of the source output, and longer descriptions to the less frequent ones.
Source-coding schemes used in data compaction include prefix coding, Huffman coding, Lempel-Ziv, and Shannon-Fano coding.
Prefix Coding
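In a prefix code no code word is the prefix of any other, and the code word lengths satisfy the Kraft-McMillan inequality sum_k 2^(-l_k) <= 1. Both can be checked mechanically (a minimal sketch; the code words below are invented for illustration):

def kraft_sum(lengths):
    # Kraft-McMillan inequality: a uniquely decodable binary code
    # must satisfy sum over k of 2**(-l_k) <= 1
    return sum(2 ** -l for l in lengths)

def is_prefix_free(codewords):
    # No code word may be a prefix of another code word
    return not any(a != b and b.startswith(a)
                   for a in codewords for b in codewords)

code = ["0", "10", "110", "111"]  # illustrative prefix code
print(kraft_sum([len(c) for c in code]))  # 1.0 <= 1, inequality satisfied
print(is_prefix_free(code))               # True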
Huffman Coding
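The Huffman procedure repeatedly merges the two least probable symbols until one node remains. A minimal sketch of this idea (the symbol probabilities are invented for illustration, and corner cases such as a one-symbol alphabet are not handled):

import heapq

def huffman_code(probs):
    # probs: dict mapping symbol -> probability; returns symbol -> code word.
    # Heap entries are (probability, unique tie-breaker, {symbol: partial code}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # least probable subtree
        p2, _, c2 = heapq.heappop(heap)  # next least probable subtree
        # Prepend 0 to one subtree's code words and 1 to the other's
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

print(huffman_code({"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}))
# {'a': '0', 'b': '10', 'd': '110', 'c': '111'} or an equivalent code

For these probabilities the average code length is 1.9 bits/symbol, close to the source entropy of about 1.85 bits/symbol, in line with the source coding theorem.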
Joint and Conditional Entropies
Conditional entropy (equivocation) H(X|Y) is the amount of uncertainty remaining about the channel input after the channel output has been observed. The marginal probability distribution of the output random variable Y is obtained by averaging out the dependence of the joint distribution p(x, y) on x: p(y) = sum_x p(x, y).
BSC (Binary Symmetric Channel)
For a BSC with conditional probability of error p, the channel capacity is C = 1 - H(p), where H(p) = -p log2(p) - (1 - p) log2(1 - p) is the binary entropy function.
C varies with the probability of error p in a convex manner and is symmetric about p = 1/2.
When the channel is noise free (p = 0), C attains its maximum value of one bit per channel use; at this value H(p) attains its minimum value of zero.
When p = 1/2, C attains its minimum value of zero, whereas the entropy H(p) attains its maximum value of unity; the channel is then said to be useless.
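These limiting cases can be checked numerically (a minimal sketch; the values of p are arbitrary):

import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of the binary symmetric channel: C = 1 - H(p)
    return 1 - binary_entropy(p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(p, round(bsc_capacity(p), 3))
# p = 0 gives C = 1 (noise free); p = 0.5 gives C = 0 (useless channel);
# C is symmetric about p = 1/2, so p = 0.9 behaves like p = 0.1.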
Mutual information
Properties of mutual information:
i. Symmetric: I(X,Y) = I(Y,X).
ii. Non-negative: I(X,Y) >= 0.
iii. The mutual information of a channel is related to the joint entropy of the channel input and channel output by I(X,Y) = H(X) + H(Y) - H(X,Y).
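The identity above can be verified on a small joint distribution (a minimal sketch; the joint probability matrix is invented for illustration):

import math

def entropy(probs):
    # H = -sum p * log2(p), ignoring zero-probability entries
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative joint distribution p(x, y) over a 2x2 alphabet
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]         # marginal distribution of X
py = [sum(col) for col in zip(*joint)]   # marginal distribution of Y
pxy = [p for row in joint for p in row]  # flattened joint distribution

print(entropy(px) + entropy(py) - entropy(pxy))  # I(X,Y) ~ 0.278 bits, >= 0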
Channel Capacity
If H(S)/Ts <= C/Tc, where C = Max I(X,Y) is the channel capacity, Ts is the source symbol duration, and Tc is the duration of one channel use, then there exists a coding scheme for which the source output can be transmitted over the channel and be reconstructed with an arbitrarily small probability of error. The parameter C/Tc is called the critical rate. When this condition is satisfied with the equality sign, the system is said to be signaling at the critical rate.
Conversely, if H(S)/Ts > C/Tc, it is not possible to transmit information over the channel and reconstruct it with an arbitrarily small probability of error.
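A small numerical sketch of this condition (the source and channel parameters below are invented for illustration):

def reliable_transmission_possible(H_S, Ts, C, Tc):
    # Channel coding theorem: a reliable scheme exists iff H(S)/Ts <= C/Tc
    return H_S / Ts <= C / Tc

# Illustrative numbers: a source emitting 1.75 bits/symbol every 1 ms,
# and a channel of capacity 0.9 bits/use, used once every 0.5 ms
print(reliable_transmission_possible(1.75, 1e-3, 0.9, 0.5e-3))  # True: 1750 <= 1800
print(reliable_transmission_possible(1.75, 1e-3, 0.8, 0.5e-3))  # False: 1750 > 1600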