Problem 3: Entropy
Let p(x, y) be given by the following figure
[Figure: joint pmf p(x, y) on the grid x, y ∈ {0, 1, 2}, placing probability masses 1/8, 1/4, 1/4, 1/4, 1/8 on five points such that P(X = 0) = P(X = 2) = 3/8, P(X = 1) = 1/4, symmetrically in X and Y.]
Find
H(X), H(Y), H(X|Y), H(Y|X), H(X, Y), and H(Y) − H(Y|X).
Draw a Venn diagram for the quantities you found.
H(X) = −P(X = 0) ld P(X = 0) − P(X = 1) ld P(X = 1) − P(X = 2) ld P(X = 2)
= −(P(X = 0, Y = 0) + P(X = 0, Y = 2)) ld(P(X = 0, Y = 0) + P(X = 0, Y = 2))
  − P(X = 1, Y = 1) ld P(X = 1, Y = 1)
  − (P(X = 2, Y = 0) + P(X = 2, Y = 2)) ld(P(X = 2, Y = 0) + P(X = 2, Y = 2))
= −(3/8) ld(3/8) − (1/4) ld(1/4) − (3/8) ld(3/8)
= 11/4 − (3/4) ld 3 ≈ 1.561 bit
Since the given diagram is symmetric, we obtain H(Y) by exchanging the roles of X and Y in the formula above:
H(Y) = H(X) = 11/4 − (3/4) ld 3
There are 5 possible positions for the point (x, y), so
H(X, Y) = 3 · (1/4) ld 4 + 2 · (1/8) ld 8 = 9/4
As H(X) = H(Y),
H(X|Y) = H(Y|X) = H(X, Y) − H(X) = (3/4) ld 3 − 1/2 ≈ 0.689 bit
Finally,
H(Y) − H(Y|X) = (11/4 − (3/4) ld 3) − ((3/4) ld 3 − 1/2) = 13/4 − (3/2) ld 3 ≈ 0.873 bit
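The quantities above can be checked numerically. A small sketch: the exact placement of the 1/8 and 1/4 masses on the five points is an assumption consistent with the marginals used above; H(X), H(Y) and H(X, Y) depend only on those marginals and the mass values.

```python
import math

def H(dist):
    """Entropy in bits of a distribution given as a dict of probability masses."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# One joint pmf consistent with the solution (placement is an assumption)
pxy = {(0, 0): 1/8, (0, 2): 1/4, (1, 1): 1/4, (2, 0): 1/4, (2, 2): 1/8}

# Marginals of X and Y
px, py = {}, {}
for (x, y), p in pxy.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

HX, HY, HXY = H(px), H(py), H(pxy)
print(HX, HY, HXY, HXY - HY, HXY - HX, HX + HY - HXY)
```

The printed values match 11/4 − (3/4) ld 3, 9/4, (3/4) ld 3 − 1/2 and 13/4 − (3/2) ld 3.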
Problem 7: Entropy
[Figure: noise Z and inputs X1, X2 combined to produce Y = (X1 ∘ Z) ∘ X2, where the two combining operations are specified in the original figure.]
(Q1 ) Compute H(X2 |Y ) and H(Y ).
(Q2 ) Let p(X1 = 1) = p(X1 = 0) = p(X2 = 0) = p(X2 = 1) = 1/2. Calculate the mutual
information I(X2 ; Y ).
Problem 8: Entropy
Consider the following transmission channel
[Figure: input X → BSC → output Y, with an observer of X and Y producing Z.]
A binary symmetric channel (BSC) with crossover probability p has input X and
output Y . The input X = 0 is used with probability q. The observer indicates Z = 0
whenever X = Y and Z = 1 otherwise.
(Q1 ) What is the uncertainty H(Z) in the observer output?
(Q2 ) What is the capacity and capacity achieving input distribution if the receiver is
provided with both Y and Z?
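A quick numerical check of (Q1), as a sketch: Z = 1 exactly when the channel flips the input bit, which happens with probability p regardless of the input distribution q, so H(Z) = h(p), the binary entropy of p.

```python
import math

def h(x):
    """Binary entropy function in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def H_Z(p, q):
    """H(Z) for a BSC with crossover p and P(X = 0) = q, via P(Z = 1) = P(X != Y)."""
    p_z1 = q * p + (1 - q) * p  # flip events for X = 0 and X = 1
    return h(p_z1)

print(H_Z(0.2, 0.3))
```

Note that the result is independent of q, as the assertion over several q values confirms.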
[Figure: a channel with input X distributed as (1/2, 1/2) and transition probabilities 1/4 and 1/2 as shown; the problem header was not recovered.]
1. Compute H(X|Y ).
2. Calculate the error probability Pe .
3. Compute the Fano inequality for H(X|Y ).
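For part 3, the bound to evaluate is Fano's inequality, H(X|Y) ≤ h(Pe) + Pe · ld(|X| − 1). A small helper (a sketch; the concrete Pe and alphabet size come from the figure and part 2):

```python
import math

def h(x):
    """Binary entropy function in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def fano_bound(pe, alphabet_size):
    """Right-hand side of Fano's inequality: h(Pe) + Pe * ld(|X| - 1)."""
    return h(pe) + pe * math.log2(alphabet_size - 1)

# Example values (illustrative only): Pe = 0.25, binary alphabet
print(fano_bound(0.25, 2))
```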
Problem 10: Connection probabilities
Consider the relation between height L and weight W shown in the figure, indicating that tall people tend to be heavier than short people.
[Figure: transition diagram from height L ∈ {Tall, Average, Short} to weight W ∈ {Very heavy, Heavy, Average, Light, Very light}, with transition probabilities 1/2, 1/4 and 1/4 out of each height class as shown.]
[Figure: unit cube with labeled vertices (0,0,0), (0,0,1), (0,1,0), (1,0,0) and points A, C marked relative to axes X and Y; the associated problem statement was not recovered.]
Given the following channel with two inputs X1 and X2:

[Figure: X1 ∈ {0, 1} and X2 ∈ {0, 1} are combined by real addition (+) and real multiplication (∗) to produce Y ∈ {0, 1, 2}, as shown in the original figure.]

Also we have
Pr(X1 = 1) = 1 − Pr(X1 = 0) = p1, 0 ≤ p1 ≤ 1,
Pr(X2 = 1) = 1 − Pr(X2 = 0) = p2, 0 ≤ p2 ≤ 1.
(a) Compute H(Y), H(Y|X1), H(Y|X2), and I(X1; Y|X2) in bits.
(b) Determine the input probabilities for X1 and X2 that maximize H(Y).
Problem 16: Typical sequences
An information source produces independent binary symbols with p(0) = p and p(1) = 1 − p, where p > 0.5, and emits information sequences of 16 binary symbols. A typical sequence is defined to have two or fewer 1-symbols.
(Q1 ) What is the most probable sequence that can be generated by this source and what
is its probability?
(Q2 ) What is the number of typical sequences that can be generated by this source?
We assign a unique binary codeword for each typical sequence and neglect the nontypical sequences.
(Q3 ) If the assigned codewords are all of the same length, find the minimum codeword
length required to provide the above set with distinct codewords.
(Q4 ) Determine the probability that a sequence is not assigned with a codeword.
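A sketch for (Q2) and (Q3): the number of typical sequences is the number of length-16 binary strings with at most two 1s, and a fixed-length code needs enough bits to index all of them.

```python
from math import comb, ceil, log2

n = 16
# (Q2): sequences with k = 0, 1 or 2 ones
num_typical = sum(comb(n, k) for k in range(3))
# (Q3): minimum fixed codeword length to give each one a distinct codeword
codeword_length = ceil(log2(num_typical))
print(num_typical, codeword_length)  # 137 sequences, 8 bits
```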
[Figures: discrete memoryless channels DMC1 and DMC2 with inputs X, outputs Y and transition probabilities 1/2 and p, shown individually and in cascade; the accompanying problem statements were not recovered.]
Problem 21: Huffman Code
Consider a random source with statistically independent source symbols qi, 1 ≤ i ≤ 8.
The distribution of the source is given as follows:
Q    | q1  | q2  | q3  | q4  | q5  | q6   | q7    | q8
p(q) | 0.5 | 0.1 | 0.1 | 0.1 | 0.1 | 0.05 | 0.025 | 0.025
a) Determine the entropy of the source and compare the result to a source with eight
identically distributed symbols. (Hint: ld 10 ≈ 3.32.)
b) Construct an optimal binary prefix-free code for the given source.
c) Determine the average code word length of the constructed code by means of path
length lemma. Compare the result to the entropy.
d) Determine the sequence of code bits for the following sequence of source symbols:
q = [q1 q4 q8 q6 q1 q7 ].
e) Determine the code word length for the given sequence. Compare the result to the
average code word length.
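Parts a) to c) can be checked numerically. The sketch below builds a Huffman code for the given distribution by repeatedly merging the two least probable nodes; any optimal code yields the same average length, 2.35 bit, slightly above the entropy of about 2.311 bit.

```python
import heapq, itertools, math

probs = {"q1": 0.5, "q2": 0.1, "q3": 0.1, "q4": 0.1,
         "q5": 0.1, "q6": 0.05, "q7": 0.025, "q8": 0.025}

entropy = -sum(p * math.log2(p) for p in probs.values())

# Huffman construction: merge the two least probable nodes until one remains.
counter = itertools.count()  # tie-breaker so heapq never compares dicts
heap = [(p, next(counter), {s: ""}) for s, p in probs.items()]
heapq.heapify(heap)
while len(heap) > 1:
    p0, _, c0 = heapq.heappop(heap)
    p1, _, c1 = heapq.heappop(heap)
    merged = {s: "0" + w for s, w in c0.items()}
    merged.update({s: "1" + w for s, w in c1.items()})
    heapq.heappush(heap, (p0 + p1, next(counter), merged))
code = heap[0][2]

avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(round(entropy, 3), avg_len)
```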
Fi = Σ_{k=1}^{i−1} pk,    (2)

the sum of the probabilities of all symbols less than i. Then the codeword for i is the number Fi ∈ [0, 1) rounded off to li bits, where li = ⌈log(1/pi)⌉.
a) Show that the code constructed by this process is prefix-free and that the average length satisfies
H(X) ≤ L < H(X) + 1.    (3)
b) Construct the code for the probability distribution (0.5, 0.25, 0.125, 0.125).
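A sketch of part b), assuming the symbols are listed in decreasing order of probability: compute the cumulative sums Fi and take the first li bits of their binary expansions.

```python
import math

def shannon_code(probs):
    """Shannon code per equation (2): probs in decreasing order."""
    codewords = []
    F = 0.0
    for p in probs:
        l = math.ceil(math.log2(1 / p))
        # first l bits of the binary expansion of F
        bits, f = "", F
        for _ in range(l):
            f *= 2
            bits += "1" if f >= 1 else "0"
            f -= int(f)
        codewords.append(bits)
        F += p
    return codewords

print(shannon_code([0.5, 0.25, 0.125, 0.125]))
```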
Problem 24: Enumerative Coding
Let S be the set of binary sequences of length 13 with 3 ones. What is the sequence for index 95 in the lower lexicographical ordering of S? Hint: Apply enumerative decoding.
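Enumerative decoding is an unranking: walk the positions left to right and decide at each one whether the target index falls among the sequences that put a 0 there. The sketch below assumes 0-based indexing with 0 ordered before 1; if the problem counts from 1, use index 94 instead.

```python
from math import comb

def unrank(n, k, index):
    """Sequence of length n with k ones at the given lexicographic index."""
    bits = []
    for pos in range(n):
        remaining = n - pos - 1
        with_zero = comb(remaining, k)  # sequences placing a 0 at this position
        if index < with_zero:
            bits.append("0")
        else:
            bits.append("1")
            index -= with_zero
            k -= 1
    return "".join(bits)

def rank(seq):
    """Inverse of unrank (enumerative encoding), for checking."""
    index, k = 0, seq.count("1")
    for pos, b in enumerate(seq):
        if b == "1":
            index += comb(len(seq) - pos - 1, k)
            k -= 1
    return index

s = unrank(13, 3, 95)
print(s)
```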
Assume that the source X is followed by a quantizer which uses four levels of quantization, given as:

interval     | quantized value
0 < x ≤ 0.5  | 0.45
0.5 < x ≤ 1  | 0.7
1 < x ≤ 1.5  | 1.15
1.5 < x ≤ 2  | 1.8
[Image block, as extracted (layout approximate):
b b b b
g g o o
g g o o]
where b represents blue, g represents green and o represents orange. Use the
transform matrix T given as:
[The 4 × 4 transform matrix T was garbled in extraction; its entries are ±1 (with a possible scaling factor), as given on the original problem sheet.]
Use the same quantization levels given in the slide and show all your steps.
Problem 27: Error Detection
A binary code has block length 6 and is given as:
A: 000000
B: 001111
C: 111100
D: 111111
The information is transmitted over a binary symmetric channel with cross-over probability given as p. Calculate the probability of a detection error for A, B, C, and D.
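A sketch of the computation, reading "detection error" as an undetected error: the channel turns the transmitted codeword into a different codeword, which happens with probability p^d (1 − p)^(6−d) for each other codeword at Hamming distance d. Here p = 0.1 is only an illustrative value.

```python
code = {"A": "000000", "B": "001111", "C": "111100", "D": "111111"}
n, p = 6, 0.1  # p chosen only for illustration

def hamming(u, v):
    """Number of positions where u and v differ."""
    return sum(a != b for a, b in zip(u, v))

undetected = {}
for name, cw in code.items():
    undetected[name] = sum(
        p ** hamming(cw, other) * (1 - p) ** (n - hamming(cw, other))
        for other_name, other in code.items() if other_name != name)
print(undetected)
```

For A, the distances to B, C, D are 4, 4, 6, so the result is 2 p^4 (1 − p)^2 + p^6; B and C are distance 2 from D and so fare worse.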
Problem 28: Data Reduction
Check slide 17 in the chapter on error detection. Why is the number of 1s in C(x) even?
Problem 31: Huffman Code
Let Q denote a source with the following distribution:
Q    | q1  | q2  | q3  | q4  | q5  | q6  | q7
p(q) | 0.3 | 0.2 | 0.1 | 0.1 | 0.1 | 0.1 | 0.1
The coded bits are transmitted error-free. Recover the original sequence of source symbols from the sequence of code bits.
Introduce a bit error in the code bit sequence by flipping the 5th code bit. Decode the resulting (erroneous) code bit sequence.
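The decoding tasks can be sketched as follows. The codeword assignment below is one valid Huffman code for the given distribution, and the source sequence is an assumed example (the problem's own sequence was not recovered); the point is the greedy prefix decoding and the effect of flipping the 5th code bit.

```python
# One valid Huffman code for p(q) = (0.3, 0.2, 0.1, 0.1, 0.1, 0.1, 0.1)
# (assumed assignment; any optimal code has average length 2.7 bit)
code = {"q1": "00", "q2": "01", "q3": "100", "q4": "110",
        "q5": "111", "q6": "1010", "q7": "1011"}

def encode(symbols):
    return "".join(code[s] for s in symbols)

def decode(bits):
    """Greedy prefix decoding: emit a symbol as soon as a codeword matches."""
    inverse = {w: s for s, w in code.items()}
    out, current = [], ""
    for b in bits:
        current += b
        if current in inverse:
            out.append(inverse[current])
            current = ""
    return out

seq = ["q1", "q4", "q2"]  # assumed example source sequence
bits = encode(seq)        # '0011001'
corrupted = bits[:4] + ("1" if bits[4] == "0" else "0") + bits[5:]
print(decode(bits), decode(corrupted))
```

Note how the decoder resynchronizes after the bit error: the corrupted stream still parses completely, but into a different symbol sequence.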
Problem 1: Capacity
Given the following channel.
[Figure: transmitter with X ∈ {0, 1} → channel Y = X + Z (real addition) → receiver; additive noise Z ∈ {−1, 0, 1} with p(z = −1) = p(z = 0) = p(z = 1) = 1/3; side information |Z| ∈ {0, 1} available at the receiver.]
(Q1 ) If the receiver uses side information, i.e. the absolute value of Z, what is the capacity C1 of the channel in bits per transmission?
(Q2 ) If the receiver can not access side information, i.e. the receiver does not know the
absolute value of Z, what is the capacity C2 of the channel in bits per transmission?
(Q3 ) Now, let the transmitter change its alphabet to {0, 2}. Determine again C1 and C2
in this case.
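A numerical sketch for (Q2): without side information the channel is X ∈ {0, 1}, Y = X + Z with Z uniform on {−1, 0, 1}. We grid-search the input distribution P(X = 0) = q and evaluate I(X; Y) directly; the `inputs` parameter lets the same routine be rerun with the alphabet {0, 2} for (Q3).

```python
import math

Z = [-1, 0, 1]  # equiprobable noise values

def mutual_information(q, inputs):
    """I(X;Y) in bits for P(X = inputs[0]) = q and Y = X + Z."""
    px = {inputs[0]: q, inputs[1]: 1 - q}
    joint, py = {}, {}
    for x, p_x in px.items():
        for z in Z:
            joint[(x, x + z)] = joint.get((x, x + z), 0.0) + p_x / 3
    for (x, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

C2 = max(mutual_information(q / 100, (0, 1)) for q in range(1, 100))
print(round(C2, 4))  # close to 1/3
```

Conditioning on |Z| in the same way shows that every output then determines X, so C1 = 1 bit; with the alphabet {0, 2} the outputs overlap only at y = 1.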