
Introduction to Electronic Communications
is the transmission, reception, and processing
of information with the use of electronic circuits
Digital Modulation
is the transmittal of digitally modulated
analog signals (carriers) between two or more
points in a communications system
sometimes called digital radio because
digitally modulated signals can be propagated
through Earth's atmosphere and used in
wireless communications systems
Digital communications
is the transmission of digital pulses between two or
more points in a communications system
include systems where relatively high-frequency
analog carriers are modulated by relatively
low-frequency digital information signals (digital radio)
and systems involving the transmission of digital
pulses (digital transmission)
was originally limited to the transmission of data
between computers
Basic elements of a digital communication system
Advantages:
1. Noise immunity
2. Error detection and correction
3. Compatibility with …

Disadvantages:
1. Bandwidth size
2. Complex circuitry
Simplified block diagram of a digital radio system
Information theory
is a highly theoretical study of the efficient
use of bandwidth to propagate information
through electronic communications systems
can be used to determine the information
capacity of a data communications system
Information capacity
is a measure of how much information can be
propagated through a communications
system and is a function of bandwidth and
transmission time.
represents the number of independent
symbols that can be carried through a system
in a given unit of time.
Binary digit or bit - the most basic digital
symbol used to represent information
A unit of information represented by either
a 1 or 0
Bit rate - simply the number of bits
transmitted during one second and is
expressed in bits per second (bps).
Information Theory
1. Information Measure
- The information sent from a digital source when
the ith message is transmitted is given by

  I_i = log2(1 / P_i) bits

  where P_i is the probability of transmitting the
  ith message.
2. Average Information (Entropy)
- In general, the information content will vary
from message to message because the
probabilities of transmitting the messages
will not be equal. Consequently, we need an
average information measure for the source,
considering all the possible messages we can send:

  H = Σ P_i log2(1 / P_i) bits per message
3. Relative Entropy
- The ratio of the entropy of a source to the
maximum value the entropy could take for
the same source symbols:

  H_r = H / H_max, where H_max = log2(n) for a
  source with n symbols.
4. Redundancy
- The fraction of the source's capacity not used
for information:

  Redundancy = 1 - H_r
5. Rate of Information
- The average information transferred per unit time:

  R = r × H bits/s, where r is the message
  (symbol) rate in messages per second.
A telephone touch-tone keypad has the digits
0 to 9, plus the * and # keys. Assume the
probability of sending * and # is 0.005 and the
probability of sending 0 to 9 is 0.099. If each
key is pressed at a rate of 2 keys/s,
compute the entropy and data rate for this system.
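The keypad example can be checked numerically. A minimal Python sketch of the entropy and data-rate calculation (using only the probabilities and key rate given above):

```python
import math

# Probabilities for the 12 keys: * and # at 0.005 each, digits 0-9 at 0.099 each
probs = [0.005] * 2 + [0.099] * 10
assert abs(sum(probs) - 1.0) < 1e-9  # probabilities must sum to 1

# Entropy: H = sum of P_i * log2(1/P_i), in bits per key press
H = sum(p * math.log2(1 / p) for p in probs)

# Rate of information: R = r * H, with r = 2 keys/s
r = 2
R = r * H

print(f"Entropy H = {H:.2f} bits/key")   # H is about 3.38 bits/key
print(f"Data rate R = {R:.2f} bps")      # R is about 6.76 bps
```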
From the given table, determine (a) entropy;
(b) relative entropy; (c) rate of information.
Hartley's law
R. Hartley of Bell Telephone Laboratories (1928)

I ∝ B × t

Where: I = information capacity (bps)
B = bandwidth (Hz)
t = transmission time (seconds)
Shannon limit for information capacity
Claude E. Shannon of Bell Telephone
Laboratories (1948)

I = B log2(1 + S/N)

Where: I = information capacity (bps)
B = bandwidth (Hz)
S/N = signal-to-noise power ratio (unitless)
M-ary Encoding
M-ary is a term derived from the word binary.
M simply represents a digit that corresponds
to the number of conditions, levels, or
combinations possible for a given number of
binary variables.

N = log2 M

Where: N = number of bits necessary
M = number of conditions, levels, or
combinations possible with N bits
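The relation N = log2 M works in both directions, as a short Python sketch shows (the level counts below are illustrative values, not from the text):

```python
import math

# N = log2(M): bits needed to distinguish M conditions/levels
M = 8                      # e.g., an 8-level (8-ary) system
N = math.log2(M)
print(f"{M} levels require N = {N:.0f} bits")      # 3 bits

# Conversely, N binary variables give M = 2^N combinations
N_bits = 4
print(f"{N_bits} bits give M = {2**N_bits} conditions")  # 16 conditions
```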
Bit rate - refers to the rate of change of a
digital information signal, which is usually binary.
Baud - like bit rate, is also a rate of change;
however, baud refers to the rate of change of
a signal on the transmission medium after
encoding and modulation have occurred.
is a unit of transmission rate, modulation
rate, or symbol rate

baud = 1 / ts

Where: ts = time of one signaling element (seconds)
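The distinction between baud and bit rate can be illustrated numerically. The signaling-element duration and the 4-level scheme below are assumed values for illustration, not from the text:

```python
# Baud = 1 / ts, where ts is the duration of one signaling element.
t_s = 1 / 2400            # assumed: 1/2400 s per signaling element
baud = 1 / t_s            # symbols per second -> 2400 baud

# If each symbol carries more than one bit (e.g., a 4-level scheme,
# log2(4) = 2 bits/symbol), the bit rate exceeds the baud:
bits_per_symbol = 2
bit_rate = baud * bits_per_symbol
print(f"Baud = {baud:.0f} symbols/s, bit rate = {bit_rate:.0f} bps")
```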
1. What is the Shannon limit for information
capacity for a standard voice band
communications channel with a S/N ratio of
1000 (30 dB) and a bandwidth of 2.7 kHz?
2. Determine the channel capacity of a 4 kHz
channel with S/N = 10 dB.
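Both problems can be worked with the Shannon formula I = B log2(1 + S/N); the helper name below is just for illustration. Note that in Problem 2 the 10 dB figure must first be converted to a linear power ratio:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon limit: I = B * log2(1 + S/N), in bps."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Problem 1: B = 2.7 kHz, S/N = 1000 (30 dB, already a linear ratio)
I1 = shannon_capacity(2700, 1000)
print(f"Problem 1: I = {I1 / 1000:.2f} kbps")   # about 26.91 kbps

# Problem 2: B = 4 kHz, S/N = 10 dB -> linear ratio = 10^(10/10) = 10
snr = 10 ** (10 / 10)
I2 = shannon_capacity(4000, snr)
print(f"Problem 2: I = {I2 / 1000:.2f} kbps")   # about 13.84 kbps
```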