
Data Compression

 Process of encoding information using fewer bits than the original representation
 Lossy -> reduces bits by removing unnecessary or less
important information
 used to compress multimedia data
 Lossless -> allows the original data to be perfectly
reconstructed from the compressed data
 used to compress executable programs, text documents, and
source code
Low compression (high quality) vs. high compression (low quality)
Encoding and Compression of Data
 ASCII Standard
 ASCII stands for American Standard Code for
Information Interchange.
 Computers can only understand numbers, so an
ASCII code is the numerical representation of a
character such as 'a' or '@'
 Fixed Length encoding Scheme
Problem with Fixed Length
coding
 Suppose we have a message built from an alphabet of 5
symbols, e.g. [►♣♣♠☻►♣☼►☻]
 How can we code this message so the coded
message will have minimum length for
transmission or space saving?
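To see what a fixed-length scheme costs here, a minimal sketch (the five glyphs are mapped to the letters a–e purely for readability: ► → a, ♣ → b, ♠ → c, ☻ → d, ☼ → e):

```python
from math import ceil, log2

# The slide's message [►♣♣♠☻►♣☼►☻] with glyphs mapped to letters
message = "abbcdabead"
alphabet = sorted(set(message))

# A fixed-length code needs ceil(log2(alphabet size)) bits per symbol
bits_per_symbol = ceil(log2(len(alphabet)))
print(bits_per_symbol)                 # 3 bits for a 5-symbol alphabet
print(bits_per_symbol * len(message))  # 30 bits for the whole message
```

Every symbol costs the same 3 bits regardless of how often it occurs, which is exactly the inefficiency Huffman coding attacks.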
Purpose of Huffman Coding
 Proposed by Dr. David A. Huffman in 1952
 “A Method for the Construction of Minimum-Redundancy Codes”
 Applicable to many forms of data transmission
 Our example: text files
Data compression
 Huffman encoding is a simple example of data
compression: representing data in fewer bits than it
would otherwise need
 A more sophisticated method is GIF (Graphics
Interchange Format) compression, for .gif files
 Another is JPEG (Joint Photographic Experts
Group), for .jpg files
 Unlike the others, JPEG is lossy: it loses information
 Generally OK for photographs (if you don’t compress
them too much), because decompression adds “fake”
data very similar to the original
The Basic Algorithm
 Huffman coding is a form of statistical coding
 Not all characters occur with the same frequency!
 Yet all characters are allocated the same amount of space

 1 char = 1 byte, be it e or x
The Basic Algorithm
 Any savings in tailoring codes to frequency of
character?
 Code word lengths are no longer fixed like ASCII.
 Code word lengths vary and will be shorter for the
more frequently used characters.
The (Real) Basic Algorithm
1. Scan the text to be compressed and count the
occurrences of all characters.
2. Sort or prioritize the characters based on their
number of occurrences in the text.
3. Build the Huffman code tree based on the
prioritized list.
4. Perform a traversal of the tree to determine
all code words.
5. Rescan the text and create the new file
using the Huffman codes.
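Steps 1 and 2 can be sketched in a few lines of Python (a sketch, not the slides' code), using the example text that follows:

```python
from collections import Counter

text = "Eerie eyes seen near lake."

# Step 1: count the occurrences of every character
freq = Counter(text)

# Step 2: order characters by count (lowest count = highest priority)
for ch, n in sorted(freq.items(), key=lambda kv: kv[1]):
    print(repr(ch), n)

print(sum(freq.values()))  # 26 characters in total
```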
Building a Tree
Scan the original text
 Consider the following short text:

Eerie eyes seen near lake.

 Count up the occurrences of all characters in the text


Building a Tree
Scan the original text
Eerie eyes seen near lake.
 What characters are present?

E e r i space
y s n a l k .
Building a Tree
Scan the original text
Eerie eyes seen near lake.
 What is the frequency of each character in the text?

Char    Freq.    Char    Freq.    Char    Freq.
E       1        y       1        k       1
e       8        s       2        .       1
r       2        n       2
i       1        a       2
space   4        l       1
Building a Tree
Prioritize characters
 Create binary tree nodes with character and
frequency of each character
 Place nodes in a priority queue
 The lower the occurrence, the higher the priority in
the queue
Building a Tree
 The queue after inserting all nodes

E:1  i:1  y:1  l:1  k:1  .:1  r:2  s:2  n:2  a:2  sp:4  e:8
Building a Tree
 While the priority queue contains two or more nodes:
 Create new node
 Dequeue node and make it left subtree
 Dequeue next node and make it right subtree
 Frequency of new node equals sum of frequency of left and
right children
 Enqueue new node back into queue
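The loop above maps naturally onto a binary min-heap. A Python sketch (not the slides' code) using the example text's frequencies:

```python
import heapq

freq = {'E': 1, 'i': 1, 'y': 1, 'l': 1, 'k': 1, '.': 1,
        'r': 2, 's': 2, 'n': 2, 'a': 2, ' ': 4, 'e': 8}

# Each heap entry is (frequency, tie-breaker, tree). A leaf is just a
# character; an internal node is a (left, right) pair. The tie-breaker
# keeps the heap from ever comparing the trees themselves.
heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
heapq.heapify(heap)

count = len(heap)
while len(heap) > 1:
    f1, _, left = heapq.heappop(heap)    # dequeue lowest-frequency node
    f2, _, right = heapq.heappop(heap)   # dequeue the next lowest
    count += 1
    # enqueue a new node whose frequency is the sum of its children
    heapq.heappush(heap, (f1 + f2, count, (left, right)))

root_freq, _, tree = heap[0]
print(root_freq)   # 26 — equals the number of characters in the text
```

Ties among equal frequencies may be broken differently than in the slides, giving a different but equally optimal tree.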
Building a Tree
 Combine E:1 and i:1 into a new node of weight 2
 Queue: y:1 l:1 k:1 .:1 (Ei):2 r:2 s:2 n:2 a:2 sp:4 e:8
Building a Tree
 Combine y:1 and l:1 into a new node of weight 2
 Queue: k:1 .:1 (Ei):2 (yl):2 r:2 s:2 n:2 a:2 sp:4 e:8
Building a Tree
 Combine k:1 and .:1 into a new node of weight 2
 Queue: (Ei):2 (yl):2 (k.):2 r:2 s:2 n:2 a:2 sp:4 e:8
Building a Tree
 Combine r:2 and s:2 into a new node of weight 4
 Queue: (Ei):2 (yl):2 (k.):2 n:2 a:2 (rs):4 sp:4 e:8
Building a Tree
 Combine n:2 and a:2 into a new node of weight 4
 Queue: (Ei):2 (yl):2 (k.):2 (rs):4 (na):4 sp:4 e:8
Building a Tree
 Combine (Ei):2 and (yl):2 into a new node of weight 4
 Queue: (k.):2 (rs):4 (na):4 (Eiyl):4 sp:4 e:8
Building a Tree
 Combine (k.):2 and sp:4 into a new node of weight 6
 Queue: (rs):4 (na):4 (Eiyl):4 (k. sp):6 e:8
 What is happening to the characters with a low number of occurrences?
Building a Tree
 Combine (rs):4 and (na):4 into a new node of weight 8
 Queue: (Eiyl):4 (k. sp):6 (rsna):8 e:8
Building a Tree
 Combine (Eiyl):4 and (k. sp):6 into a new node of weight 10
 Queue: (rsna):8 e:8 (Eiyl k. sp):10
Building a Tree
 Combine (rsna):8 and e:8 into a new node of weight 16
 Queue: (Eiyl k. sp):10 (rsna e):16
Building a Tree
 Combine (Eiyl k. sp):10 and (rsna e):16 into the root node of weight 26
 After enqueueing this node there is only one node left in the priority queue
Building a Tree
 Dequeue the single node left in the queue
 This tree contains the new code words for each character
 The frequency of the root node should equal the number of characters in the text
 Eerie eyes seen near lake. → 26 characters
Encoding the File
Traverse Tree for Codes
 Perform a traversal of the tree to obtain the new code words
 Going left is a 0, going right is a 1
 A code word is only complete when a leaf node is reached
Encoding the File
Traverse Tree for Codes
Char    Code
E       0000
i       0001
y       0010
l       0011
k       0100
.       0101
space   011
e       10
r       1100
s       1101
n       1110
a       1111
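The finished tree can be written down as nested pairs and walked recursively to recover exactly this table (a sketch: leaves are characters, internal nodes are (left, right) tuples):

```python
# The Huffman tree from the example: leaves are characters,
# internal nodes are (left, right) pairs.
tree = (((('E', 'i'), ('y', 'l')),        # weight-4 subtree
         (('k', '.'), ' ')),              # weight-6 subtree (with space)
        ('e',                             # e sits two edges from the root
         (('r', 's'), ('n', 'a'))))       # weight-8 subtree

def codes(tree, prefix=""):
    if isinstance(tree, str):                 # leaf: code word is complete
        return {tree: prefix}
    left, right = tree
    table = codes(left, prefix + "0")         # going left is a 0
    table.update(codes(right, prefix + "1"))  # going right is a 1
    return table

table = codes(tree)
print(table['e'])   # 10
print(table[' '])   # 011
print(table['E'])   # 0000
```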
Encoding the File
 Rescan the text and encode the file using the new code words
Eerie eyes seen near lake. becomes:
0000101100000110011
1000101011010111101
1010111001111101011
1111000110011111101
00100101
 Why is there no need for a separator character?
Encoding the File
Results
 Have we made things any better?
 84 bits to encode the text
 ASCII would take 8 * 26 = 208 bits
Decoding the File
 How does the receiver know what the codes are?
 The tree is constructed for each text file
 It considers the character frequencies of that file
 Once the receiver has the tree, it scans the incoming bit stream
 0 → go left
 1 → go right
 A character is output each time a leaf is reached
 e.g. 0000 10 1100 0001 10 011 → E e r i e space
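The receiver's walk can be simulated with the code table alone, since a prefix code lets us grow a buffer bit by bit until it matches a complete code word (a sketch, not the slides' code):

```python
code = {'E': '0000', 'i': '0001', 'y': '0010', 'l': '0011',
        'k': '0100', '.': '0101', ' ': '011',  'e': '10',
        'r': '1100', 's': '1101', 'n': '1110', 'a': '1111'}
decode_map = {v: k for k, v in code.items()}

# Encode the example text, then decode the resulting bit stream
bits = ''.join(code[c] for c in "Eerie eyes seen near lake.")
print(len(bits))   # 84

out, buf = [], ""
for b in bits:
    buf += b                   # follow one branch of the tree
    if buf in decode_map:      # reached a leaf: emit the character
        out.append(decode_map[buf])
        buf = ""

print(''.join(out))   # Eerie eyes seen near lake.
```

Because no code word is a prefix of another, the buffer match is unambiguous, which is exactly why no separator character is needed.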
Practical considerations
 It is not practical to create a Huffman encoding for a
single short string, such as ABRACADABRA
 To decode it, you would need the code table
 If you include the code table with the message,
the whole thing is bigger than just the ASCII message
 Huffman encoding is practical only if the encoded string is
large relative to the code table
Example
 Assume that the relative frequencies of letters occurring in a
certain text are:
 A: 40
 B: 20
 C: 10
 D: 10
 R: 20

 Write down the codes for the above symbols using the
Huffman coding scheme.
 A=0
B = 100
C = 1010
D = 1011
R = 11
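A quick check of this answer (a sketch; ties in the heap can produce a different but equally optimal code, so only the total cost is compared):

```python
import heapq

freq = {'A': 40, 'B': 20, 'C': 10, 'D': 10, 'R': 20}
answer = {'A': '0', 'B': '100', 'C': '1010', 'D': '1011', 'R': '11'}

# Weighted length of the answer: 40*1 + 20*3 + 10*4 + 10*4 + 20*2
answer_bits = sum(freq[c] * len(code) for c, code in answer.items())
print(answer_bits)   # 220 bits per 100 symbols, i.e. 2.2 bits/symbol

# Any Huffman tree has the same optimal cost: the sum of all merge weights
heap = list(freq.values())
heapq.heapify(heap)
total = 0
while len(heap) > 1:
    merged = heapq.heappop(heap) + heapq.heappop(heap)
    total += merged
    heapq.heappush(heap, merged)
print(total)         # 220 — the answer's code is optimal
```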
Huffman Code Construction
Character counts in the text:
Char   Freq
E      125
T       93
A       80
O       76
I       73
N       71
S       65
R       61
H       55
L       41
D       40
C       31
U       27
Huffman Code Construction
 Repeatedly combine the two lowest-frequency nodes:
 C:31 + U:27 → 58
 D:40 + L:41 → 81
 H:55 + (CU):58 → 113
 R:61 + S:65 → 126
 N:71 + I:73 → 144
 O:76 + A:80 → 156
 (DL):81 + T:93 → 174
 (CUH):113 + E:125 → 238
 (RS):126 + (NI):144 → 270
 (OA):156 + (DLT):174 → 330
 (CUHE):238 + (RSNI):270 → 508
 (OADLT):330 + (CUHERSNI):508 → 838 (the root)
Huffman Code Construction
The resulting code, compared with a 4-bit fixed-length code:
Char   Freq   Fixed   Huff
E      125    0000    110
T       93    0001    011
A       80    0010    000
O       76    0011    001
I       73    0100    1011
N       71    0101    1010
S       65    0110    1001
R       61    0111    1000
H       55    1000    1111
L       41    1001    0101
D       40    1010    0100
C       31    1011    11100
U       27    1100    11101
Total  838    4.00    3.62   (average bits per character)
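The 3.62 average in the table can be verified directly from the frequencies and code lengths:

```python
freq = {'E': 125, 'T': 93, 'A': 80, 'O': 76, 'I': 73, 'N': 71, 'S': 65,
        'R': 61, 'H': 55, 'L': 41, 'D': 40, 'C': 31, 'U': 27}
huff = {'E': '110', 'T': '011', 'A': '000', 'O': '001', 'I': '1011',
        'N': '1010', 'S': '1001', 'R': '1000', 'H': '1111', 'L': '0101',
        'D': '0100', 'C': '11100', 'U': '11101'}

total_chars = sum(freq.values())                       # 838 characters
huff_bits = sum(freq[c] * len(huff[c]) for c in freq)  # total encoded bits

print(total_chars)                        # 838
print(round(huff_bits / total_chars, 2))  # 3.62, vs 4.00 fixed-length
```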
Summary
 Huffman coding is a technique used to compress
files for transmission
 Uses statistical coding
 more frequently used symbols have shorter code words
 Works well for text and fax transmissions
 An application that uses multiple data structures
