
International Journal on Recent and Innovation Trends in Computing and Communication
Volume: 2 Issue: 7    ISSN: 2321-8169    pp. 2002-2006

_______________________________________________________________________________________________

Performance Measurement of Huffman Coding based Improved SPIHT Image Compression Algorithm
Mr. Amarsinh B. Farakte
Electronics & Telecommunication Engg., SGMCOE, Mahagaon, Gadhinglaj (MH), India
amar_farakte@rediffmail.com

Mr. S. B. Patil
Electronics & Telecommunication Engg., D.Y.P.C.E.T., Kasaba Bawada, Kolhapur (MH), India
s_b_patil200@rediffmail.com

Miss Mohini B. Rasale
Electronics & Telecommunication Engg., D.Y.P.C.E.T., Kasaba Bawada, Kolhapur (MH), India
mohini_rasale@yahoo.co.in

Abstract- Recent developments in image processing require compression algorithms with high speed and better performance in terms of quality metrics. In this paper, an improved SPIHT algorithm based on Huffman coding for image compression is presented. The results are successfully tested on different grey-scale images by changing the compression rate; the best results are obtained with the decomposition level kept at five. The traditional SPIHT algorithm degrades performance in terms of PSNR and compression ratio, whereas the Huffman coding based improved SPIHT algorithm performs better. The input grey-scale image is decomposed by a wavelet filter of type bior4.4, the resulting wavelet coefficients are indexed and scanned by the DWT, and the coefficients are then encoded by the SPIHT encoder. The generated image code vector of type UINT8 is encoded again by a Huffman encoder, which gives the compressed image in the form of encoded data. At the other side, the reverse process is carried out to obtain the reconstructed image.
Keywords- PSNR, MSE, Compression Ratio, DWT, IDWT.

__________________________________________________*****_________________________________________________
I. INTRODUCTION
Image compression addresses the reduction in the amount of data required to represent a digital image. Image compression is achieved by removing one or more of three basic data redundancies:
(1) Coding redundancy, which is present when less-than-optimal code words are used;
(2) Inter-pixel redundancy, which results from correlation between the pixels of an image;
(3) Psychovisual redundancy, which is due to data that goes unnoticed by the human visual system.
Multimedia files (graphics, audio and video) require substantial storage capacity and broadcast bandwidth. Despite the rapid progress in digital communication system performance, the demand for data storage capacity and data transmission bandwidth continues to surpass the capabilities of available compression schemes. The recent growth of data-demanding multimedia web applications has not only sustained the need for more efficient ways to encode signals and images but has made compression of such signals essential for communication technology.
Huffman coding is a form of statistical coding. The codes contain the smallest possible number of code symbols per source symbol (e.g. grey-level value) subject to the constraint that the source symbols are coded one at a time. In Huffman coding, more frequently used symbols receive shorter code words, and the technique works well for text and fax transmission. Huffman coding gives better performance in terms of compression ratio and PSNR when combined with SPIHT.
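The idea that frequent symbols get shorter code words can be sketched in a few lines of Python. This is a generic illustration of Huffman code construction, not the paper's implementation, and the toy pixel stream is made up:

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a Huffman code table: more frequent symbols get shorter codes."""
    freq = Counter(data)
    # Heap entries are (frequency, tie-breaker, tree); a tree is either a
    # bare symbol (leaf) or a (left, right) pair of subtrees.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate single-symbol input
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)      # merge the two least frequent
        f2, _, t2 = heapq.heappop(heap)      # subtrees into one node
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):                  # assign 0/1 along each branch
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

pixels = [10, 10, 10, 10, 20, 20, 30, 40]    # toy grey-level stream
codes = huffman_codes(pixels)
encoded = "".join(codes[p] for p in pixels)  # 14 bits vs 8 x 8 = 64 as UINT8
```

Here the most frequent grey level (10) gets a 1-bit code while the rare levels get 3-bit codes, so the 8-symbol stream packs into 14 bits instead of 64.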
This paper discusses the combined compression scheme. Section II summarizes present compression schemes. The Huffman coding based improved SPIHT algorithm is proposed in Section III, quality evaluation is covered in Section IV, and Section V presents results and discussion. Conclusions are presented in Section VI.

II. PRESENT COMPRESSION SCHEMES


There are different types of image compression techniques, each with its own application area and advantages; some of them are discussed below.
A. Embedded Zerotree Wavelet (EZW)
The EZW compression algorithm was one of the first algorithms to demonstrate the full power of wavelet-based image compression. It was introduced in the groundbreaking paper of Shapiro [1]. An EZW encoder is specifically designed for use with wavelet transforms. It was originally designed to operate on images (2-D signals) but can also be used on signals of other dimensions [2], [3].
B. Set Partitioning In Hierarchical Trees (SPIHT)
The SPIHT method is not a simple extension of traditional image compression methods; it represents an important advancement in the field. The basic principle is the same: progressive coding is applied, processing the image against a successively lowering threshold. The difference lies in the idea of zerotrees (spatial orientation trees in SPIHT), which takes into consideration the bonds between coefficients across subbands at different levels [4]. If a coefficient at the highest level of the transform in a particular subband is insignificant against a particular threshold, it is very probable that its descendants at lower levels will be insignificant too [5]. Therefore a fairly large group of coefficients can be coded with one symbol.
The method provides the following:
Good image quality and high PSNR, principally for color images;
Progressive image transmission;
A completely embedded coded file;
A simple quantization algorithm;
Lossless compression;
Exact bit-rate or distortion control;
Efficient combination with error protection.

IJRITCC | July 2014, Available @ http://www.ijritcc.org

What makes SPIHT really excellent is that it yields all of those qualities simultaneously.
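The zerotree principle described above (an insignificant subtree can be coded with one symbol) can be sketched as a recursive significance test. The tree layout below is a made-up miniature, not the actual spatial orientation trees that SPIHT builds from wavelet subbands:

```python
def significant(coeff, children, node, threshold):
    """True if `node` or any descendant has magnitude >= threshold.

    When this is False for a whole subtree, a coder like SPIHT can emit a
    single zerotree symbol instead of coding every coefficient in it."""
    if abs(coeff[node]) >= threshold:
        return True
    return any(significant(coeff, children, child, threshold)
               for child in children.get(node, ()))

# Toy two-level tree: root 0 has children 1-4, and node 1 has children 5-8.
coeff = {0: 3, 1: -1, 2: 0, 3: 1, 4: -2, 5: 0, 6: 1, 7: 0, 8: -1}
children = {0: (1, 2, 3, 4), 1: (5, 6, 7, 8)}
```

Against a threshold of 8 the whole tree is insignificant, so one symbol would cover all nine coefficients; against a threshold of 2 the root itself is already significant.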

C. Wavelet Difference Reduction (WDR)
The WDR compression scheme, introduced by Tian and Wells, uses the ROI concept. It works by encoding the locations of significant wavelet transform values. WDR gives better perceptual image quality than SPIHT and requires no searching through quadtrees as in SPIHT, but it suits only low-resolution medical images at low bpp rates [6]. The WDR algorithm is less complex and provides better preservation of edges.
D. Set Partitioned Embedded bloCK coder (SPECK)
The SPECK image compression scheme differs from the schemes mentioned above. It does not use code trees that span, and exploit the similarity across, different subbands; rather, it makes use of sets in the form of blocks. The main concept is to exploit the clustering of energy in frequency and space in the hierarchical structures of transformed images. The SPECK algorithm belongs to the class of scalar-quantized significance-testing algorithms. It has its roots primarily in the concepts developed in the SPIHT algorithm and in a few block coding algorithms.

Figure 1 Wavelet Transform based Image Compression

III. IMPROVED SPIHT ALGORITHM

The traditional SPIHT algorithm uses a spatial tree structure [7]. The improved SPIHT algorithm works on the same principle. The image wavelet coefficients are encoded by the SPIHT encoder to generate the bit stream. The bit stream is then applied to the Huffman encoder to generate the encoded data in the form of a 1-D string. Huffman coding requires fewer bits to represent the encoded data, which increases the compression ratio to a great extent.
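To see why the extra Huffman stage helps, one can count the bits needed for a skewed symbol stream. Below, `code_vector` is a made-up stand-in for the UINT8 output of the SPIHT encoder (the real SPIHT bit stream is not shown), and the bit count uses the standard identity that the total Huffman-coded length equals the sum of all internal-node weights in the Huffman tree:

```python
import heapq
from collections import Counter

def huffman_total_bits(symbols):
    """Total coded length in bits under an optimal Huffman code.

    Uses the identity: coded length = sum of the merged (internal) node
    weights produced while building the Huffman tree."""
    freq = list(Counter(symbols).values())
    if len(freq) == 1:
        return len(symbols)          # one distinct symbol: 1 bit each
    heapq.heapify(freq)
    total = 0
    while len(freq) > 1:
        a, b = heapq.heappop(freq), heapq.heappop(freq)
        total += a + b               # every symbol in this subtree gains a bit
        heapq.heappush(freq, a + b)
    return total

# Skewed toy stand-in for a SPIHT code vector stored as UINT8.
code_vector = [0] * 60 + [1] * 20 + [2] * 10 + [3] * 10
fixed_bits = 8 * len(code_vector)              # 800 bits at 8 bits per symbol
coded_bits = huffman_total_bits(code_vector)   # 160 bits
```

Here Huffman coding cuts the stream from 800 to 160 bits, a five-fold gain; the real gain depends on how skewed the SPIHT output distribution actually is.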
A. Wavelet Transform
Wavelet compression defines a way of analyzing uncompressed image data in a recursive manner, resulting in a sequence of higher-resolution images, each adding to the information content of the lower-resolution ones. The major steps in wavelet compression are performing the Discrete Wavelet Transform (DWT), quantizing the wavelet-space subband information, and then encoding this information. Wavelet images are not themselves compressed images; it is the quantization and encoding blocks that perform the compression and store the compressed image. Image compression using wavelets inherently yields a set of multi-resolution images, so it is well suited to large imagery that must be selectively viewed at different resolutions, since only the levels containing the required level of detail need to be decompressed. Figure 1 shows the block diagram of the wavelet-based compression scheme.
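The recursive decomposition can be sketched as follows, using a one-level Haar transform as a simple stand-in for the paper's bior4.4 biorthogonal filter (the function names `haar_dwt2` and `wavedec2` are ours, chosen for illustration):

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar transform; a simple stand-in for bior4.4."""
    img = img.astype(float)
    # Filter the rows: average / difference of adjacent pixel pairs.
    lo = (img[:, 0::2] + img[:, 1::2]) / 2
    hi = (img[:, 0::2] - img[:, 1::2]) / 2
    # Filter the columns of each result the same way.
    ll = (lo[0::2] + lo[1::2]) / 2               # approximation sub-band
    lh = (lo[0::2] - lo[1::2]) / 2               # detail sub-bands
    hl = (hi[0::2] + hi[1::2]) / 2
    hh = (hi[0::2] - hi[1::2]) / 2
    return ll, (lh, hl, hh)

def wavedec2(img, level):
    """Multi-level decomposition: re-transform the LL band `level` times,
    as in the paper's 5-level decomposition."""
    details = []
    ll = img
    for _ in range(level):
        ll, bands = haar_dwt2(ll)
        details.append(bands)
    return ll, details

img = np.ones((64, 64))                # flat toy image, 64x64
ll, details = wavedec2(img, 5)         # 64 -> 2x2 approximation after 5 levels
```

Each level halves the image in both directions, so five levels turn a 64x64 input into a 2x2 approximation plus five sets of detail sub-bands; for a flat image all detail coefficients are zero, which is exactly the sparsity the SPIHT encoder exploits.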

B. Flow Chart of Improved SPIHT Compression
Figure 2 Main Flow Chart of the Compression Process

C. Decompression Process
The decompression process reverses the compression process to produce the reconstructed image, as shown in Figure 5. The reconstructed image may have lost some information due to compression and may show error or distortion compared with the original image, but this is not the case with SPIHT: almost all the image information is reconstructed after the decompression process, and the reconstructed image looks similar to the original one.

V. RESULTS
The results below were obtained on the 512x512 Man image and a 512x512 Chest medical image; any image size can be used. For the compression/decompression speed measurements, performance was tested on an Intel Core 2 Duo processor with 2 GB RAM and a 150 GB HDD.
TABLE I. PERFORMANCE OF TRADITIONAL SPIHT ALGORITHM

Traditional SPIHT Algorithm with Decomposition Level = 5

Man Image 512x512:
Comp. Rate | PSNR (dB) | CR    | Compression Time | Decompression Time
0.1        | 23.9      | 16.72 | 1.87 Sec.        | 3.46 Sec.
0.2        | 26.39     | 8.36  | 4.05 Sec.        | 4.30 Sec.
0.3        | 27.94     | 5.57  | 6.91 Sec.        | 6.61 Sec.
0.4        | 29.34     | 4.18  | 10.32 Sec.       | 9.15 Sec.
0.5        | 30.54     | 3.34  | 14.02 Sec.       | 12.51 Sec.
0.6        | 31.4      | 2.78  | 21.82 Sec.       | 16.81 Sec.

Chest Medical Image 512x512:
Comp. Rate | PSNR (dB) | CR    | Compression Time | Decompression Time
0.1        | 37.77     | 19.56 | 2.20 Sec.        | 3.03 Sec.
0.2        | 42.52     | 9.77  | 3.71 Sec.        | 3.89 Sec.
0.3        | 45.14     | 6.51  | 5.72 Sec.        | 5.54 Sec.
0.4        | 46.79     | 4.88  | 9.62 Sec.        | 8.80 Sec.
0.5        | 48.16     | 3.91  | 12.81 Sec.       | 10.98 Sec.
0.6        | 49.17     | 3.25  | 17.88 Sec.       | 16.14 Sec.

TABLE II. PERFORMANCE OF IMPROVED SPIHT ALGORITHM

Improved SPIHT Algorithm with Decomposition Level = 5

Man Image 512x512:
Comp. Rate | PSNR (dB) | CR    | Compression Time | Decompression Time
0.1        | 25.53     | 53.38 | 3.52 Sec.        | 3.59 Sec.
0.2        | 28.2      | 25.84 | 4.24 Sec.        | 5.20 Sec.
0.3        | 29.21     | 16.91 | 7.63 Sec.        | 8.95 Sec.
0.4        | 31.17     | 12.67 | 12.80 Sec.       | 14.36 Sec.
0.5        | 32.43     | 10.03 | 15.62 Sec.       | 16.54 Sec.
0.6        | 32.52     | 8.35  | 23.56 Sec.       | 24.57 Sec.

Chest Medical Image 512x512:
Comp. Rate | PSNR (dB) | CR    | Compression Time | Decompression Time
0.1        | 39.17     | 63.65 | 2.18 Sec.        | 3.30 Sec.
0.2        | 44.16     | 30.25 | 3.93 Sec.        | 4.52 Sec.
0.3        | 47.06     | 19.89 | 6.44 Sec.        | 6.67 Sec.
0.4        | 48.72     | 14.75 | 10.25 Sec.       | 10.74 Sec.
0.5        | 49.74     | 11.78 | 13.20 Sec.       | 13.52 Sec.
0.6        | 51.11     | 9.73  | 20.04 Sec.       | 19.02 Sec.

Figure 3 Main Flow Chart of the Decompression Process

IV. QUALITY METRICS
The performance of the proposed scheme is gauged by different parameters. The visual quality of the reconstructed image is measured by the Peak Signal-to-Noise Ratio (PSNR), and the agreement between the original and reconstructed images is monitored through the corresponding Mean Square Error (MSE) values. The original and extracted images are further shown for different compression rates. The experimental results show that the proposed image compression algorithm gives better performance than the traditional SPIHT algorithm.
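The two quality metrics can be computed directly from their definitions; the small worked example below uses a uniform error of 5 grey levels and illustrative numbers, not values from the paper's experiments:

```python
import numpy as np

def mse(original, reconstructed):
    """Mean Square Error between two equally sized images."""
    diff = original.astype(float) - reconstructed.astype(float)
    return float(np.mean(diff ** 2))

def psnr(original, reconstructed, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB for 8-bit images."""
    err = mse(original, reconstructed)
    if err == 0:
        return float("inf")          # identical images: infinite PSNR
    return 10.0 * np.log10(peak ** 2 / err)

orig = np.full((4, 4), 100, dtype=np.uint8)
recon = np.full((4, 4), 105, dtype=np.uint8)   # every pixel off by 5
error = mse(orig, recon)                       # 25.0
quality = psnr(orig, recon)                    # about 34.15 dB
```

A uniform error of 5 grey levels gives an MSE of 25 and a PSNR of roughly 34 dB, which is in the same general range as the table entries above.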

Tables I and II compare the traditional SPIHT algorithm with the Huffman coding based improved SPIHT image compression algorithm for varying compression rates. The proposed algorithm is also well suited to medical images. The compression/decompression speed can be enhanced with a more advanced processor. The wavelet filter type used is 'bior4.4'.

The figures below show the performance measurement of the traditional SPIHT and improved SPIHT algorithms. Figure 6 shows the graphs of compression rate versus compression ratio and compression rate versus PSNR value in dB.

Figure 4 Original Medical Chest Image & Man 512x512 Image

Figure 5 Reconstructed Man Image for compression rates from 0.1 to 0.6: (a) 0.1, (b) 0.2, (c) 0.3, (d) 0.4, (e) 0.5, (f) 0.6

Figure 5 shows the reconstructed images for different values of the compression rate. It is observed that as the compression rate increases, the PSNR value also increases, which results in better quality of the reconstructed image. The change in compression rate also affects the compression ratio: as the compression rate increases, the compression ratio decreases, because a higher compression rate requires more data to represent the compressed image.

Figure 6 Traditional SPIHT and Improved SPIHT Results: a) Compression Rate vs Compression Ratio; b) Compression Rate vs PSNR in dB; c) Compression Rate vs Compression Ratio; d) Compression Rate vs PSNR in dB

VI. CONCLUSION
In this paper we presented a Huffman coding based improved SPIHT algorithm for image compression. The performance of the traditional SPIHT and improved SPIHT algorithms is compared in terms of PSNR, compression ratio, and compression/decompression speed. The results show that the improved approach enhances the quality parameters in all aspects and gives a compression ratio almost three times greater than that of traditional SPIHT.

REFERENCES
[1] Jerome M. Shapiro, "Embedded Image Coding Using Zerotrees of Wavelet Coefficients," IEEE Transactions on Signal Processing, Vol. 41, No. 12, pp. 3445-3462, Dec. 1993.
[2] Asad Islam and William A. Pearlman, "An Embedded and Efficient Low-Complexity Hierarchical Image Coder," Visual Communications and Image Processing, Proceedings of SPIE, Vol. 3653, pp. 294-305, January 1999.
[3] James S. Walker, "Wavelet-Based Image Compression," in The Transform and Data Compression Handbook, CRC Press LLC, 2001.
[4] A. Abu-Hajar and R. Sankar, "Enhanced Partial-SPIHT for Lossless and Lossy Image Compression," ICASSP 2003.
[5] Sanjay H. Dabhole, Virajit A. Gundale and Johan Potgieter, "An Efficient Modified Structure of CDF 9/7 Wavelet based on Adaptive Lifting with SPIHT for Lossy to Lossless Image Compression," International Conference on Signal Processing, Image Processing and Pattern Recognition, 2013.
[6] R. Sudhakar, R. Karthiga and S. Jayaraman, "Image Compression using Coding of Wavelet Coefficients - A Survey," ICGST-GVIP Journal, Volume 5, Issue 6, pp. 25-38, June 2005.
[7] Amir Said and William A. Pearlman, "A New, Fast and Efficient Image Codec Based on Set Partitioning in Hierarchical Trees," IEEE Transactions on Circuits and Systems for Video Technology, Vol. 6, No. 3, pp. 243-250, June 1996.