
International Journal of Scientific Research Engineering & Technology (IJSRET), ISSN 2278-0882, Volume 4, Issue 4, April 2015

A Modified Image Fusion Using Guided Filtering and Bilinear Interpolation

Leya G^1, Biju V. G^2
1 (M Tech Scholar, Department of Computer Science, College of Engineering Munnar, Kerala)
2 (Associate Professor, Department of Electronics, College of Engineering Munnar, Kerala)

ABSTRACT
A modified image fusion method using guided filtering is proposed in this paper. In this method, a guided filter is used to decompose the source images into a base layer and a detail layer. A weight map of the base layer is computed by applying a gaussian and laplacian pyramidal approach. A fused base layer is computed from the weight map of the base layer, and a fused detail layer is computed from the detail layers obtained in the decomposition step. The fused base layer and the fused detail layer are combined to obtain the fused image, and bilinear interpolation is applied to obtain the resultant fused image. The performance of the fusion method is evaluated using the normalized mutual information, Yang's and Cvejic's metrics. The results show that the proposed fusion method performs better than the existing image fusion using guided filtering (IFUGF).

Keywords: Gaussian pyramid, Guided filtering, Image fusion, Interpolation, Laplacian pyramid

1. INTRODUCTION
Image fusion can be defined as the process of combining multiple input images into a single fused image without introducing distortion or losing information. It aims at integrating complementary as well as redundant information from the multiple images. The newly created fused image should therefore contain an accurate description of the source images and be applicable to various image processing applications such as target recognition and feature extraction.
Many image fusion methods have been developed. The basic image fusion methods were simple average, simple maximum, PCA and DWT [1-4]. However, these methods do not give a clear fused image. Generally, fusion methods are classified as data-driven, multi-scale and optimization-based fusion. Data-driven and multi-scale image fusion methods focus on different data representations, such as data-driven coefficients [5] and multi-scale coefficients [6]. These two classes of methods can preserve the details of the different source images well, but they may produce color distortion in the fused image. Optimization-based fusion methods focus on solving an energy function for the fusion; their main disadvantage is that they over-smooth the fused image.

In Section 2, the existing image fusion using guided filtering (IFUGF) is discussed. In Section 3, the proposed image fusion method is described. Section 4 describes the databases and the objective image fusion quality metrics used. Results and discussion are presented in Section 5. Finally, Section 6 concludes the paper.

2. IMAGE FUSION USING GUIDED FILTERING (IFUGF)
Fig 1 shows the block diagram of image fusion using guided filtering [7]. An average filter is used to decompose the source images into a base layer and a detail layer. Then the base and detail layers were fused using guided filtering.

Fig 1: Block diagram of image fusion using guided filtering.
The different steps are:
2.1 Decomposition of image into two-scale
As shown in Fig 1, the source images were decomposed into a base layer and a detail layer. The base layer was obtained by applying average filtering to the source images:

$$B_n = I_n * Z \qquad (1)$$

where $I_n$ is the $n$-th source image and $Z$ is the average filter. After getting the base layer, the detail layer can be


easily obtained by subtracting the base layer from the source images:

$$D_n = I_n - B_n \qquad (2)$$
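As an illustration, the following minimal Python sketch implements the two-scale decomposition of eqs. (1) and (2) for a grayscale image; the 31x31 averaging window is an illustrative assumption, since the paper does not specify the window size here:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def two_scale_decompose(image, size=31):
    """Two-scale decomposition: B_n = I_n * Z (eq. 1), D_n = I_n - B_n (eq. 2).

    `size` is the side length of the averaging window Z; 31 is an
    illustrative choice, not a value fixed by the paper.
    """
    image = image.astype(np.float64)
    base = uniform_filter(image, size=size)  # average filtering (eq. 1)
    detail = image - base                    # detail layer (eq. 2)
    return base, detail
```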
2.2 Construction of weight map with guided filtering
In order to obtain the weight map, a saliency map was constructed first, as follows: a high-pass image was constructed by applying laplacian filtering to the source images, and the saliency map was then constructed by applying gaussian filtering to the local average of the high-pass image.

Next, the weight map $P_n$ was obtained by comparing the saliency maps:

$$P_n^k = \begin{cases} 1 & \text{if } S_n^k = \max\left(S_1^k, S_2^k, \ldots, S_N^k\right) \\ 0 & \text{otherwise} \end{cases} \qquad (3)$$

The refined weight maps were then obtained by applying guided filtering to the weight maps, with the corresponding source images as guidance:

$$W_n^B = G_{r_1,\epsilon_1}(P_n, I_n) \qquad (4)$$

$$W_n^D = G_{r_2,\epsilon_2}(P_n, I_n) \qquad (5)$$

where $r_1$, $\epsilon_1$, $r_2$ and $\epsilon_2$ were the parameters of the guided filter, and $W_n^B$ and $W_n^D$ were the resulting weight maps of the base and the detail layer.
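A minimal sketch of the saliency map and the weight-map comparison of eq. (3) follows; the Laplacian/Gaussian parameters and the use of the absolute high-pass response are illustrative assumptions. The refinement of eqs. (4)-(5) then applies the guided filter described in Section 2.2.1:

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter, gaussian_filter

def saliency_map(image, avg_size=11, sigma=5.0):
    """Gaussian-filtered local average of a laplacian high-pass image.
    Window size and sigma are assumptions, not values from the paper."""
    high_pass = np.abs(laplace(image.astype(np.float64)))
    return gaussian_filter(uniform_filter(high_pass, size=avg_size), sigma)

def binary_weight_maps(images):
    """Eq. 3: P_n^k = 1 where source n has the largest saliency at pixel k."""
    saliency = np.stack([saliency_map(img) for img in images])
    winners = np.argmax(saliency, axis=0)
    return [(winners == n).astype(np.float64) for n in range(len(images))]
```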

2.2.1 Guided Image Filtering
There are several edge-preserving filters, such as the guided filter [8], the weighted least squares filter [9] and the bilateral filter [10]. In this proposed fusion method, the decomposition is done using guided filtering.

The filtering process involves an input image P, a guidance image I and an output image O. The guided filter incorporates additional information from the given guidance image; the guidance image can be the input image itself. The output of the guided filter is a linear transform of the guidance image I in a local window $w_k$ centered at a pixel k:

$$O_i = a_k I_i + b_k, \quad \forall i \in w_k \qquad (6)$$

where $w_k$ is a square window of size $(2r+1) \times (2r+1)$. The linear coefficients $a_k$ and $b_k$ are constant in the window $w_k$ and can be solved directly by linear regression:

$$a_k = \frac{\frac{1}{|w|} \sum_{i \in w_k} I_i P_i - \mu_k \bar{P}_k}{\sigma_k^2 + \epsilon} \qquad (7)$$

$$b_k = \bar{P}_k - a_k \mu_k \qquad (8)$$

where $\epsilon$ is the regularization parameter, $\mu_k$ and $\sigma_k^2$ are the mean and variance of I in the window $w_k$, $|w|$ is the number of pixels in the window $w_k$, and $\bar{P}_k$ is the mean of P in the window $w_k$. The filtering output is:

$$\bar{a}_i = \frac{1}{|w|} \sum_{k \in w_i} a_k \qquad (9)$$

$$\bar{b}_i = \frac{1}{|w|} \sum_{k \in w_i} b_k \qquad (10)$$

$$O_i = \bar{a}_i I_i + \bar{b}_i \qquad (11)$$

where $\bar{a}_i$ and $\bar{b}_i$ are the averages of the coefficients over all windows $w_k$ that contain pixel i. The two parameters r and $\epsilon$ determine the filter size and the blur degree of the guided filter.
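The guided filter of eqs. (6)-(11) can be sketched with box filters as below; the default r and eps are only example settings:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(P, I, r=8, eps=0.01):
    """Guided filter [8] for 2-D float arrays: input P, guidance I.
    The default r and eps are example settings only."""
    P, I = P.astype(np.float64), I.astype(np.float64)
    size = 2 * r + 1                                  # window (2r+1)x(2r+1)
    mean_I = uniform_filter(I, size)                  # mu_k
    mean_P = uniform_filter(P, size)                  # P-bar_k
    corr_IP = uniform_filter(I * P, size)
    var_I = uniform_filter(I * I, size) - mean_I**2   # sigma_k^2
    a = (corr_IP - mean_I * mean_P) / (var_I + eps)   # eq. 7
    b = mean_P - a * mean_I                           # eq. 8
    mean_a = uniform_filter(a, size)                  # eq. 9
    mean_b = uniform_filter(b, size)                  # eq. 10
    return mean_a * I + mean_b                        # eq. 11
```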
2.3 Reconstruction of two-scale image
The fused base layer and the fused detail layer can be obtained as:

$$\bar{B} = \sum_{n=1}^{N} W_n^B B_n \qquad (12)$$

$$\bar{D} = \sum_{n=1}^{N} W_n^D D_n \qquad (13)$$

The fused image can then be easily obtained by combining the fused base layer $\bar{B}$ and the fused detail layer $\bar{D}$:

$$F = \bar{B} + \bar{D} \qquad (14)$$
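Given the sketches above, the IFUGF reconstruction of eqs. (12)-(14) is a pair of per-pixel weighted sums; renormalizing the weight maps to sum to one is an added assumption for numerical safety:

```python
import numpy as np

def fuse_ifugf(bases, details, weights_B, weights_D):
    """Eqs. 12-14: weighted sums of the base and detail layers.
    All arguments are lists of equally shaped 2-D float arrays."""
    WB = np.stack(weights_B); WB /= WB.sum(axis=0) + 1e-12  # assumption
    WD = np.stack(weights_D); WD /= WD.sum(axis=0) + 1e-12  # assumption
    B = sum(w * b for w, b in zip(WB, bases))    # eq. 12
    D = sum(w * d for w, d in zip(WD, details))  # eq. 13
    return B + D                                 # eq. 14
```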


3. PROPOSED METHOD
Fig 2 shows the block diagram of the modified image fusion using guided filtering and bilinear interpolation.

2.2.1 Guided Image Filtering


There are several edge-preserving filters such as guided
filter [8], weight-least square [9] and bilateral filter [10].
In this proposed fusion method, the decomposition is
done by using guided filtering.
The filtering process involves an input image P, a
guidance image I and an output image O. The guided
filter incorporates the additional information from a
given guidance image. The guidance image can be the
input image itself. The output of the guided filter is a
linear transform of the guidance image I in a local
window wk centered at a pixel k. The filtering output is
obtained as:
Oi ak I i bk
i wk
6
where wk is a square window of size 2r 1 2r 1 .
The linear coefficients a k and bk are constant in the
window wk . The coefficients can be directly solved by
using the linear regression. The equation is:
1
I P k P k

w iwk i i
ak
7
k
bk P k ak k

Fig 2: Block diagram of the modified image fusion using guided filtering and bilinear interpolation.
The proposed method is explained as follows.
3.1 Decomposition of image into two-scale
The purpose of the two-scale decomposition is to separate each source image into a base layer and a detail layer. Guided filtering is used for the image


decomposition. The base layer of each source image is obtained by applying guided filtering to the source images:

$$B_n = I_n * G_{r,\epsilon} \qquad (15)$$

Here $I_n$ is the $n$-th source image and G is the guided filter. The values of r and $\epsilon$ are set to 2 and 0.01, respectively. The detail layer is obtained by subtracting the base layer from the source images:

$$D_n = I_n - B_n \qquad (16)$$
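Using the guided_filter sketch from Section 2.2.1, the proposed decomposition of eqs. (15)-(16) might look as follows; using the source image as its own guidance is an assumption based on the filter's definition:

```python
def two_scale_guided(image, r=2, eps=0.01):
    """Eqs. 15-16 with r = 2 and eps = 0.01 as in the text.
    Relies on the guided_filter sketch from Section 2.2.1;
    self-guidance (I as its own guide) is an assumption."""
    base = guided_filter(image, image, r=r, eps=eps)  # eq. 15
    return base, image - base                         # eq. 16
```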


3.2 Construction of weight map using laplacian and gaussian pyramid
The gaussian pyramid is constructed by low-pass filtering and down-sampling the preceding level of the pyramid. The base level of the gaussian pyramid is the input image. Each level is constructed by low-pass filtering the preceding image with a filter and removing every other pixel from the result. This process is repeated to form the gaussian pyramid $G_0, G_1, G_2, \ldots, G_d$, defined recursively as follows:

$$G_0(x, y) = I(x, y) \quad \text{for level } l = 0 \qquad (17)$$

$$G_l(x, y) = \sum_m \sum_n w(m, n)\, G_{l-1}(2x + m,\, 2y + n) \quad \text{otherwise} \qquad (18)$$

where $w(m, n)$ is the weighting function, also called the generating kernel. The value of l lies between 0 and d, where d is the number of levels in the pyramid. After obtaining the gaussian pyramid, the laplacian pyramid can be constructed by expanding the image $G_1$ to the same size as $G_0$ and subtracting, which yields the image $L_0$. A laplacian pyramid $L_0, L_1, L_2, \ldots, L_{d-1}$ can be obtained as:

$$L_l = G_l - \tilde{G}_{l+1}, \quad l = 0, 1, \ldots, d-1 \qquad (19)$$

The expanded image $\tilde{G}_{l+1}$ is obtained as:

$$\tilde{G}_{l+1}(x, y) = 4 \sum_m \sum_n w(m, n)\, G_{l+1}\!\left(\frac{x - m}{2},\, \frac{y - n}{2}\right) \qquad (20)$$

where only the terms for which $(x-m)/2$ and $(y-n)/2$ are integers contribute to the sum.
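A compact pyramid sketch using OpenCV, whose pyrDown/pyrUp built-in 5-tap kernel plays the role of the generating kernel w(m, n); keeping the coarsest gaussian level as the pyramid residual is a standard convention assumed here:

```python
import numpy as np
import cv2

def gaussian_pyramid(image, levels):
    """Eqs. 17-18: low-pass filter and drop every other pixel, repeatedly."""
    G = [image.astype(np.float64)]
    for _ in range(levels):
        G.append(cv2.pyrDown(G[-1]))
    return G

def laplacian_pyramid(image, levels):
    """Eqs. 19-20: L_l = G_l - expand(G_{l+1})."""
    G = gaussian_pyramid(image, levels)
    L = [G[l] - cv2.pyrUp(G[l + 1], dstsize=G[l].shape[1::-1])
         for l in range(levels)]
    return L + [G[-1]]  # coarsest gaussian level kept as residual
```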

In IFUGF, the weight maps are noisy and hence the fused image is not clear, so the pyramidal approach is used here for constructing the weight map. The proposed fusion method also uses three quality measures of the base layer, namely contrast, saturation and exposure, in order to get a clear fused image.

In this step, the laplacian pyramid of the base layer, $L\{B_n\}^l$, is generated first. The gaussian pyramid $G\{W_n\}^l$ is then built from the weight map $W_n$, which is represented as the product of the three quality measures:

$$W_n = C_n \cdot S_n \cdot E_n \qquad (21)$$

Contrast: the contrast measure $C_n$ of the base layer is computed by applying a laplacian filter to the grayscale version of the base layer.
Saturation: the saturation measure $S_n$ of the base layer is computed as the standard deviation across the red, green and blue channels.
Exposure: the exposure measure $E_n$ of the base layer is obtained by applying the Gaussian curve $\exp\left(-\frac{(i - 0.5)^2}{2\sigma^2}\right)$, with $\sigma$ set to 0.2, to each color channel and multiplying the results.

The fused pyramid is then easily obtained by multiplying $L\{B_n\}^l$ with the corresponding $G\{W_n\}^l$:

$$L^l = L\{B_n\}^l \, G\{W_n\}^l \qquad (22)$$
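A sketch of the per-image quality weight map of eq. (21) for an RGB base layer scaled to [0, 1]; the absolute laplacian response and the small stabilizing epsilon are implementation assumptions:

```python
import numpy as np
import cv2

def quality_weight_map(base_rgb, sigma=0.2):
    """Eq. 21: W_n = C_n * S_n * E_n for one base layer in [0, 1]."""
    base = base_rgb.astype(np.float32)
    gray = cv2.cvtColor(base, cv2.COLOR_RGB2GRAY)
    contrast = np.abs(cv2.Laplacian(gray, cv2.CV_32F))         # C_n
    saturation = base.std(axis=2)                              # S_n
    exposure = np.prod(np.exp(-((base - 0.5) ** 2)
                              / (2 * sigma ** 2)), axis=2)     # E_n
    return contrast * saturation * exposure + 1e-12            # eps: assumption
```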

3.3 Reconstruction of two-scale image
In order to reconstruct the image, the fused base layer $\bar{B}$ is constructed as:

$$\bar{B} = \sum_{l=0}^{d} L^l \qquad (23)$$

The fused detail layer $\bar{D}$ is obtained as:

$$\bar{D} = \frac{\lambda \sum_{k=1}^{N} D_k}{N} \qquad (24)$$

where $\lambda$ is a parameter defined by the user, set here to 5, and N is the number of source images. Finally, the fused image is obtained as:

$$F = \bar{B} + \bar{D} \qquad (25)$$
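Assuming the blended pyramid of eq. (22) has been built per level (summed over the source images), eqs. (23)-(25) can be sketched as below; collapsing the pyramid coarse-to-fine with pyrUp is one concrete reading of the sum in eq. (23):

```python
import cv2

def reconstruct_fused(fused_pyramid, details, lam=5.0):
    """Eqs. 23-25. `fused_pyramid` is the blended laplacian pyramid,
    `details` the list of detail layers D_k, lam the user parameter."""
    B = fused_pyramid[-1]
    for level in reversed(fused_pyramid[:-1]):            # eq. 23
        B = cv2.pyrUp(B, dstsize=level.shape[1::-1]) + level
    D = lam * sum(details) / len(details)                 # eq. 24
    return B + D                                          # eq. 25
```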
3.4 Applying Bilinear Interpolation to the fused image
The proposed method applies bilinear interpolation to the fused image in order to obtain the resultant fused image. The bilinear interpolation aims at improving and enhancing the quality of the fused image: each pixel in the grid is replaced by a weighted average of the four closest pixels.
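The paper does not specify the resampling grid, so the sketch below shows only the per-location bilinear rule, i.e. the weighted average of the four nearest pixels:

```python
import numpy as np

def bilinear_sample(image, x, y):
    """Bilinear interpolation: a weighted average of the four pixels
    nearest to the (possibly fractional) location (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, image.shape[1] - 1)
    y1 = min(y0 + 1, image.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * image[y0, x0] + fx * image[y0, x1]
    bottom = (1 - fx) * image[y1, x0] + fx * image[y1, x1]
    return (1 - fy) * top + fy * bottom
```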

4. DATABASES AND OBJECTIVE IMAGE FUSION QUALITY METRICS USED
In order to evaluate the effectiveness of the proposed image fusion method, different databases are considered.
4.1 Databases Used
The databases used are: 1) the multi-focus image database [7], which contains 10 pairs of multi-focus images; 2) the multi-exposure image database [7], which contains 2


pairs of color multi-exposure images and 8 pairs of multi-modal images; and 3) natural images taken under different exposure settings.
4.2 Objective Image Fusion Quality Metrics
In order to evaluate the fusion performance, three fusion quality metrics are used: the normalized mutual information ($Q_{MI}$), Yang's ($Q_Y$) and Cvejic's ($Q_C$) metrics.
4.2.1 Normalized mutual information ($Q_{MI}$) metric
This metric uses the mutual information [11] and is defined as:

$$Q_{MI} = 2\left(\frac{MI(A, F)}{H(A) + H(F)} + \frac{MI(B, F)}{H(B) + H(F)}\right) \qquad (26)$$

where H(A), H(B) and H(F) are the marginal entropies of A, B and F, and MI(A, F) is the mutual information between the source image A and the fused image F, defined as:

$$MI(A, F) = H(A) + H(F) - H(A, F) \qquad (27)$$

where H(A, F) denotes the joint entropy of A and F; MI(B, F) is defined analogously. The value of the normalized mutual information metric ranges from 0 to 1.
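A sketch of eq. (26) with entropies estimated from 256-bin histograms of 8-bit images; the bin count and range are implementation assumptions:

```python
import numpy as np

def entropy(hist):
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def q_mi(A, B, F, bins=256):
    """Eq. 26, with entropies from histograms of 8-bit images
    (bin count and value range are assumptions)."""
    def H(img):
        return entropy(np.histogram(img, bins=bins, range=(0, 256))[0])
    def H_joint(x, y):
        return entropy(np.histogram2d(x.ravel(), y.ravel(),
                                      bins=bins, range=[[0, 256]] * 2)[0])
    mi_af = H(A) + H(F) - H_joint(A, F)   # eq. 27
    mi_bf = H(B) + H(F) - H_joint(B, F)
    return 2 * (mi_af / (H(A) + H(F)) + mi_bf / (H(B) + H(F)))
```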
4.2.2 Yang's metric ($Q_Y$)
This metric uses the structural similarity (SSIM) [12] for measuring the fusion performance. It is defined as follows:

$$Q_Y = \begin{cases} \lambda_w\, SSIM(A_w, F_w) + (1 - \lambda_w)\, SSIM(B_w, F_w) & \text{if } SSIM(A_w, B_w) \ge 0.75 \\ \max\left\{SSIM(A_w, F_w),\, SSIM(B_w, F_w)\right\} & \text{if } SSIM(A_w, B_w) < 0.75 \end{cases} \qquad (28)$$

where w is a 7x7 window, A and B are the source images, F is the fused image, and SSIM is the structural similarity index. The local weight $\lambda_w$ is calculated as follows:

$$\lambda_w = \frac{s(A_w)}{s(A_w) + s(B_w)} \qquad (29)$$

where $s(A_w)$ and $s(B_w)$ are the variances of the source images A and B within the window w. The value of Yang's metric ranges between 0 and 1.
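A sketch of eqs. (28)-(29) using scikit-image's SSIM maps; pooling the per-window scores by a plain mean is an assumption, as the paper does not state the pooling rule:

```python
import numpy as np
from scipy.ndimage import uniform_filter
from skimage.metrics import structural_similarity

def _local_variance(img, size=7):
    mean = uniform_filter(img, size)
    return uniform_filter(img * img, size) - mean ** 2

def q_y(A, B, F, win=7, data_range=255):
    """Eqs. 28-29, averaged over all 7x7 windows (pooling assumed)."""
    ssim = lambda x, y: structural_similarity(
        x, y, win_size=win, data_range=data_range, full=True)[1]
    s_af, s_bf, s_ab = ssim(A, F), ssim(B, F), ssim(A, B)
    var_a = _local_variance(A.astype(np.float64), win)
    var_b = _local_variance(B.astype(np.float64), win)
    lam = var_a / (var_a + var_b + 1e-12)             # eq. 29
    score = np.where(s_ab >= 0.75,
                     lam * s_af + (1 - lam) * s_bf,   # eq. 28, first case
                     np.maximum(s_af, s_bf))          # eq. 28, second case
    return float(score.mean())
```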
4.2.3 Cvejic's metric ($Q_C$)
This metric uses the universal image quality index (UIQI) [13] and is defined as follows:

$$Q_C = \mu(A_w, B_w, F_w)\, UIQI(A_w, F_w) + \left(1 - \mu(A_w, B_w, F_w)\right) UIQI(B_w, F_w) \qquad (30)$$

where w is a 7x7 window, A and B are the source images, F is the fused image, and UIQI is the universal quality index. The value of $\mu(A_w, B_w, F_w)$ is calculated as follows:

$$\mu(A_w, B_w, F_w) = \begin{cases} 0 & \text{if } \dfrac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} < 0 \\[6pt] \dfrac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} & \text{if } 0 \le \dfrac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} \le 1 \\[6pt] 1 & \text{if } \dfrac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} > 1 \end{cases} \qquad (31)$$

where $\sigma_{AF}$ and $\sigma_{BF}$ are the covariances between A and F, and between B and F, respectively. Cvejic's metric ranges from 0 to 1.
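A sketch of eqs. (30)-(31) with local statistics computed by box filtering; mean pooling over all windows and the stabilizing epsilon are implementation assumptions:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def _stats(x, y, size=7):
    mx, my = uniform_filter(x, size), uniform_filter(y, size)
    vx = uniform_filter(x * x, size) - mx ** 2
    vy = uniform_filter(y * y, size) - my ** 2
    cxy = uniform_filter(x * y, size) - mx * my
    return mx, my, vx, vy, cxy

def q_c(A, B, F, size=7, eps=1e-12):
    """Eqs. 30-31 with the local UIQI [13]; mean pooling assumed."""
    A, B, F = (img.astype(np.float64) for img in (A, B, F))

    def uiqi(x, y):
        mx, my, vx, vy, cxy = _stats(x, y, size)
        return 4 * cxy * mx * my / ((vx + vy) * (mx**2 + my**2) + eps)

    *_, cov_af = _stats(A, F, size)
    *_, cov_bf = _stats(B, F, size)
    mu = np.clip(cov_af / (cov_af + cov_bf + eps), 0.0, 1.0)        # eq. 31
    return float((mu * uiqi(A, F) + (1 - mu) * uiqi(B, F)).mean())  # eq. 30
```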

5. RESULTS AND DISCUSSION
The efficiency of the proposed fusion method is compared with the existing image fusion using guided filtering. The proposed method is applied to the different databases and to natural images taken under different exposure settings. The results are shown in Table 1. Analysis of the normalized mutual information ($Q_{MI}$) metric shows that the proposed method preserves the original information of the source images better than IFUGF: the $Q_{MI}$ metric is 18% better than for IFUGF. Analysis of Yang's ($Q_Y$) metric shows that the proposed method maintains the structural information of the source images well: the $Q_Y$ metric is 0.5% better than for the existing image fusion using guided filtering. Finally, Cvejic's ($Q_C$) metric is 41% better than for the existing method, so the proposed method introduces fewer distortions into the fused image than IFUGF.
Table 1: Comparison of the proposed method (PM) with IFUGF

INPUTS | Q_MI: IFUGF   PM    | Q_Y: IFUGF   PM    | Q_C: IFUGF   PM
IMG 1  |       0.686   0.871 |      0.996   0.999 |      0.245   0.686
IMG 2  |       0.648   0.768 |      0.997   0.999 |      0.421   0.730
IMG 3  |       0.821   0.889 |      0.998   0.999 |      0.411   0.821
IMG 4  |       0.773   0.916 |      0.997   0.999 |      0.452   0.717
IMG 5  |       0.636   0.780 |      0.991   1.000 |      0.384   0.923
IMG 6  |       0.271   0.702 |      0.984   1.000 |      0.301   0.812

The two multi-exposure images taken as input images are shown in Fig 3(a) and Fig 3(b). The existing image fusion method using guided filtering is applied to the multi-exposure images, and the result is

shown in Fig 3(c). The resultant fused image of the proposed fusion method is shown in Fig 3(d).


Fig 3(a): source image 1.   Fig 3(b): source image 2.
Fig 3(c): existing fused image.   Fig 3(d): resultant fused image.

The two multi-focus images in the database, shown in Fig 4(a) and Fig 4(b), are taken as the source images. The existing fused image is shown in Fig 4(c), and the resultant fused image obtained from the proposed method is shown in Fig 4(d).

Fig 4(a): source image 1.   Fig 4(b): source image 2.
Fig 4(c): existing fused image.   Fig 4(d): resultant fused image.

6. CONCLUSION
This paper puts forward an image fusion method that uses guided filtering and performs bilinear interpolation on the fused image. The guided filter is used to make full use of the strong correlation between neighboring pixels. The results show that the proposed method can well preserve the original and the complementary information of the input images. The proposed method is efficient and suitable for various real applications.

REFERENCES

[1] S. Li, J. Kwok, I. Tsang, and Y. Wang, "Fusing images with different focuses using support vector machines," IEEE Trans. Neural Netw., vol. 15, no. 6, pp. 1555-1561, 2004.
[2] G. Pajares and J. M. de la Cruz, "A wavelet-based image fusion tutorial," Pattern Recognit., vol. 37, no. 9, pp. 1855-1872, 2004.
[3] D. Looney and D. Mandic, "Multiscale image fusion using complex extensions of EMD," IEEE Trans. Signal Process., vol. 57, no. 4, pp. 1626-1630, 2009.
[4] M. Kumar and S. Dass, "A total variation-based algorithm for pixel-level image fusion," IEEE Trans. Image Process., vol. 18, no. 9, pp. 2137-2143, 2009.
[5] J. Liang, Y. He, D. Liu, and X. Zeng, "Image fusion using higher order singular value decomposition," IEEE Trans. Image Process., vol. 21, no. 5, pp. 2898-2909, 2012.
[6] P. Burt and E. Adelson, "The Laplacian pyramid as a compact image code," IEEE Trans. Commun., vol. 31, no. 4, pp. 532-540, 1983.
[7] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Trans. Image Process., vol. 22, no. 7, 2013.
[8] K. He, J. Sun, and X. Tang, "Guided image filtering," in Proc. Eur. Conf. Comput. Vis., Heraklion, Greece, 2010, pp. 1-14.
[9] Z. Farbman, R. Fattal, D. Lischinski, and R. Szeliski, "Edge-preserving decompositions for multi-scale tone and detail manipulation," ACM Trans. Graph., vol. 27, no. 3, pp. 67:1-67:10, 2008.
[10] F. Durand and J. Dorsey, "Fast bilateral filtering for the display of high-dynamic-range images," ACM Trans. Graph., vol. 21, no. 3, pp. 257-266, 2002.
[11] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electron. Lett., vol. 44, no. 18, pp. 1066-1067, 2008.
[12] Z. Wang, A. Bovik, H. Sheikh, and E. Simoncelli, "Image quality assessment: From error visibility to structural similarity," IEEE Trans. Image Process., vol. 13, no. 4, pp. 600-612, 2004.
[13] Z. Wang and A. Bovik, "A universal image quality index," IEEE Signal Process. Lett., vol. 9, no. 3, pp. 81-84, 2002.
