4th International Conference on Electrical Engineering (ICEE 2015)
IGEE, Boumerdes, December 13-15, 2015

Adjustment of Active Contour Parameters in Brain MRI Segmentation Using Evolution Strategies
Belgrana Fatima Zohra
Department of Mathematics and Informatics, University of Ain Tmouchent Belhadj Bouchaib.
Ain Tmouchent, Algeria.
belgrana_fatimazohra@yahoo.fr

Benamrane Nacera
Department of Informatics, Faculty of Mathematics and Informatics, University of Sciences and Technology of Oran Mohamed Boudiaf, USTO-MB.
Oran, Algeria.
nacera.benamrane@univ-usto.dz, nabenamrane@yahoo.com

Taleb Ahmed Abdelmalik
IUT GE2I, Laboratoire LAMIH UMR CNRS UVHC 8530, Université de Valenciennes.
Abdelmalik.Taleb-Ahmed@univ-valenciennes.fr

Abstract—In this paper, we propose an approach for segmentation of brain Magnetic Resonance Images (MRI) using an active contour model. We adopted the Greedy algorithm, a simple and effective method. However, it requires an adjustment of the functional energy parameters. To overcome this disadvantage, we propose a hybridization with an optimization method: the Evolution Strategy (ES), which has proved its success in numerical optimization. Experimental results on brain MRI images reveal a clear efficiency of this hybridization.

Keywords—Magnetic Resonance Image; Segmentation; Greedy algorithm; Active Contour; Evolutionary Algorithm; Evolutionary Strategy.

I. INTRODUCTION
Imaging technology in medicine gives valuable help to doctors. It allows them to see the internal parts of the body for easier diagnosis, and it also helps in guiding or performing surgical interventions. Magnetic Resonance Imaging (MRI), Computed Tomography, Digital Mammography, and other imaging modalities provide effective means for non-invasive mapping of the anatomy of a subject.
Segmentation is the process of partitioning an image into multiple segments (sets of pixels) to simplify its representation. It is typically used to identify objects or other relevant information in digital images. These specific segments are then analyzed and interpreted to detect the presence of eventual pathologies. Segmentation also alerts radiologists to the locations of suspicious lesions and provides them with a second reading to reduce misdiagnosis [1][2]. Therefore, segmentation has become an essential part of the clinical work routine, and a large number of different approaches have been developed over the last few decades.
Active contours have been used for image segmentation and boundary tracking. They were introduced by Kass in 1987 [6] in the context of edge-based segmentation.

Unlike most other contour-finding techniques, this model is active and the boundaries detected are continuous. It is based on the minimization of a functional energy associated with the curve: the problem of finding an object boundary is solved as an energy minimization process.
There are two general types of active contour models in the literature today: parametric [4] and geometric active contours [5, 6]. The first proposed model was a parametric one named the snake [6]. Afterwards, a number of methods were proposed to improve the snake's performance [7][8].
In this paper, we focus on the parametric model. Several teams have been interested in this approach. Amini introduced the method of dynamic programming [9], whereas Williams and Shah used a variational approach, completing the classic approach with the Greedy algorithm [10]. This algorithm is faster than Amini's model, easy to implement, and very efficient. It is an iterative algorithm that proceeds with a notion of neighborhood; however, as in every parametric model, the weighting coefficients of the functional energy have an important impact on the quality of the segmentation results.
The use of an optimization method is therefore required. Purely analytic models have shown their limits; methods combining mathematical analysis and random exploration allow this limitation to be overcome: Evolutionary Algorithms (EAs) [11], a generic population-based metaheuristic optimization technique. EAs are based on an analogy with Darwin's natural evolution [12], and they include similar techniques that differ in implementation details and in the nature of the treated problem [13][14].
The idea of this work is to realize a segmentation of brain MR images to simplify their interpretation. We use in our implementation a Greedy algorithm and resolve the problem of parameterization with evolution strategies.

© 2015 IEEE

The present paper is organized as follows: Section II reviews the principles of the adopted methods, namely the parametric active contour and the evolutionary algorithm. Section III provides a detailed description of the proposed hybridization between the Greedy algorithm and evolution strategies. Section IV shows the experimental results that demonstrate the effectiveness of our proposed approach. Finally, Section V provides a summary and concluding notes.

II. ADOPTED TECHNIQUES

A. Active Contour
The active contour model is based upon the utilization of deformable contours which conform to various object shapes and motions; it relies on a mathematical formulation.
There are two major approaches to active contours: the explicit approach, represented by the snake, and the implicit one, referring to geodesic methods. The snake explicitly moves the active contour points based on an energy minimization scheme, while level set approaches move the contours implicitly as a particular level of a function.
The snake is a parametric curve v(s) which tries to move into a position where its energy is minimized. Three main energies were proposed by Kass [6]. They are combined, with associated weights, in the following energy functional:

E_snake = ∫ [ E_int(v(s)) + E_image(v(s)) + E_con(v(s)) ] ds    (1)

1) Internal Energy (E_int):
It depends on the intrinsic properties of the curve. It is the sum of the elastic and bending energies:

E_int = (1/2) [ α(s) |v'(s)|² + β(s) |v''(s)|² ]    (2)

v'(s) and v''(s) are terms denoting the first and second derivatives of v with respect to s, where s is the spatial position of the snake points. α and β are real parameters controlling respectively the tension and the rigidity of the contour.
The internal energy controls the regularization of the active contour, while the external energy (E_ext) corresponds to the adequacy with the data. In fact, it denotes a scalar function defined on the image plane whose local minima attract the snake to edges. Different types of E_ext exist in the literature, such as gradient, intensity, or Gradient Vector Flow (GVF) [15]. Common edge attraction functions are given by the following formulas:

Gradient:  E_grad(x, y) = −|∇[G_σ ∗ I](x, y)|²    (3)
Intensity: E_intens(x, y) = c · [G_σ ∗ I](x, y)    (4)

where G_σ denotes a Gaussian smoothing filter with standard deviation σ, I the image, and c a suitably chosen constant.
The context energy, which allows the introduction of a priori information about the image, was developed by L. D. Cohen [16]. The most used is the Balloon energy (E_balloon), which guides the contour curve to expand or to shrink in order to fit the target boundaries. It represents a pressure force pushing outwards, as if air were introduced inside, or a retracting force, depending on the sign used, as expressed in the following equation:

F_balloon = k · n(s)    (5)

where k is a constant indicating the amplitude of this force and n(s) is a unit vector normal to the curve.

B. Evolutionary Algorithm
Evolutionary algorithms are an imitation of Darwin's natural evolution. They are iterative, stochastic optimization techniques, since they iteratively use a random process. They evolve a set of solutions (a population) of a problem using an objective function, in order to find the best results.
It is assumed that an individual directly represents a point in the search space. However, in practice a coding and decoding process is generally required. After this stage, evolutionary operators modify the initial population through generations.
The first step of an EA is the random initialization of a population. An objective function, depending on the problem to be solved, is then evaluated for all individuals. New generations are created thanks to the evolutionary operators: mutation and recombination are applied on individuals selected according to their fitness, to produce offspring with a certain probability. Finally, the offspring fitness values are computed and the new generation of offspring replaces the old one. This cycle is performed until the optimization criteria are reached.
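The EA cycle just described can be sketched as follows. This is a generic illustration only: the population size, operator choices, and the toy sphere objective are ours, not from the paper, and we use Python here for readability although the paper's implementation was done in C++.

```python
import random

def evolutionary_algorithm(objective, dim, pop_size=20, generations=100):
    # 1. Random initialization of the population
    pop = [[random.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # 2. Evaluate the objective function for all individuals (lower is better)
        scored = sorted(pop, key=objective)
        # 3. Select the better half as parents (elitist truncation)
        parents = scored[:pop_size // 2]
        # 4. Create offspring by recombination followed by mutation
        offspring = []
        for _ in range(pop_size):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2.0 for x, y in zip(a, b)]          # intermediate recombination
            child = [x + random.gauss(0.0, 0.1) for x in child]    # Gaussian mutation
            offspring.append(child)
        # 5. The new generation of offspring replaces the old one
        pop = offspring
    return min(pop, key=objective)

best = evolutionary_algorithm(lambda v: sum(x * x for x in v), dim=3)
```

On this toy sphere function the loop steadily concentrates the population around the minimum at the origin, within the noise floor set by the fixed mutation step.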
Selection operator: the basic idea is to select better individuals for reproduction. Instances with a higher fitness value should be selected proportionally more often than the others. Many mechanisms for the selection operator are known in EAs, among them: generational and steady-state selection, elitist and non-elitist selection, sharing or crowding selection.
Recombination and mutation: these two operators are responsible for the creation of offspring. They introduce variation in the population. The offspring inherit some traits of their parents, but these operators also allow the creation of completely new traits. Different types of recombination exist, depending on the individual representation. For real-valued recombination we can cite intermediate, line, or extended recombination; for binary-valued recombination there are single-point, double-point, multi-point, uniform, and shuffle crossover.
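Two of the recombination variants mentioned above can be sketched as follows (an illustrative sketch in Python; the exact formulas are standard textbook forms, not taken from the paper):

```python
import random

def intermediate_recombination(a, b):
    # Real-valued recombination: each child component is the mean of both parents
    return [(x + y) / 2.0 for x, y in zip(a, b)]

def single_point_crossover(a, b):
    # Binary-style recombination: swap the tails after a random cut point
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:], b[:point] + a[point:]

c = intermediate_recombination([0.0, 2.0, 4.0], [2.0, 0.0, 0.0])
# c == [1.0, 1.0, 2.0]
```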
By mutation, individuals are randomly altered. This operator is needed to explore new areas of the search space and helps the search procedure avoid getting stuck in local optima. It is applied after the recombination process.
Based on the mutation steps and rates, two approaches exist, differing in how the mutation parameters are handled:
- they are constant during a whole evolutionary run (for the mutation of real variables and the mutation of binary variables);
- one or all parameters are adapted according to previous mutations. This is the specific mutation of the evolution strategy.
The family of evolutionary algorithms contains four principal members: the Genetic Algorithm (GA), Genetic Programming (GP), Evolutionary Programming (EP), and Evolution Strategies (ES). They have been successfully applied to numerous problems from different domains including optimization, automatic programming, machine learning, operations research, bioinformatics, and social systems. Each of these algorithms has its own field of application. That is why, in this study, we adopted ESs, which are the variant most commonly applied to black-box optimization problems in continuous search spaces and which correspond perfectly to the context of our problem.
III. PROPOSED HYBRID APPROACH

A. Greedy algorithm
The greedy approach is an energy-minimizing algorithm introduced for 2D contours [17]. It is a segmentation technique working like an elastic band being stretched around an object to detect it.
The first step of this algorithm is to initialize a set of points as the first contour around the feature to be extracted, which is explicitly defined. The algorithm is iterative, and during each iteration a point of the contour is substituted by the point which minimizes the functional energy in its local neighborhood.
The objective of this method is to find the set of points V = (v_1, ..., v_n) which minimizes the total energy of the contour:

E(V) = Σ_{i=1..n} E(v_i)    (6)

A variant making the algorithm even faster was proposed by Lam and Yan [17]: for a 3 × 3 neighborhood window (8 neighbors), they first examine only four of them. If one of these four improves the total energy, it is not necessary to go further; otherwise, the four remaining neighbors are examined. This increases the number of iterations needed to achieve convergence but decreases the computation time of each iteration.
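One greedy pass with the Lam-Yan shortcut can be sketched as follows. The `energy` callable is a stand-in for the weighted functional energy described in this section, and the code is an illustrative Python sketch rather than the authors' C++ implementation:

```python
def greedy_pass(points, energy):
    """One greedy iteration: move each snake point to the lowest-energy
    position in its 3x3 neighborhood. energy(p, i, points) scores a
    candidate position p for point i (stand-in for the functional energy)."""
    # Lam-Yan ordering: examine 4 neighbors first, the other 4 only if needed
    first = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    rest = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    moved = False
    for i, (x, y) in enumerate(points):
        best = (x, y)
        best_e = energy((x, y), i, points)
        improved = False
        for dx, dy in first:
            cand = (x + dx, y + dy)
            e = energy(cand, i, points)
            if e < best_e:
                best, best_e, improved = cand, e, True
        if not improved:  # none of the first four improved: examine the rest
            for dx, dy in rest:
                cand = (x + dx, y + dy)
                e = energy(cand, i, points)
                if e < best_e:
                    best, best_e = cand, e
        if best != (x, y):
            points[i] = best
            moved = True
    return moved  # False once the contour has converged
```

Calling `greedy_pass` repeatedly until it returns False reproduces the iterate-to-convergence behavior described above.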
This algorithm follows the variational approach to active contours. The form of the functional energy used is similar to that proposed by Williams and Shah [8]. They discretize the expression of the internal energy as:

E_int(v_i) = α_i |v_i − v_{i−1}|² + β_i |v_{i−1} − 2v_i + v_{i+1}|²    (7)

The user is free to add other energies, like the intensity or balloon energies defined in the previous section. Thus the total energy manipulated by the greedy algorithm is:

E_total = α·E_cont + β·E_curv + γ·E_grad + δ·E_intens + k·E_balloon    (8)

The parameters α, β, γ and δ are used to balance the influence of the different types of forces manipulated by the algorithm. They are fixed by the user, who takes into account the segmentation target of the analyzed image. In general, these parameters are determined by trial and error.
In this work, we introduce the principle of automatic parameter setting through the use of an evolutionary algorithm, more precisely an evolution strategy, a method that has proved its worth in numerical optimization.
B. Evolutionary Strategy
Evolution strategies, also known as evolutionary strategies, are search paradigms inspired by the principles of biological evolution, introduced by Ingo Rechenberg [11] and further developed by Hans-Paul Schwefel [14].
They belong to the family of evolutionary algorithms that address optimization problems. They are used in various fields of application and are most commonly applied to black-box optimization problems in continuous search spaces.
ESs are based on the evolution of a population of solutions of the treated problem. This evolution is guided by a fitness function which is optimized during the process. ESs ensure a search over the complete domain, and progressively, through generations, this search space is refined towards potentially powerful subspaces.
ESs differ from traditional optimization algorithms in some important respects:
- the search proceeds from one population of solutions to another, not from individual to individual;
- they use only objective function information, not derivatives;
- they use probabilistic, not deterministic, transition rules.
ESs are based on the application of mutation, recombination, and selection to populations of candidate solutions. The principle is to create, at every generation, children from a set of individuals using those operators. We mention that, in our case, the initial population is generated by a random number generator.
Child and parent individuals are named genotypes; after the coding step they are called phenotypes. The coding can be real-valued in ESs, which is an important asset making this approach well suited for parameter optimization.

1) Representation and modelisation:
ESs were mainly designed to solve purely numerical problems, which explains our choice. The natural representation of the control variables as an n-dimensional real-valued vector X is entirely appropriate:

X = (x_1, x_2, ..., x_n)    (9)

The difference with other evolutionary algorithms is that individuals in ESs contain not only their position in the search space but also some information about their mutation. The representation of a solution may include up to n different variances C_ii = σ_i² and up to n(n−1)/2 covariances C_ij (the general case). Several types of ESs can be created by exploiting the numbers of strategy parameters n_σ and n_α. We adopted the general case, where:

n_σ = n    (10)
n_α = n(n−1)/2    (11)

The covariance matrix Mc is generated as follows:

C_ii = σ_i²;  C_ij = (1/2)(σ_i² − σ_j²) tan(2 α_ij)  if i ≠ j    (12)

The correlated random vector N(0, Mc) is created by:

N(0, Mc) = [ Π_{i=1..n−1} Π_{j=i+1..n} Ro(α_ij) ] · PU,  with PU = N(0, σ)    (13)

where σ is the diagonal matrix of the standard deviations σ_i. The rotation matrix Ro(α_ij) = [ro_kl], for k, l = 1..n, is the identity matrix except for the elements:

ro_ii = ro_jj = cos(α_ij);  ro_ij = −ro_ji = −sin(α_ij)    (14)

An individual (genotype) of the ES is thus composed of its vector components, its variances, and its covariances:

Individual = [ x_1 ... x_n | variances | covariances ]

Fig.1. Presentation of an individual (genotype) of ESs

In our case x = (α, β, γ, δ), where α, β, γ and δ are the snake parameters.
2) Coding:
As in every evolutionary algorithm, the coding of individuals is an important step of the representation, depending on the optimization problem. Traditionally, GAs work on binary search spaces {0, 1}^n, whereas ESs work directly in R^n, which makes ESs more advantageous here.
3) ES operators:
a) Mutation:
We chose the most popular mutation operator: the Gaussian mutation, which modifies all components of the solution vector. This means that the variances σ_i and the rotation angles α_j are also modified during the mutation process, defined as follows:

x' = x + N(0, Mc)    (15)
σ_i' = σ_i · exp( τ'·N(0,1) + τ·N_i(0,1) ),  i = 1..n_σ
α_j' = α_j + β·N_j(0,1),  j = 1..n_α    (16)

where the N_i(0,1) are independent random samples from the standard normal distribution, β = 0.0873 (≈ 5°), and τ and τ' are two log-normal distribution factors.
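A simplified sketch of this self-adaptive mutation, in the uncorrelated (diagonal) special case where only the step sizes σ_i are adapted; the correlated case would additionally rotate the random vector by the angles α_ij. The defaults for τ and τ' below are commonly used values, which are our assumption rather than values given in the text:

```python
import math
import random

def es_mutation(x, sigma, tau=None, tau_prime=None):
    """Self-adaptive Gaussian mutation, uncorrelated case: the step sizes
    are mutated log-normally, then the object variables are perturbed
    using the new step sizes."""
    n = len(x)
    # Commonly used log-normal factors (assumed defaults, not from the paper)
    tau = tau if tau is not None else 1.0 / math.sqrt(2.0 * math.sqrt(n))
    tau_prime = tau_prime if tau_prime is not None else 1.0 / math.sqrt(2.0 * n)
    common = tau_prime * random.gauss(0.0, 1.0)  # one draw shared by all sigmas
    new_sigma = [s * math.exp(common + tau * random.gauss(0.0, 1.0)) for s in sigma]
    new_x = [xi + si * random.gauss(0.0, 1.0) for xi, si in zip(x, new_sigma)]
    return new_x, new_sigma
```

Because the step sizes are multiplied by an exponential factor, they stay strictly positive, which mirrors the positive-definiteness requirement of the full covariance formulation.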

It is important to note that the modification of the covariance values (in this case) keeps the matrix positive definite.
b) Crossover
In the general case, crossover consists of applying a procedure, with some probability, on selected individuals to create other individuals (children), which inherit some characteristics from their parents. This probability represents the crossover rate. It is related to other parameters such as the population size, the mutation rate, and the type of selection operator. The most used crossover rates lie in the interval [0.45, 0.95]. In our approach we focused on the mutation operator.
c) Selection
Every evolutionary algorithm needs a goal-oriented selection operator in order to guide the search towards promising individuals. Selection is thus the antagonist of the variation operators (also referred to as genetic operators), which are mutation and recombination.
The best-known selection strategy is the elitist policy: in the passage from one generation to another, a set of individuals is selected to survive by applying the rule "the best survive". This set of individuals is selected depending on the choice of the evolution scheme:
- The (μ + λ)-ES scheme: not only one offspring is created at a time or in a generation, but λ ≥ 1 descendants. To keep the population size constant, the λ worst individuals among the μ + λ are discarded.
- The (μ, λ)-ES scheme: the selection is realized among the λ offspring only. The parents are forgotten, no matter how good or bad their fitness was compared to that of the new generation. Obviously, this strategy relies on a birth surplus, i.e. λ > μ, in a strict Darwinian sense of natural selection.
We apply the (μ, λ)-ES scheme, since it is better for adapting to changes of the environment, which is the reason why ESs were originally developed. This makes our approach different from the other EAs, since every generation contains the best individuals of the offspring.
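The two schemes differ only in the pool from which survivors are drawn; a minimal sketch (fitness is minimized here, matching our objective):

```python
def select_mu_lambda(parents, offspring, mu, fitness):
    # (mu, lambda): parents are forgotten; survivors come from offspring only
    return sorted(offspring, key=fitness)[:mu]

def select_mu_plus_lambda(parents, offspring, mu, fitness):
    # (mu + lambda): survivors are the best of parents and offspring together
    return sorted(parents + offspring, key=fitness)[:mu]
```

With the (μ, λ) scheme a good parent can be lost between generations; this apparent waste is exactly what lets the strategy track a changing environment instead of clinging to stale optima.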
d) Adaptation fitness:
The fitness distribution is used to calculate some properties
of truncation selection. The state of the population is
completely described by the fitness values of all individuals.

C. Algorigram of the proposed hybridization
1. Initialization of chromosomes:
   1.1 We have four parameters α, β, γ and δ to set. The intervals of solution for each parameter are respectively {α_min, α_max}, {β_min, β_max}, {γ_min, γ_max}, {δ_min, δ_max}. The initial population of μ individuals x = (α, β, γ, δ) is generated randomly within these intervals.
   1.2 Initialization of the variances and covariances.
2. Repeat
   For i = 1 to λ:
      Mutation of the chromosome (creation of an offspring);
      Computation of the fitness function using the Greedy algorithm.
   Selection operator with the (μ, λ)-ES scheme;
   Replacement of the population of parents by the population of offspring.
Until termination (number of iterations reached or optimal detection realized).

The fitness function evaluates the result of the greedy algorithm convergence. It uses the functional energy parameters encoded by the chromosome.
We applied a supervised approach with a learning step on a set of images (a database). On each image we placed the contour points manually; they represent the optimal contour, defined with the help of an expert. We then try to minimize the area enclosed between the optimal contour and the contour obtained by the algorithm. The evaluation function of the evolution strategy thus uses the greedy algorithm to determine the fitness of a parameter set.
In our case, the purpose is to minimize the objective function: the lower the fitness of an individual, the better it is considered as a solution of the optimization problem.
Note that the functional energy alone could be trivially minimized by degenerate parameter settings (some weights set to zero), but such values cannot give a good detection. For this reason, good parameters are declared only when a good edge is obtained, i.e. when the snake points cover the desired contour, which must be known beforehand.
After the training step, a test step is required to validate the hybrid approach. The results of the implementation are presented in the next section.

Fig.2. Principal steps of our ESs (flowchart: generation of the initial population of x = (α, β, γ, δ) → Gaussian mutation → objective function (greedy functional energy) → (μ, λ)-ES selection → replacement → termination criterion → best individual / optimization solution)
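The supervised fitness of the learning step can be sketched as follows. Here `run_greedy` stands for a full greedy convergence with one candidate parameter set, and the area criterion is approximated by the mean squared distance from each obtained point to the nearest expert point; both names and the distance approximation are our simplifications, not the paper's exact implementation:

```python
def fitness(params, image, init_contour, expert_contour, run_greedy):
    """Lower is better: how far the converged contour lies from the
    expert-drawn optimal contour, for one candidate parameter set
    (alpha, beta, gamma, delta)."""
    # Converge the snake with the candidate weights (hypothetical helper)
    result = run_greedy(image, init_contour, params)
    total = 0.0
    for (x, y) in result:
        # Squared distance to the nearest expert contour point
        total += min((x - ex) ** 2 + (y - ey) ** 2 for (ex, ey) in expert_contour)
    return total / len(result)
```

A contour that lands exactly on the expert contour scores 0, so minimizing this fitness drives the ES toward parameter sets that reproduce the expert segmentation.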

IV. EXPERIMENTAL RESULTS

To study the performance of our approach and the impact of the proposed hybridization, we applied the developed algorithm to artificial images and to brain MRI images. The MRI data set was obtained from clinics. The images are in grayscale with a size of 256 × 256 pixels. The implementation of the proposed approach was realized in C++, using C++Builder, a rapid application development environment.
Active contours are sensitive to noise; therefore, we applied a median filter to the initial images. This filter considers each pixel in the image in turn and looks at its nearby neighbors to decide whether or not it is representative of its surroundings. It is used to eliminate unnecessary information, and it reduces noise while preserving useful detail in the image.
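A 3 × 3 median filter of the kind applied here can be sketched as follows (a plain-Python illustration; the actual implementation was done in C++Builder):

```python
def median_filter_3x3(img):
    """img: 2D list of grayscale values. Each interior pixel is replaced
    by the median of its 3x3 neighborhood; border pixels are kept as-is."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of the 9 values
    return out
```

Because the median ignores extreme values in the window, an isolated noisy pixel is removed while a genuine edge (where roughly half the window is bright) survives, which is exactly the noise-versus-detail trade-off described above.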
After the smoothing, two steps were realized. We started with the learning step in order to determine the optimal values of the functional energy parameters. During this step we concluded that the coefficients controlling the continuity, the curvature, and the gradient are more sensitive than those controlling the intensity or the balloon energy.
Once the parameters are set, we apply a test step on a set of images. The proposed method was first tested on a synthetic image (figure 3). It was also tested on brain MRI images (figures 4 and 5).

Fig.3. Evolution of the snake on an artificial image. (a) First edge initialization. (b) After 15 iterations. (c) After 50 iterations. Result of parameter setting: α = 0.1, β = 0.2, γ = 0.1, δ = 0.4.

Fig.4. Result of brain MRI image segmentation using the parameter setting α = 0.1, β = 0.1, γ = 0.2, δ = 0.2. (a) Initial image with median filter. (b) Initial image with first edge initialization. (c) Lesion detection.

Fig.5. Segmentation using parameters resulting from the learning approach. (a) Initial image with median filter. (b) Initial image with first edge initialization. (c) Lesion detection after 50 iterations.

V. CONCLUSION

Medical image segmentation is a fascinating and very important research field, which many researchers have investigated using different approaches.
The objective of our study is the optimization of the parametric active contour model. The purpose was to automate the setting of the functional energy parameters to optimal values allowing a good detection. We adopted an optimization method inspired by natural evolution.
We have proposed a hybrid approach for brain MRI segmentation. The Greedy algorithm was implemented for its efficiency, speed, and simplicity. We also used the evolution strategy, an EA well suited to numerical optimization. We adopted a supervised, global approach, since it requires knowing the "ideal" edge, which is associated with the "best" parameters in a learning step. We used the functional energy of the Greedy algorithm as a fitness function.
This case study allowed us to deduce the sensitivity of the parameters: the effective parameters associated with the internal and external energies are the most influential.
We demonstrated the performance of the proposed approach using brain MR images. The experimental results for the segmentation show the effectiveness of this hybridization.
ACKNOWLEDGMENT
We would like to thank the clinics which provided us with brain MR images. The authors are grateful for the help provided by Miss MABROUK Soumia in the realization of the proposed approach software.

REFERENCES
[1] W. P. Kegelmeyer Jr., "Computer Detection of Satellite Lesions in Mammograms," in Proc. SPIE Biomed. Image Processing, 1992, vol. 1660, pp. 446-454.
[2] J. Dehmeshki, "Automated Detection of Nodules in the CT Lung Images Using Multi-Modal Genetic Algorithm," in Proc. 3rd International Symposium on Image and Signal Processing and Analysis, 2003, pp. 393-398.
[3] M. Raman and A. Himanshu, "Study and Comparison of Various Image Edge Detection Techniques," International Journal of Image Processing (IJIP), vol. 3, no. 1, pp. 1-11, Dec. 2009.
[4] R. Malladi, J. A. Sethian, and B. C. Vemuri, "Shape modeling with front propagation: A level set approach," IEEE Trans. Pattern Anal. Machine Intell., vol. 17, no. 2, pp. 158-175, 1995.
[5] V. Caselles, F. Catte, T. Coll, and F. Dibos, "A geometric model for active contours," Numerische Mathematik, vol. 66, pp. 1-31, 1993.
[6] M. Kass, A. Witkin, and D. Terzopoulos, "Snakes: active contour models," International Journal of Computer Vision, Jan. 1987, pp. 321-331.
[7] L. D. Cohen and I. Cohen, "Finite-element methods for active contour models and balloons for 2-D and 3-D images," IEEE Trans. Pattern Anal. Mach. Intell., vol. 15, 1993, pp. 1131-1147.
[8] D. J. Williams and M. Shah, "A fast algorithm for active contours and curvature estimation," CVGIP: Image Understanding, vol. 55, Jan. 1991, pp. 14-26.
[9] V. Caselles, R. Kimmel, and G. Sapiro, "Geodesic Active Contours," International Journal of Computer Vision, vol. 22, no. 1, 1997, pp. 61-79.
[10] V. Caselles, R. Kimmel, and G. Sapiro, "Geodesic Active Contours," in Proc. Fifth International Conf. on Computer Vision (ICCV'95), Cambridge, MA, USA, June 1995, pp. 694-699.
[11] I. Rechenberg, Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution, Frommann-Holzboog, Stuttgart, 1973.
[12] C. Darwin, L'origine des espèces, Seuil, 2013.
[13] T. Bäck, G. Rudolph, and H. P. Schwefel, "Evolutionary programming and evolution strategies: Similarities and differences," in Proc. Second Annual Conference on Evolutionary Programming, February 1993.
[14] H. P. Schwefel, "Evolution strategies: A family of non-linear optimization techniques based on imitating some principles of organic evolution," Annals of Operations Research, vol. 1, no. 2, pp. 165-167, 1984.
[15] C. Xu and J. L. Prince, "Snakes, Shapes, and Gradient Vector Flow," IEEE Trans. on Image Processing, vol. 7, no. 3, March 1998, pp. 359-369.
[16] D. J. Williams and M. Shah, "A fast algorithm for active contours and curvature estimation," Computer Vision, Graphics, and Image Processing: Image Understanding, vol. 55, no. 1, pp. 14-26, 1992.
[17] K.-M. Lam and H. Yan, "Fast greedy algorithm for active contours," Electronics Letters, vol. 30, no. 1, 6 January 1994, pp. 21-23.
