
Fitness Sharing Genetic Algorithm with Self-adaptive Annealing Peaks Radii Control Method


Xinjie Yu
State Key Lab of Power Systems, Dept. of Electrical Engineering,
Tsinghua University, Beijing 100084, China
yuxj@tsinghua.edu.cn

Abstract. The fitness sharing genetic algorithm is one of the most commonly used
methods for multimodal optimization problems. The algorithm requires peak radii as
predefined parameters, yet in real-world applications it is very difficult for designers
and decision makers to estimate peak radii. A novel self-adaptive annealing peaks radii
control method is suggested in this paper to deal with this problem. Peak radii are coded
into the chromosomes and evolved while the fitness sharing genetic algorithm optimizes
the problem. Empirical results on benchmark problems show that the fitness sharing
genetic algorithm with the self-adaptive annealing peaks radii control method can find
and maintain nearly all peaks steadily. The method is especially suitable for problems
whose peak radii are difficult to estimate beforehand.

1 Introduction
It is well known that design and decision-making problems can be considered optimization
problems. In many engineering design instances, multiple optimal designs exist in the
solution space. Common optimization algorithms can at best find one optimal design. If
designers and decision makers can choose among multiple optimal designs and decisions,
there will be remarkable enhancements in the efficiency and quality of designs and decisions.
As a representative of modern optimization algorithms, genetic algorithms have achieved
success in function optimization, knowledge acquisition, and system simulation [4]. Genetic
operations are carried out on a group of individuals, so it is possible for genetic algorithms
to maintain all peaks of a multimodal optimization problem. However, with a finite population
size and improper selective pressure, genetic algorithms converge to only one solution [8].
Fitness sharing is an effective method that maintains the population diversity of genetic
algorithms so that multiple peaks in the solution space can be found [2, 5]. In multimodal
optimization with genetic algorithms, the sub-space around a peak in the solution space is
usually called a niche, and the individuals around a peak constitute a species. Fitness
sharing decreases the fitness of the individuals in a species according to the scope of the
niche: if there are too many individuals in a niche, their fitness decreases dramatically,
which encourages species with fewer individuals to survive. The standard fitness sharing
genetic algorithm applies the fitness sharing method before the selection stage of the
genetic algorithm.
There are two difficulties in applying the standard fitness sharing genetic algorithm to
real-world applications. One is that the fitness sharing method needs the peak radii
beforehand, and in real-world applications it is very hard to estimate the peak radii of a
multimodal optimization problem. The other is that every peak is assumed to have the same
radius in the fitness sharing method, which is generally not the case in real-world
applications.
Many improvements have been suggested to enhance the search ability and overcome the
shortcomings of the standard fitness sharing genetic algorithm. Representatives include the
clearing algorithm suggested by Petrowski [11], the adaptive k-means clustering with fitness
sharing algorithm suggested by Yin and Germay [13], the dynamic niche sharing algorithm
suggested by Miller and Shaw [10], and the adaptive niching algorithm suggested by Goldberg
and Wang [6]. These algorithms need either a minimal peak radius or the number of peaks in
the solution space, and they improve the performance of the standard fitness sharing genetic
algorithm in different ways. Sareni and Krahenbuhl suggest that adaptive peak radii control
would be a good approach for fitness sharing algorithms, but no such optimization instances
have been reported [12].
Parameter control is very important in genetic algorithm research. Parameter control methods
include optimal parameter tuning, deterministic parameter control, adaptive parameter
control, and self-adaptive parameter control [3]. Among them, self-adaptive parameter
control, which lets the GA itself tune the parameter during the optimization procedure, is
the most flexible. Peak radii are the key parameters in fitness sharing genetic algorithms.
In this paper, we borrow the idea of simulated annealing and combine it with self-adaptive
parameter control. Every individual has its own peak radius, which is coded into its
chromosome as genes. The peak radii evolve during the problem optimization process, so that
the optimization procedure is also a peak radii annealing procedure. In this way, peak radii
are no longer predetermined parameters of the fitness sharing algorithm, and they may differ
across the population, which expands the applicable range of the algorithm.
In the next section, the fitness sharing genetic algorithm with the self-adaptive annealing
peaks radii control method is described in detail. Section 3 investigates the suggested
algorithm on several benchmark multimodal problems. Conclusions and discussions are given
in Section 4.

2 Self-adaptive Annealing Peaks Radii Control Method


In a binary simple genetic algorithm, each decimal variable is transferred to binary
variables, which are concatenated to form a chromosome. In the proposed algorithm, the peak
radius of each individual is viewed as a variable and needs to be coded and combined with
the other variables. The code length of the peak radius is $l_\sigma$. The initialization of
individuals is the same as in the standard fitness sharing genetic algorithm, and the peak
radii genes are placed in the last part of the chromosome. If all $l_\sigma$ genes of the
peak radius are zero after initialization, special treatment is needed to ensure that the
fitness sharing procedure can be performed: a peak radius gene is picked at random and set
to 1, which ensures that the peak radius is not zero. A minimal encoding sketch is given
below; the remaining steps of the algorithm then follow.
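To make the encoding concrete, the sketch below lays out a chromosome as the decision-variable
bits followed by $l_\sigma$ radius bits and applies the zero-radius repair. The mapping from
the radius bits to an actual radius value is not specified in the paper, so the linear decoding
used here (and all names, such as `init_population` and `decode_radius`) is an assumption for
illustration only.

```python
import numpy as np

L_VAR = 30      # decision-variable bits (chromosome length 30+4 in most test problems)
L_SIGMA = 4     # peak-radius bits, l_sigma = 4 as used in the paper

def init_population(pop_size, rng=np.random.default_rng()):
    """Random binary chromosomes; the last L_SIGMA genes encode the peak radius."""
    pop = rng.integers(0, 2, size=(pop_size, L_VAR + L_SIGMA))
    for chrom in pop:
        radius_genes = chrom[L_VAR:]
        if not radius_genes.any():                    # all radius genes zero:
            radius_genes[rng.integers(L_SIGMA)] = 1   # set one random radius gene to 1
    return pop

def decode_radius(chrom, sigma_max=1.0):
    """Map the L_SIGMA radius genes to a peak radius (the linear mapping is our assumption)."""
    value = int("".join(map(str, chrom[L_VAR:])), 2)
    return sigma_max * value / (2 ** L_SIGMA - 1)
```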
Step one. Calculate the shared fitness of every individual. Suppose $d_{ij}$ is the distance
between individual $i$ and individual $j$; then the sharing function $sh(d_{ij})$ is
calculated using Eq. (1).

$$
sh(d_{ij}) =
\begin{cases}
1 - \left( \dfrac{d_{ij}}{\sigma_i} \right)^{\alpha}, & d_{ij} < \sigma_i \\
0, & \text{otherwise}
\end{cases}
\qquad (1)
$$

where $\sigma_i$ is the peak radius of individual $i$ and $\alpha$ is the parameter that
controls the shape of the sharing function, commonly set to 1. After the sharing function
values are calculated, the niche count $m_i$ of every individual is calculated using Eq. (2).

$$
m_i = \sum_{j=1}^{N} sh(d_{ij}), \qquad i = 1, 2, \ldots, N
\qquad (2)
$$

where $N$ is the population size. Obviously, if an individual has a larger niche count, there
are more individuals around it. The shared fitness of every individual is then calculated
using Eq. (3).

$$
f_i' = \frac{f_i}{m_i}, \qquad i = 1, 2, \ldots, N
\qquad (3)
$$

where $f_i'$ is the shared fitness of individual $i$ and $f_i$ is its raw fitness. The
subsequent selection stage uses the shared fitness; a minimal computational sketch of this
step is given below.
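The following sketch implements Step one under Eqs. (1)-(3), assuming real-valued positions
and the Euclidean distance; each individual applies its own radius $\sigma_i$, as in the
proposed method. The function and argument names are ours, not the paper's.

```python
import numpy as np

def shared_fitness(raw_fitness, positions, radii, alpha=1.0):
    """Shared fitness per Eqs. (1)-(3): f'_i = f_i / m_i, where
    m_i = sum_j sh(d_ij) and sh(d_ij) = 1 - (d_ij / sigma_i)**alpha if d_ij < sigma_i, else 0."""
    n = len(raw_fitness)
    shared = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(positions - positions[i], axis=1)             # Euclidean distances d_ij
        sh = np.where(d < radii[i], 1.0 - (d / radii[i]) ** alpha, 0.0)  # sharing function, Eq. (1)
        m_i = sh.sum()                                                   # niche count, Eq. (2); >= 1 since sh(d_ii) = 1
        shared[i] = raw_fitness[i] / m_i                                 # shared fitness, Eq. (3)
    return shared
```

For the binary problem VI the Euclidean distance would be replaced by the Hamming distance,
as listed in Table 1.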
Step two. Use a proper selection method to select $N$ individuals from the current
population.
Step three. Perform crossover and mutation to generate new individuals until the new
population size reaches $N$. The mating restriction strategy is adopted [2]. If an
individual's peak radius becomes 0 after crossover and mutation, a random gene of the peak
radius is selected and set to 1.
Step four. Decide whether to stop the algorithm. If the stop criteria are not fulfilled, go
back to step one; otherwise, select the group with the maximum fitness as the peak set, or
use a cluster analysis method to search for niche centers in the current population to find
the peak set. A schematic of one generation of the whole procedure is sketched below.
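Putting the four steps together, one generation of the algorithm might look like the
schematic below. It reuses the sketches above; `decode_variables`, `objective`, `sus_select`,
and `crossover_and_mutate` are hypothetical helpers standing in for problem decoding, raw
fitness evaluation, SUS selection [1], and the variation operators with mating restriction
[2], none of which are spelled out here.

```python
import numpy as np

def run_generation(pop, rng):
    """One generation of the proposed algorithm (schematic only; several helpers are hypothetical)."""
    radii = np.array([decode_radius(c) for c in pop])           # self-adaptive peak radii (see encoding sketch)
    positions = np.array([decode_variables(c) for c in pop])    # decode decision variables (hypothetical helper)
    raw = np.array([objective(x) for x in positions])           # raw fitness (hypothetical helper)
    shared = shared_fitness(raw, positions, radii)              # Step one: Eqs. (1)-(3)
    parents = sus_select(pop, shared, len(pop), rng)            # Step two: SUS selection [1] (hypothetical helper)
    children = crossover_and_mutate(parents, rng)               # Step three: crossover, mutation, mating restriction [2]
    for chrom in children:                                      # repair any all-zero radius genes, as in Step three
        if not chrom[L_VAR:].any():
            chrom[L_VAR + rng.integers(L_SIGMA)] = 1
    return children                                             # Step four's stop test wraps around this call
```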
There is no predefined parameter, such as peak radii or the number of peaks, in the above
method. The peak radius of each individual may evolve during the optimization. At the early
stage of the algorithm, all kinds of peak radii may exist. Because the peak radius affects
the shared fitness of an individual and hence its selection probability, individuals with
relatively large fitness and relatively small peak radius have a selection advantage. Peak
radii are initialized at random, so most individuals have intermediate peak radii in the
beginning. At this stage the algorithm is quite similar to the standard fitness sharing
genetic algorithm; the difference is that in the standard algorithm the peak radius is a
global parameter predefined by the user. During this period the algorithm can fully explore
the solution space and find the convergence areas of the peaks. As the evolution goes on,
better individuals are found by the GA and the peak radii undergo a simulated-annealing-like
procedure: they decrease self-adaptively. At the late stage of the algorithm, under the
effect of selective pressure and the genetic operators, individuals with large fitness values
and small peak radii flourish. The algorithm is then quite similar to the crowding genetic
algorithm [7]; the difference is that in the crowding genetic algorithm the offspring replace
the parents according to distance and fitness, whereas in the suggested algorithm the
replacement takes place directly. During this period the algorithm can maintain population
diversity and exploit the convergence areas of the peaks to locate them.

3 Tests on the Novel Algorithm


In order to test the searching ability of the fitness sharing genetic algorithm with the
self-adaptive annealing peaks radii control method, six benchmark multimodal problems are
optimized [9].
3.1 The Description of the Benchmark Problems
Problem I can be expressed as follows:

$$
F_1(x) = \sin^6(5\pi x)
\qquad (4)
$$

The domain of the problem is [0, 1]. There are 5 evenly distributed peaks. The peaks
are at 0.100, 0.300, 0.500, 0.700, and 0.900 respectively. The heights of the peaks are
all 1.0.
Problem II can be expressed as follows:

$$
F_2(x) = \exp\left[ -2(\ln 2) \left( \frac{x - 0.1}{0.8} \right)^2 \right] \sin^6(5\pi x)
\qquad (5)
$$

The domain of the problem is [0, 1]. There are 5 evenly distributed peaks. The peaks
are at 0.100, 0.300, 0.500, 0.700, and 0.900 respectively. The heights of the peaks are
1.000, 0.917, 0.707, 0.459, and 0.250 respectively. The ratio of the lowest peak height
to the highest peak height is 0.25.
Problem III can be expressed as follows:

$$
F_3(x) = \sin^6\!\left[ 5\pi \left( x^{0.75} - 0.05 \right) \right]
\qquad (6)
$$


The domain of the problem is [0, 1]. There are 5 unevenly distributed peaks. The
peaks are at 0.080, 0.247, 0.451, 0.681 and 0.934 respectively. The heights of the peaks
are all 1.0.
Problem IV can be expressed as follows:

$$
F_4(x) = \exp\left[ -2(\ln 2) \left( \frac{x - 0.08}{0.854} \right)^2 \right] \sin^6\!\left[ 5\pi \left( x^{0.75} - 0.05 \right) \right]
\qquad (7)
$$

The domain of the problem is [0, 1]. There are 5 unevenly distributed peaks. The
peaks are at 0.080, 0.247, 0.451, 0.681 and 0.934 respectively. The heights of the peaks
are 1.000, 0.917, 0.707, 0.459, and 0.250 respectively. The ratio of the lowest peak
height to the highest peak height is 0.25.
Problem V can be expressed as follows:

$$
F_5(x, y) = 2500 - \left( x^2 + y - 11 \right)^2 - \left( x + y^2 - 7 \right)^2
\qquad (8)
$$

The domain of the problem is [-6,6]*[-6,6]. There are 4 unevenly distributed peaks.
The peaks are at (3.000, 2.000), (3.584, -1.848), (-3.779, -3.283), and (-2.805, 3.131).
The heights of the peaks are all 2500.0.
Problem VI can be expressed as follows:
$$
f(x_0, \ldots, x_{29}) = \sum_{i=0}^{4} u\!\left( \sum_{j=0}^{5} x_{6i+j} \right)
\qquad (9)
$$

where $x_k \in \{0, 1\}$, $k = 0, \ldots, 29$, and the definition of $u(s)$ is:

$$
u(s) =
\begin{cases}
1, & s \in \{0, 6\} \\
0, & s \in \{1, 5\} \\
0.360384, & s \in \{2, 4\} \\
0.640576, & s = 3
\end{cases}
\qquad (10)
$$

The problem is called the massively deceptive problem. It contains $10^6$ peaks with only 32
global peaks. The reasons for its massive deceptiveness are that (1) it has a huge number of
peaks and (2) the global peaks are surrounded by local peaks. This benchmark problem is often
used to test the performance of multimodal genetic algorithms. The global peaks are at
(000000, 000000, 000000, 000000, 000000), (000000, 000000, 000000, 000000, 111111), ...,
(111111, 111111, 111111, 111111, 111111). The heights of the global peaks are all 5.0.
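For reference, the six benchmark functions can be written compactly as below. The function
names are ours, and the $\pi$ factors in F1-F4 follow the standard forms of these benchmarks,
consistent with the peak locations stated above.

```python
import numpy as np

def F1(x):   # five evenly spaced peaks of height 1 at x = 0.1, 0.3, 0.5, 0.7, 0.9
    return np.sin(5 * np.pi * x) ** 6

def F2(x):   # five evenly spaced peaks of decreasing height (1.000 down to 0.250)
    return np.exp(-2 * np.log(2) * ((x - 0.1) / 0.8) ** 2) * np.sin(5 * np.pi * x) ** 6

def F3(x):   # five unevenly spaced peaks of height 1
    return np.sin(5 * np.pi * (x ** 0.75 - 0.05)) ** 6

def F4(x):   # five unevenly spaced peaks of decreasing height
    return np.exp(-2 * np.log(2) * ((x - 0.08) / 0.854) ** 2) * np.sin(5 * np.pi * (x ** 0.75 - 0.05)) ** 6

def F5(x, y):  # four unevenly distributed peaks of height 2500
    return 2500 - (x ** 2 + y - 11) ** 2 - (x + y ** 2 - 7) ** 2

def F6(bits):  # 30-bit massively deceptive problem: sum of u(s) over five 6-bit blocks
    u = {0: 1.0, 1: 0.0, 2: 0.360384, 3: 0.640576, 4: 0.360384, 5: 0.0, 6: 1.0}
    return sum(u[sum(bits[6 * i:6 * i + 6])] for i in range(5))
```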
3.2 The Parameters of the Algorithms
Table 1 lists the parameters for fitness sharing genetic algorithm with self-adaptive
annealing peaks radii control method in the test problems.


Table 1. The parameters of the algorithm in test problems


Problem                  I          II         III        IV         V           VI
Scale Type               No         No         No         No         Power Law   Power Law
Distance Metric          Euclidean  Euclidean  Euclidean  Euclidean  Euclidean   Hamming
Population Size          60         60         60         60         100         800
Maximum Generation       50         50         50         50         50          120
Chromosome Length        30+4       30+4       30+4       30+4       15+4        30+4
Convergence Criterion    h<0.02     h<0.02     h<0.02     h<0.02     h<0.832     At Peaks

where $h$ is the distance between an individual and its nearest real peak. SUS selection [1]
and single-point crossover are adopted. The crossover probability is 0.9 and the mutation
probability is 0.05. $l_\sigma$ is 4 in this paper, which proves adequate for most problems.
The mating restriction method is employed. The algorithms are implemented in the MATLAB
environment, and the problems are optimized on an IBM-compatible PC with an Intel Pentium 4
2.4 GHz CPU and 512 MB of memory. The performance criterion for problems I, III, V, and VI is
the number of global peaks maintained by the algorithm at the end of the optimization; for
problems II and IV it is the number of peaks maintained at the end of the optimization. Each
problem is optimized 20 times, and the average performance is reported.
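One way to read this criterion is to count a peak as maintained when at least one individual
of the final population lies within distance $h$ of it. The helper below is our own
illustration of that count, not code from the paper.

```python
import numpy as np

def peaks_maintained(final_positions, true_peaks, h):
    """Count how many known peaks have at least one individual within distance h."""
    count = 0
    for peak in true_peaks:
        d = np.linalg.norm(final_positions - peak, axis=1)   # distances of all individuals to this peak
        if (d < h).any():
            count += 1
    return count

# Hypothetical usage for problem I (peaks at 0.1, 0.3, ..., 0.9; criterion h < 0.02):
# true_peaks = np.array([[0.1], [0.3], [0.5], [0.7], [0.9]])
# n_found = peaks_maintained(final_x.reshape(-1, 1), true_peaks, 0.02)
```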
3.3 The Results
The optimization results of fitness sharing genetic algorithm with self-adaptive annealing peaks radii control method are listed in Table 2.
Table 2. The optimization results of the algorithm
Problem        I       II      III     IV      V       VI
Peaks found    4.86    4.88    4.78    4.80    4.00    32.00

Figure 1 shows the number of global peaks found by the suggested algorithm during
the optimization procedure on the massively deceptive problem.
Problems I to IV represent problems with evenly or unevenly distributed peaks, whose heights
may or may not be equal. The fitness sharing genetic algorithm with the self-adaptive
annealing peaks radii control method can adjust the peak radii dynamically and thus finds
almost all peaks.
Problem V represents a problem with flat peaks, which needs the power law scaling method to
adjust the shared fitness [14]. The algorithm suggested in this paper can find and maintain
all of its peaks steadily.


[Fig. 1. Optimization result of problem VI: number of global peaks found versus generations.]

Problem VI is the massively deceptive problem. A simple genetic algorithm either finds only
local peaks or converges to a single global peak. The fitness sharing genetic algorithm with
the self-adaptive annealing peaks radii control method can find and maintain all the global
peaks steadily with the help of power law scaling.
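Power law scaling is detailed in [14]; in its simplest form it raises the raw fitness to a
power before sharing, which stretches the differences between flat or deceptive peaks. The
fixed exponent in the sketch below is a placeholder, since [14] adjusts it during the run.

```python
import numpy as np

def power_law_scale(raw_fitness, k=2.0):
    """Power law scaling: f_scaled = f ** k, applied before fitness sharing.
    The exponent k = 2.0 is a placeholder; reference [14] adjusts it during the run."""
    return np.asarray(raw_fitness, dtype=float) ** k
```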

4 Conclusions and Discussions


A new self-adaptive annealing peaks radii control method is suggested in this paper. Combined
with the standard fitness sharing genetic algorithm, the method can find and maintain all
peaks in the search space without prior information about the peak radii or the number of
peaks. Peak radii are coded into the chromosomes of the individuals and undergo a
simulated-annealing-like procedure during the optimization. The early stage of the algorithm
is similar to the standard fitness sharing genetic algorithm, and the late stage is similar
to the crowding genetic algorithm.
The optimization results on several benchmark multimodal problems show that the fitness
sharing genetic algorithm with the self-adaptive annealing peaks radii control method is
suitable for a wide range of multimodal problems. The suggested algorithm can be used to
search for either all peaks or all global peaks; if only the global peaks are needed, an
additional scaling method may be adopted to eliminate the local peaks.
The proposed algorithm is similar to the crowding algorithm in its late stage, so the
population diversity can be preserved, but the convergence of all individuals to the peaks
cannot be guaranteed. If the distribution of individuals according to the peak heights is
required, other local search or cluster analysis methods may be adopted to promote
convergence.
Further research may address the relationship between the population size, the coding length
of the peak radii, and the optimization results of the algorithm; more extensive tests of the
algorithm; and applications of the algorithm to real-world problems.


References
1. Baker, J.E.: Reducing Bias and Inefficiency in the Selection Algorithm. In: Grefenstette, J.J. (eds.): Proceedings of the Second International Conference on Genetic Algorithms and Their Applications. Hillsdale, NJ: Lawrence Erlbaum (1987) 14–21
2. Deb, K. and Goldberg, D.E.: An Investigation of Niche and Species Formation in Genetic Function Optimization. In: Schaffer, J.D. (eds.): Proceedings of the Third International Conference on Genetic Algorithms and their Applications. San Mateo, CA: Morgan Kaufmann (1989) 42–50
3. Eiben, A.E., Hinterding, R. and Michalewicz, Z.: Parameter Control in Evolutionary Algorithms. IEEE Transactions on Evolutionary Computation. Vol. 3, no. 2, (1999) 124–141
4. Goldberg, D.E.: Genetic Algorithms in Search, Optimization, and Machine Learning. New York: Addison-Wesley (1989)
5. Goldberg, D.E. and Richardson, J.: Genetic Algorithms with Sharing for Multimodal Function Optimization. In: Grefenstette, J.J. (eds.): Proceedings of the Second International Conference on Genetic Algorithms and Their Applications. Hillsdale, NJ: Lawrence Erlbaum (1987) 41–49
6. Goldberg, D.E. and Wang, L.: Adaptive Niching via Coevolutionary Sharing. IlliGAL Report No. 97007 (1997)
7. Mahfoud, S.W.: Crossover Interactions among Niches. In: Proceedings of the First IEEE Conference on Evolutionary Computation. Piscataway, NJ: IEEE Press (1994) 188–193
8. Mahfoud, S.W.: Genetic Drift in Sharing Methods. In: Proceedings of the First IEEE Conference on Evolutionary Computation. Piscataway, NJ: IEEE Press (1994) 67–72
9. Mahfoud, S.W.: Niching Methods for Genetic Algorithms. Ph.D. Dissertation, University of Illinois, Urbana-Champaign (1995)
10. Miller, B.L. and Shaw, M.J.: Genetic Algorithms with Dynamic Niche Sharing for Multimodal Function Optimization. In: Proceedings of the Third IEEE Conference on Evolutionary Computation. Piscataway, NJ: IEEE Press (1996) 786–791
11. Petrowski, A.: A Clearing Procedure as a Niching Method for Genetic Algorithms. In: Proceedings of the Third IEEE Conference on Evolutionary Computation. Piscataway, NJ: IEEE Press (1996) 798–803
12. Sareni, B. and Krahenbuhl, L.: Fitness Sharing and Niching Methods Revisited. IEEE Transactions on Evolutionary Computation. Vol. 2, no. 3, (1998) 97–106
13. Yin, X. and Germay, N.: A Fast Genetic Algorithm with Sharing Scheme Using Cluster Analysis Methods in Multimodal Function Optimization. In: Albrecht, R.F. (eds.): Proceedings of the International Conference on Artificial Neural Nets and Genetic Algorithms. New York: Springer-Verlag (1993) 450–457
14. Yu, X. and Wang, Z.: The Fitness Sharing Genetic Algorithms with Adaptive Power Law Scaling. System Engineering Theory and Practice. Vol. 22, no. 2 (2002) 4228
