
International Congress Series 1269 (2004) 149 – 152

www.ics-elsevier.com

A solution to combinatorial optimization with time-varying parameters by a hybrid genetic algorithm
Hong Zhang *, Masumi Ishikawa
Department of Brain Science and Engineering, Graduate School of Life Science and Systems Engineering,
Kyushu Institute of Technology, 2-4 Hibikino, Wakamatsu, Kitakyushu 808-0196, Japan

Abstract. This paper proposes a hybrid genetic algorithm to improve the performance of the
standard genetic algorithm for solving combinatorial optimization problems with time-varying
parameters. The proposal includes two key points: elitism and a non-redundant search. The former
copies superior individuals to the next generation to improve convergence. The latter
improves the efficiency of the search for the best individual. By combining these
advantages through a mixing function, the proposed algorithm, which uses only simple operators, is well suited to
quickly acquiring a sequence of optimal solutions that tracks the changing environment.
Applications of the proposal to combinatorial optimization problems with time-varying parameters
demonstrate its effectiveness. © 2004 Elsevier B.V. All rights reserved.

Keywords: Combinatorial optimization; Genetic algorithm; Elitism; Non-redundant search

1. Introduction
Genetic algorithms (GAs), a form of evolutionary computation proposed by Holland [2], have
been applied to real-world problems in many disciplines. Recent studies have shown their
usefulness, in contrast to conventional search and optimization methods, for solving
problems with complex and ill-posed criteria. In spite of the simple operators used, the standard
genetic algorithm (SGA) [1] is good at solving difficult optimization problems [3].
This paper proposes to solve combinatorial optimization problems with time-varying
parameters by a hybrid genetic algorithm. Here, a combinatorial optimization problem
with time-varying parameters means searching for a sequence of optimal solutions under criteria
and constraints whose parameters vary over time, representing the changing environment.
An important issue is how to quickly find the optimal solution and adaptively change
it in accordance with the changing environment. Although the SGA has the advantages of
adaptability and robustness, it still has disadvantages such as no guarantee of convergence

* Corresponding author. Tel./fax: +81-93-695-6112.


E-mail address: zhang@brain.kyutech.ac.jp (H. Zhang).

0531-5131/© 2004 Elsevier B.V. All rights reserved.


doi:10.1016/j.ics.2004.05.019

and computational instability. These greatly affect the results of combinatorial optimization with time-varying parameters.
To improve the performance of the SGA, mechanisms such as restricted mating, which
allows mating only between neighbouring individuals, and heuristic algorithms have been
incorporated into the SGA [4,5]. Although these approaches are effective, they are not
suited to combinatorial optimization problems with time-varying parameters because of their heavy
computational cost.
To overcome these difficulties, we propose a hybrid genetic algorithm (HGA). Its basic
ideas are elitism, for improving convergence, and a non-redundant search, for efficient
search. Here, non-redundant means excluding individuals with identical chromosomes from
the current population. By adding randomly generated new individuals in place of the
redundant ones deleted from the population, the non-redundant search improves the
efficiency of the search for the best individual.
The combination of these advantages through a mixing function, which adjusts the
proportion of the best individuals to the non-redundant individuals, helps to find the globally
optimal solution. Since the operators involved, such as selection, comparison and reordering, are
simple, the HGA is well suited to acquiring a sequence of optimal solutions that tracks
the changing environment.

2. Combinatorial optimization problem with time-varying parameters


Consider the following combinatorial optimization problem with time-varying
parameters:

$$\max_{\{x_i\}} \; f_k(x_i) = \sum_{i=1}^{N} v_i^{(k)} x_i, \quad \text{subj. to } \; h_k(x_i) = \sum_{i=1}^{N} w_i^{(k)} x_i - w_b \le 0, \quad 1 \le k \le K, \quad x_i \in \{0, 1\}$$

where f_k(·) is the criterion function and h_k(·) is the constraint condition at time k.
By incorporating the constraint, the following proxy criterion function is used to
evaluate an individual x:

$$g_k(x_i) = f_k(x_i) - \alpha \, \max[0,\, h_k(x_i)]$$

where α is a penalty parameter. Our greatest concern here is how to efficiently acquire
the optimal solution, x^o, at each time k.
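To make the formulation concrete, the following Python sketch evaluates the proxy criterion for a 0/1 solution vector. The function name and the default α = 100 (taken from the settings in Section 4) are our own illustration, not code from the paper.

```python
def proxy_criterion(x, v, w, w_b, alpha=100.0):
    """g_k(x) = f_k(x) - alpha * max(0, h_k(x)) for a 0/1 vector x."""
    f = sum(vi * xi for vi, xi in zip(v, x))        # f_k(x) = sum_i v_i^(k) x_i
    h = sum(wi * xi for wi, xi in zip(w, x)) - w_b  # h_k(x) = sum_i w_i^(k) x_i - w_b
    return f - alpha * max(0, h)

# Example with the k = 1 row of Table 1 and the solution 1110000111,
# which Section 4 reports as optimal for k = 1:
v1 = [12, 3, 10, 6, 2, 12, 4, 7, 5, 11]
w1 = [5, 3, 6, 7, 3, 8, 7, 2, 3, 5]
x  = [1, 1, 1, 0, 0, 0, 0, 1, 1, 1]
print(proxy_criterion(x, v1, w1, w_b=25))   # 48.0 (weight 24 <= 25, so no penalty)
```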

3. Hybrid genetic algorithm


Fig. 1 shows the flow chart of the HGA, which improves convergence and enhances
search capability. The HGA uses the following operators in addition to those in the SGA.

Operator 1 Adjustment of the order of the individuals in the current population P_t based
on their fitness.
Operator 2 Deletion of redundant individuals from the population P_t' generated
by the SGA, and addition of new individuals selected randomly. Let the
resulting population be P_t''.

Fig. 1. Flow chart of the proposed hybrid genetic algorithm.

Operator 3 Determination of a mixing function, m(t). Merge the best m(t)% of the individuals
from the population P_t and the remaining (1 - m(t))% from the population P_t''
to generate the next generation P_{t+1}:

$$m(t) = \frac{m_T - m_1}{T - 1}\,(t - 1) + m_1 \qquad (1)$$

where the parameters m_1 and m_T in Eq. (1) satisfy 1 ≥ m_1 ≥ m_T ≥ 0, and T is
the number of generations at each time k. (An illustrative code sketch of Operators 1-3 is given below.)
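The following is a minimal sketch of Operators 1-3, assuming a bit-string representation with chromosomes stored as tuples and treating m(t) as a fraction rather than a percentage. The function names are ours, and the SGA step that produces P_t' from P_t is assumed to exist elsewhere; this is an illustration, not the authors' code.

```python
import random

def mixing_ratio(t, T, m1, mT):
    """Eq. (1): linear schedule from m1 at t = 1 to mT at t = T, with 1 >= m1 >= mT >= 0."""
    return (mT - m1) / (T - 1) * (t - 1) + m1

def next_generation(P_t, P_t_prime, fitness, n_genes, t, T, m1, mT):
    """One HGA generation step combining Operators 1-3 (illustrative sketch)."""
    # Operator 1: reorder the current population P_t by fitness, best first.
    P_t = sorted(P_t, key=fitness, reverse=True)

    # Operator 2: delete duplicate chromosomes from P_t' (the SGA offspring)
    # and refill with randomly generated, not-yet-present individuals -> P_t''.
    seen = dict.fromkeys(P_t_prime)              # keeps order, drops duplicates
    while len(seen) < len(P_t_prime):
        cand = tuple(random.randint(0, 1) for _ in range(n_genes))
        seen.setdefault(cand, None)
    P_t_dd = sorted(seen, key=fitness, reverse=True)

    # Operator 3: take the best m(t) share of P_t and fill the rest from P_t''.
    n_elite = round(mixing_ratio(t, T, m1, mT) * len(P_t))
    return P_t[:n_elite] + P_t_dd[:len(P_t) - n_elite]
```

For instance, with m_1 = 1 and m_T = 0 the schedule moves from pure elitism at t = 1 to drawing the entire next generation from the non-redundant pool P_t'' at t = T.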

4. Computer experiments
Table 1 shows the parameters of a combinatorial optimization problem with time-varying
parameters. The experimental settings are as follows: population size, 100; number
of generations, T = 10; selection, roulette-wheel selection; crossover, one-point crossover
(p_c = 0.75); mutation, bit-wise mutation (p_m = 0.1); and penalty parameter, α = 100.
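For reference, these settings could be wired up as follows; the selection, crossover and mutation helpers are simple textbook implementations added for illustration, not code taken from the paper.

```python
import random

POP_SIZE, T, P_C, P_M, ALPHA = 100, 10, 0.75, 0.1, 100.0   # settings from Section 4

def roulette_select(population, fitness):
    """Roulette-wheel selection with probability proportional to fitness."""
    weights = [max(fitness(ind), 0.0) + 1e-9 for ind in population]
    return random.choices(population, weights=weights, k=1)[0]

def one_point_crossover(a, b, p_c=P_C):
    """Apply one-point crossover to two bit-string tuples with probability p_c."""
    if random.random() < p_c:
        cut = random.randint(1, len(a) - 1)
        return a[:cut] + b[cut:], b[:cut] + a[cut:]
    return a, b

def bitwise_mutation(x, p_m=P_M):
    """Flip each gene independently with probability p_m."""
    return tuple(1 - g if random.random() < p_m else g for g in x)
```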
Fig. 2(a) shows the frequency distribution of the best individual (optimal solution)
when the parameters of the mixing function, m_1 and m_T, are constant. We compare the
Table 1
An example of a combinatorial optimization problem with time-varying parameters
k   parameter   i = 1    2    3    4    5    6    7    8    9   10     b
1   v_i(1)         12    3   10    6    2   12    4    7    5   11
    w_i(1)          5    3    6    7    3    8    7    2    3    5    25
2   v_i(2)          6    2   12    3   10   12    7    4    5   11
    w_i(2)          7    3    5    3    6    8    2    7    3    5    25
3   v_i(3)          6    2   14    3   10   12    7    4    5   11
    w_i(3)          8    2    7    3    5    7    3    5    3    6    18
4   v_i(4)          6    2   12    3   10   12    7    4    5   11
    w_i(4)          8    2    7    3    5    7    3    5    3    6    25
5   v_i(5)          5   14    8   13    2   12    4    7    5   11
    w_i(5)          6    9   10    7    5    3    2    6   11    8    25
6   v_i(6)         12    3   10    6    2   12    4    7    5   11
    w_i(6)          5    3    6    7    3    8    7    2    3    5    25
v_i(k) and w_i(k) are the parameters of the criterion functions and constraints, respectively; b is the constraint bound w_b at time k.

Fig. 2. (a) Frequency distribution of the best individual when m(t) is constant. (b) Fitness values corresponding to
each generation. (c) The best individual corresponding to each generation.

performance of the HGA and the SGA for k = 1. The success rate of the HGA is 96.5%,
compared with 65.5% for the SGA, and the HGA's search is 1.3 times faster than the
SGA's. Similar results are obtained for k ≠ 1. It is clear that the non-redundant
search not only prevents evolutionary stagnation but also finds the globally
optimal solution quickly. The ratio of redundant individuals is 15-20%.
Fig. 2(b) and (c) show a successful search process of the HGA for the combinatorial
optimization problem with time-varying parameters in Table 1. The sequence of optimal
solutions is (1110000111, 0011101011, 0010101010, 0010111010, 0101010100,
1110000111). A change in the environment can be detected by a sudden decrease in the
mean fitness. In that case, the individuals in the current population are replaced by a new
population to prevent evolutionary stagnation.
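This change-detection rule can be sketched as follows; the concrete threshold (a drop to below half of the previous mean fitness, with positive fitness values assumed) is our own assumption, since the paper does not give a numerical criterion.

```python
import random

def environment_changed(mean_fitness_history, drop_ratio=0.5):
    """Flag an environment change when the mean fitness drops sharply between generations."""
    if len(mean_fitness_history) < 2:
        return False
    prev, curr = mean_fitness_history[-2], mean_fitness_history[-1]
    return curr < drop_ratio * prev          # drop_ratio is an assumed threshold

def reinitialize(pop_size, n_genes):
    """Replace the current population with fresh random bit-string individuals."""
    return [tuple(random.randint(0, 1) for _ in range(n_genes))
            for _ in range(pop_size)]
```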

5. Conclusions
We have proposed a hybrid genetic algorithm with elitism and a non-redundant
search. Applications of the proposed HGA to combinatorial optimization problems with
time-varying parameters demonstrate its effectiveness. The results indicate that the HGA
outperforms the SGA, with a search speed 1.3 times faster, and that the success rate rises
from 65.5% to 96.5% owing to the increased diversity of the population. Because the
problem used in the computer experiments is simple, applying the proposed method to
real-world problems is left for future study.

References
[1] D.E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, Boston, USA, 1989.
[2] J.H. Holland, Adaptation in Natural and Artificial Systems, reprinted, MIT Press, Cambridge, USA, 1992.
[3] K.F. Man, K.S. Tang, S. Kwong, Genetic Algorithm, Springer-Verlag, London, UK, 1999.
[4] D. Thierens, Scalability problems of simple genetic algorithms, Evol. Comput. 7 (4) (1999) 331-352.
[5] M. Ito, M. Sugisaka, A study of selection operator using correlation between individuals in genetic algorithm, IEEJ Trans. EIS 124 (1) (2004) 170-175 (in Japanese).
