
CHAPTER 3

HARMONY SEARCH ALGORITHMS

3.1 INTRODUCTION

Optimization deals with finding the best solution(s) for a given problem, with the goal of minimizing or maximizing the given fitness function. Optimization problems arise in fields such as engineering, planning, scheduling, medicine, and computer science.

Over the years, many population-based methods inspired by natural or biological evolutionary processes have been applied to solve real-world problems. In the last decade in particular, population-based methods have proven valuable for solving optimization problems because such methods operate on a population of solutions scattered over the entire search space. This feature helps population-based methods keep track of changes by assigning each solution in the population to a different area of the search space.

Population-based meta-heuristic methods, such as the genetic algorithm (GA), particle swarm optimization (PSO) and ant colony optimization (ACO), explore the entire search space using a set of solutions and return an equal number of solutions at the end of each iteration. Due to their general structure, their problem- and model-independent properties and their capability of quickly locating promising areas, there has been increasing interest in solving various optimization problems using population-based meta-heuristic methods in the past decades.

Harmony search (HS) is a meta-heuristic algorithm developed by Geem et al (2001) that is inspired by the natural musical performance process that occurs when a musician searches for a better state of harmony. In the HS algorithm, the solution vector is analogous to the harmony in music, and the local and global search schemes are analogous to musicians' improvisations. In comparison to other meta-heuristics in the literature, the HS algorithm imposes fewer mathematical requirements and can be easily adapted for solving various kinds of engineering optimization problems. Furthermore, numerical comparisons have demonstrated that evolution in the HS algorithm is faster than in genetic algorithms. The HS algorithm has therefore attracted much attention and has been successfully applied to a wide range of practical optimization problems. Geem et al (2002) applied the harmony search algorithm to pipe network design. Geem et al (2005) introduced the harmony search algorithm to the vehicle routing problem. Geem et al (2009) applied harmony search optimization to pump-included water distribution network design.

3.2 HARMONY SEARCH ALGORITHMS

3.2.1 Basic Harmony Search Algorithm (HSA)

The basic Harmony Search Algorithm (HSA) is a meta-heuristic optimization algorithm inspired by the playing of music. It uses rules and randomness to imitate natural phenomena. Inspired by the cooperation within an orchestra, the HSA achieves an optimal solution by finding the best harmony among the system components involved in a process. Just as discrete musical notes can be played based on a player's experience or on random processes in improvisation, the optimal design variables in a system can be obtained with certain discrete values based on computational intelligence and random processes. Musicians improve their experience based on aesthetic standards, whereas design variables can be improved based on an objective function.

In the HSA, each solution is called a harmony and is represented by an n-dimensional real vector. An initial population of harmony vectors is randomly generated and stored in a harmony memory (HM). A new candidate harmony is then generated from all of the solutions in the HM by using a memory consideration rule, a pitch adjustment rule and random re-initialization. Finally, the HM is updated by comparing the new candidate harmony with the worst harmony vector in the HM: the worst harmony vector is replaced by the new candidate vector if the candidate is better. The above process is repeated until a termination criterion is met. The HSA consists of three basic phases, namely initialization, improvisation of a harmony vector and updating of the HM, which are described below.

Step 1: Parameter initialization

Consider an optimization problem that is described by

$$\min_{x} f(x) \quad \text{subject to} \quad x_i \in X_i, \; i = 1, 2, \ldots, N \tag{3.1}$$

where f(x) is the objective function, x is the set of design variables and X_i is the set of possible values for each design variable x_i. The following HS algorithm parameters are also specified: the harmony memory size (HMS), i.e. the number of solution vectors in the harmony memory; the harmony memory considering rate (HMCR); the pitch adjusting rate (PAR); the number of decision variables (N); the number of improvisations (NI); and the stopping criterion.

Step 2: Harmony memory initialization

The harmony memory (HM) matrix shown in (3.2) is filled with HMS randomly generated solution vectors, sorted by the values of the objective function f(x):

$$\mathrm{HM} = \begin{bmatrix} x_1^1 & x_2^1 & \cdots & x_N^1 \\ x_1^2 & x_2^2 & \cdots & x_N^2 \\ \vdots & \vdots & & \vdots \\ x_1^{\mathrm{HMS}} & x_2^{\mathrm{HMS}} & \cdots & x_N^{\mathrm{HMS}} \end{bmatrix} \quad \begin{matrix} f(x^1) \\ f(x^2) \\ \vdots \\ f(x^{\mathrm{HMS}}) \end{matrix} \tag{3.2}$$
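A minimal Python sketch of this step, assuming NumPy and with function and variable names of our own choosing, could read:

```python
import numpy as np

def initialize_harmony_memory(objective, lower, upper, hms, rng):
    """Step 2: fill the HM with HMS random vectors, sorted by f(x)."""
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    # Each row of the HM is one harmony vector drawn uniformly from [lower, upper].
    hm = rng.uniform(lower, upper, size=(hms, lower.size))
    fitness = np.array([objective(x) for x in hm])
    order = np.argsort(fitness)  # best (smallest objective) harmony first
    return hm[order], fitness[order]
```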

Step 3: New harmony improvisation

A new harmony vector x' = (x'_1, x'_2, \ldots, x'_N) is generated based on three rules: memory consideration, pitch adjustment and random selection. Generating a new harmony is called improvisation. The HMCR, which varies between 0 and 1, is the rate of choosing one value from the historical values stored in the HM, while (1 − HMCR) is the rate of randomly selecting one value from the possible range of values, as shown in (3.3):

$$x'_i \leftarrow \begin{cases} x'_i \in \{x_i^1, x_i^2, \ldots, x_i^{\mathrm{HMS}}\} & \text{if } \mathit{rand}() < \mathrm{HMCR} \\ x'_i \in X_i & \text{otherwise} \end{cases} \tag{3.3}$$

where rand() is a uniformly distributed random number between 0 and 1 and X_i is the set of possible values for each decision variable. Every component obtained by memory consideration is then examined to determine whether its pitch should be adjusted. This operation uses the pitch adjusting rate (PAR) as a parameter, as shown in (3.4):

$$x'_i \leftarrow \begin{cases} x'_i + \mathit{rand}(-1,1) \times bw & \text{if } \mathit{rand}() < \mathrm{PAR} \\ x'_i & \text{otherwise} \end{cases} \tag{3.4}$$

where bw is an arbitrary distance bandwidth for the continuous design variable and rand(−1,1) is a uniformly distributed random number between −1 and 1.
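The improvisation rules (3.3) and (3.4) translate directly into code. The following sketch uses names of our own choosing; clipping the adjusted value to the variable bounds is a common safeguard, not part of equation (3.4) itself:

```python
import numpy as np

def improvise(hm, lower, upper, hmcr, par, bw, rng):
    """Step 3: build a new harmony component by component."""
    hms, n = hm.shape
    new = np.empty(n)
    for i in range(n):
        if rng.random() < hmcr:
            # Memory consideration (3.3): reuse component i of a random harmony.
            new[i] = hm[rng.integers(hms), i]
            if rng.random() < par:
                # Pitch adjustment (3.4): perturb by rand(-1, 1) * bw.
                new[i] = new[i] + rng.uniform(-1.0, 1.0) * bw
                new[i] = np.clip(new[i], lower[i], upper[i])
        else:
            # Random selection (3.3): draw from the allowed range X_i.
            new[i] = rng.uniform(lower[i], upper[i])
    return new
```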

Step 4: Update harmony memory

If the new harmony vector x' = (x'_1, x'_2, \ldots, x'_N) has a better fitness value than the worst harmony in the HM, the new harmony is included in the HM and the existing worst harmony is excluded from it.

Step 5: Checking the stopping criterion

If the stopping criterion, which is based on the maximum number of improvisations, is satisfied, the computation is terminated. Otherwise, steps 3 and 4 are repeated.
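Putting the five steps together, a minimal end-to-end sketch of the basic HSA, reusing the two helper sketches above (default parameter values taken from Table 3.3), might look as follows:

```python
import numpy as np

def harmony_search(objective, lower, upper, hms=5, hmcr=0.9,
                   par=0.3, bw=0.01, ni=5000, seed=0):
    """Basic HSA: Steps 1-5 for a minimization problem."""
    rng = np.random.default_rng(seed)          # Step 1: parameters set above
    hm, fit = initialize_harmony_memory(objective, lower, upper, hms, rng)
    for _ in range(ni):                        # Step 5: stop after NI improvisations
        new = improvise(hm, lower, upper, hmcr, par, bw, rng)  # Step 3
        f_new = objective(new)
        worst = int(np.argmax(fit))
        if f_new < fit[worst]:                 # Step 4: replace the worst harmony
            hm[worst], fit[worst] = new, f_new
    best = int(np.argmin(fit))
    return hm[best], fit[best]

# Example: minimize the 30-dimensional sphere function f1.
# x_best, f_best = harmony_search(lambda x: float(np.sum(x**2)),
#                                 [-100.0] * 30, [100.0] * 30)
```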

3.2.2 The Improved Harmony Search Algorithm (IHSA)

The Improved Harmony Search Algorithm (IHSA) was developed by Mahdavi et al (2007) and has been successfully applied to various benchmark tests and standard engineering optimization problems. Numerical results have shown that the improved algorithm can find better solutions than the basic HSA and other heuristic or deterministic methods. The key difference between the IHSA and the traditional HSA is the manner in which PAR and bw are adjusted. The IHSA uses variable PAR and bw values in the improvisation step to improve the performance of the HSA and eliminate the drawbacks associated with using fixed PAR and bw values. The PAR value changes dynamically with the generation number and is expressed as follows:

$$\mathrm{PAR}(gn) = \mathrm{PAR}_{\min} + \frac{\mathrm{PAR}_{\max} - \mathrm{PAR}_{\min}}{\mathrm{NI}} \times gn \tag{3.5}$$

where PAR(gn) is the pitch adjusting rate for generation gn, PAR_min is the minimum pitch adjusting rate, PAR_max is the maximum pitch adjusting rate, NI is the number of solution vector generations, and gn is the generation number. bw also changes dynamically with the generation number and is defined as follows:

$$bw(gn) = bw_{\max} \exp(c \cdot gn) \tag{3.6}$$

$$c = \frac{\ln\left(bw_{\min}/bw_{\max}\right)}{\mathrm{NI}} \tag{3.7}$$

where bw(gn) is the bandwidth at generation gn, and bw_min and bw_max are the minimum and maximum bandwidths respectively.
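As a sketch, the schedules (3.5)-(3.7) can be written as two small functions (names and default values are ours, the defaults following Table 3.3); the basic loop above becomes the IHSA by recomputing par and bw from these at every improvisation:

```python
import math

def ihsa_par(gn, ni, par_min=0.35, par_max=0.99):
    """Linearly increasing pitch adjusting rate, equation (3.5)."""
    return par_min + (par_max - par_min) / ni * gn

def ihsa_bw(gn, ni, bw_min=1e-6, bw_max=1.0):
    """Exponentially decreasing bandwidth, equations (3.6) and (3.7).
    Table 3.3 sets bw_max = (UB - LB) / 20 for each design variable."""
    c = math.log(bw_min / bw_max) / ni
    return bw_max * math.exp(c * gn)
```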

3.3 STUDY ON BENCHMARK FUNCTIONS

3.3.1 Description of Benchmark Functions

To test the performance of the proposed harmony search algorithms, an extensive experimental evaluation is provided based on the following set of 13 global optimization problems.

Sphere function (f1)

The sphere function is defined as

$$f_1(x) = \sum_{i=1}^{n} x_i^2 \tag{3.8}$$

where the global optimum x* = 0 and f(x*) = 0 for −100 ≤ x_i ≤ 100.

Schwefel's problem 2.22 (f2)

Schwefel's problem 2.22 is defined as

$$f_2(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i| \tag{3.9}$$

where the global optimum x* = 0 and f(x*) = 0 for −10 ≤ x_i ≤ 10.

Rosenbrock function (f3)

The Rosenbrock function is defined as

$$f_3(x) = \sum_{i=1}^{n-1} \left( 100\,(x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right) \tag{3.10}$$

where the global optimum x* = (1, 1, \ldots, 1) and f(x*) = 0 for −30 ≤ x_i ≤ 30.

Step function (f4)

The step function is defined as

$$f_4(x) = \sum_{i=1}^{n} \left( \lfloor x_i + 0.5 \rfloor \right)^2 \tag{3.11}$$

where the global optimum x* = 0 and f(x*) = 0 for −100 ≤ x_i ≤ 100.

Rotated hyper-ellipsoid function (f5)

The rotated hyper-ellipsoid function is defined as

$$f_5(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2 \tag{3.12}$$

where the global optimum x* = 0 and f(x*) = 0 for −100 ≤ x_i ≤ 100.

Schwefel's problem 2.26 (f6)

Schwefel's problem 2.26 is defined as

$$f_6(x) = 418.9829\,n - \sum_{i=1}^{n} x_i \sin\left( \sqrt{|x_i|} \right) \tag{3.13}$$

where the global optimum x* = (420.9687, 420.9687, \ldots, 420.9687) and f(x*) = 0 for −500 ≤ x_i ≤ 500.

Rastrigin function (f7)

The Rastrigin function is defined as

$$f_7(x) = \sum_{i=1}^{n} \left( x_i^2 - 10\cos(2\pi x_i) + 10 \right) \tag{3.14}$$

where the global optimum x* = 0 and f(x*) = 0 for −5.12 ≤ x_i ≤ 5.12.

Ackley's function (f8)

Ackley's function is defined as

$$f_8(x) = -20\exp\left( -0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2} \right) - \exp\left( \frac{1}{n}\sum_{i=1}^{n} \cos(2\pi x_i) \right) + 20 + e \tag{3.15}$$

where the global optimum x* = 0 and f(x*) = 0 for −32 ≤ x_i ≤ 32.

Griewank function (f9)

The Griewank function is defined as

$$f_9(x) = \frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1 \tag{3.16}$$

where the global optimum x* = 0 and f(x*) = 0 for −600 ≤ x_i ≤ 600.

Six-hump Camel-back function (f10)

The six-hump camel-back function is defined as

$$f_{10}(x) = 4x_1^2 - 2.1x_1^4 + \frac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4 \tag{3.17}$$

where the global optima are x* = (0.08983, −0.7126) and x* = (−0.08983, 0.7126), and f(x*) = −1.0316285 for −5 ≤ x_i ≤ 5.

Shifted Sphere function (f11)

The shifted sphere function is defined as

$$f_{11}(x) = \sum_{i=1}^{n} z_i^2 + f_{bias_1} \tag{3.18}$$

where z = x − o and o = (o_1, o_2, \ldots, o_n) is the shifted global optimum; x* = o and f(x*) = f_{bias_1} = −450 for −100 ≤ x_i ≤ 100.

Shifted Schwefel's problem 1.2 (f12)

Shifted Schwefel's problem 1.2 is defined as

$$f_{12}(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} z_j \right)^2 + f_{bias_2} \tag{3.19}$$

where z = x − o and o = (o_1, o_2, \ldots, o_n) is the shifted global optimum; x* = o and f(x*) = f_{bias_2} = −450 for −100 ≤ x_i ≤ 100.

Shifted Rosenbrock function (f13)

The shifted Rosenbrock function is defined as

$$f_{13}(x) = \sum_{i=1}^{n-1} \left( 100\,(z_{i+1} - z_i^2)^2 + (z_i - 1)^2 \right) + f_{bias_6} \tag{3.20}$$

where z = x − o + 1 and o = (o_1, o_2, \ldots, o_n) is the shifted global optimum; x* = o and f(x*) = f_{bias_6} = 390 for −100 ≤ x_i ≤ 100.

Among the above 13 benchmark problems, the sphere function, Schwefel's problem 2.22, the step function, the rotated hyper-ellipsoid function, the shifted sphere function and shifted Schwefel's problem 1.2 are unimodal. The step function is discontinuous. The Rosenbrock function, Schwefel's problem 2.26, the Rastrigin function, Ackley's function, the Griewank function and the shifted Rosenbrock function are difficult multimodal problems in which the number of local optima increases with the problem dimension. The six-hump camel-back function is a low-dimensional function with only a few local optima.
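For reference, a few of the benchmarks above translate directly into NumPy. This is a sketch covering f1, f3, f7 and f8 only; the remaining functions follow the same pattern:

```python
import numpy as np

def sphere(x):        # f1, equation (3.8)
    return np.sum(x**2)

def rosenbrock(x):    # f3, equation (3.10)
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (x[:-1] - 1.0)**2)

def rastrigin(x):     # f7, equation (3.14)
    return np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def ackley(x):        # f8, equation (3.15)
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)
```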

3.3.2 Comparison of Harmony Search Algorithms with GA and PSO

In order to have a fair comparison, the harmony search algorithms are compared with the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) in solving the described benchmark functions with different dimensions. For each function (except f10), optimization is conducted by GA, PSO, HSA and IHSA for three different dimension sizes: 30, 50 and 100. Thirty independent replications are carried out for each function, and the number of improvisations (NI) for each run is set to 50,000. Table 3.1 and Table 3.2 present the parameter settings of GA and PSO. Table 3.3 presents the parameter settings of the harmony search algorithms for solving the benchmark functions. The averages and standard deviations (SD) generated by the four algorithms (GA, PSO, HSA and IHSA) with different dimensions are reported in Tables 3.4, 3.5 and 3.6 respectively. It can be seen from these tables that the harmony search algorithms generate better results for the 13 functions than GA and PSO. The convergence diagram of the Rosenbrock function with dimension 30 for GA, PSO, HSA and IHSA is shown in Figure 3.1. From the figure it is clear that the harmony search algorithms converge more quickly towards the optimal point than GA and PSO.

Table 3.1 Parameter Setting for GA

Parameters              Values
Number of generations   100
Population Size         50
Crossover Rate          0.9
Mutation Rate           0.05

Table 3.2 Parameter Setting for PSO

Parameters        Values
Population size   50
Max iteration     100
wmin              0.4
wmax              0.9
C1 = C2           1.4
Velocity bounds   (-3, 7)

Table 3.3 Parameter settings of harmony search algorithms

Algorithm   HMS   HMCR   PAR               bw                     NI
HSA         5     0.9    0.3               0.01                   5000
IHSA        5     0.95   PARmin = 0.35,    bwmax = (UB-LB)/20,    5000
                         PARmax = 0.99     bwmin = 0.000001

where UB and LB are the upper and lower bounds of the design variable.

Table 3.4 Mean and standard deviation of the benchmark function optimization results (n=30)

Entries are mean (SD); f10 is excluded (see Section 3.3.2).

Function  Global optimum  GA                 PSO                HSA                         IHSA
f1        0               93.478 (194.57)    71.356 (58.781)    7.711433 (3.307032)         0.000000 (0.000000)
f2        0               85.879 (191.32)    63.757 (55.533)    0.112437 (0.059248)         0.000009 (0.000001)
f3        0               390.13 (705)       368 (569.21)       304.359111 (513.738959)     93.636178 (80.553169)
f4        0               98.267 (196.22)    76.145 (60.434)    12.500000 (4.960186)        0.033333 (0.182574)
f5        0               4656.5 (1816.3)    4634.4 (1680.5)    4570.725435 (1625.045376)   1841.741864 (711.620590)
f6        0               111.84 (200.2)     89.72 (64.420)     26.074848 (8.945656)        0.000382 (0.000000)
f7        0               86.467 (191.96)    64.345 (56.176)    0.699759 (0.701654)         0.230684 (0.442564)
f8        0               86.795 (191.66)    64.673 (55.877)    1.028156 (0.402630)         0.000002 (0.000000)
f9        0               86.849 (191.29)    64.727 (55.503)    1.082123 (0.028610)         0.002957 (0.004816)
f11       -450            -357.97 (193.42)   -380.1 (57.633)    -443.740735 (2.158973)      -450.000000 (0.000000)
f12       -450            4388.1 (1716.8)    4365.9 (1581)      4302.284478 (1525.507736)   2325.791439 (1170.418260)
f13       390             4895.21 (2841)     4550.12 (2684.21)  4399.666642 (2712.180)      1852.716453 (3372.3623)

[Convergence curves of HSA, IHSA, GA and PSO: objective function value (×10^4) versus number of iterations (0-1000).]

Figure 3.1 Convergence of the Rosenbrock function (n = 30) for GA, PSO, HSA and IHSA

3.3.3 Robustness Analysis

In order to evaluate the robustness of the IHSA, we further measure the convergence speed and the success rate of the HSA and the IHSA. For this purpose, we set NI to 50,000 and run each of the two algorithms independently 50 times in 30 dimensions. Each algorithm stops as soon as its best error value (BEV) falls below a predefined threshold value or the maximum number of improvisations is exceeded. A run that reaches the given threshold value is called a successful run (SR). The number of successful runs (among 50 runs) and the number of iterations each algorithm takes are recorded. Table 3.7 reports the number of successful runs (SRN) and the average iteration numbers (AIN) of the two algorithms when solving the 13 benchmark problems. From Table 3.7 it can easily be seen that the IHSA obtains good results with a higher SR number for most of the benchmark functions, even with a small average iteration number. That is to say, the IHSA has a stronger ability and faster convergence speed to find better solutions than the compared HSA. Hence, it can be concluded that the IHSA is effective and reliable for solving global optimization problems.
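The stopping rule used in this measurement is straightforward to express in code. The sketch below assumes a hypothetical run_once(ni, threshold, seed) wrapper that performs one optimization run and returns its best error value together with the iteration count at which it stopped:

```python
import numpy as np

def robustness(run_once, threshold, runs=50, ni=50000):
    """Count successful runs (SRN) and average their iteration numbers (AIN)."""
    success_iters = []
    for seed in range(runs):
        best_error, iters_used = run_once(ni=ni, threshold=threshold, seed=seed)
        if best_error < threshold:      # the run reached the BEV threshold
            success_iters.append(iters_used)
    srn = len(success_iters)
    ain = float(np.mean(success_iters)) if success_iters else float("nan")
    return srn, ain
```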

3.3.4 Effect of HMS, HMCR and PAR

In this section, the effects of the HMS, HMCR and PAR on the
performance of the IHSA are examined. The averages and SD obtained by
using different HMS, HMCR and PAR values for the number of dimensions n
= 30 are reported in Tables 3.6 to Table 3.8 respectively. It can be found
from Tables 3.6 that for most benchmark functions, a small value of HMS
(i.e., 5 or 10) is superior to a large value (i.e., 20 or 50). This can be explained
by the basic principle of the original HSA. Since HM is emulating musicians
short-term memory and the short-term memory of the human is known to be
small, it is logical to use a small HMS. From Table 3.7, it can be seen that the
performance of the IHSA degrades with the HMCR value decreasing for most
of the test functions. Although a small HMCR value can increase the diversity
of the harmony memory, a much slower convergence rate is obtained
accordingly. Hence, it is a good choice to use a large value for the HMCR
(i.e. HMCR > = 0.95). The effect of using different constant values of PAR
on the performance of IHSA Table 3.8 is investigated. According to
Table 3.8, there is no one setting of PAR is better than the other settings. It is
generally better to use a PAR value between 0.3 and 0.7.

Table 3.5 Mean and standard deviation of the benchmark function optimization results (n=50)

Entries are mean (SD); f10 is excluded (see Section 3.3.2).

Function  Global optimum  GA                      PSO                   HSA                      IHSA
f1        0               894.640 (640.77)        586.040 (445.95)      494.137 (76.155)         0.000 (0.002)
f2        0               409.850 (566.12)        101.250 (371.31)      9.345 (1.513)            0.066 (0.078)
f3        0               25544.00 (10485.00)     25235.00 (10290.00)   25143.520 (9920.545)     413.606 (377.962)
f4        0               902.730 (677.74)        594.140 (482.92)      502.233 (113.127)        6.967 (3.429)
f5        0               29324.00 (5500.20)      29016.00 (5305.30)    28923.658 (4935.547)     17093.321 (4002.401)
f6        0               1252.800 (707.43)       944.210 (512.61)      852.308 (142.818)        14.557 (7.911)
f7        0               440.410 (569.66)        131.810 (374.85)      39.911 (5.052)           7.019 (2.094)
f8        0               405.730 (565.01)        97.132 (370.19)       5.229 (0.399)            0.722 (0.408)
f9        0               406.290 (565.39)        97.687 (370.57)       5.784 (0.774)            0.126 (0.130)
f11       -450            505.260 (673.30)        196.660 (478.48)      104.755 (108.687)        -449.998 (0.010)
f12       -450            34621.00 (7048.10)      34312.00 (6853.30)    34220.173 (6483.534)     19456.199 (4095.018)
f13       390             2472150.27 (1344100.0)  2368300 (1343900.0)   2368161.272 (1343572.9)  4300.483 (4872.476)

Table 3.6 Mean and standard deviation of the benchmark function optimization results (n=100)
Entries are mean (SD); f10 is excluded (see Section 3.3.2).

Function  Global optimum  GA                          PSO                       HSA                           IHSA
f1        0               20003.000 (2219.600)        19181.000 (1987.400)      19166.359 (1969.024)          19239.760 (1954.350)
f2        0               912.940 (256.620)           90.057 (24.370)           75.854 (6.008)                75.607 (6.042)
f3        0               14455000 (2747700.00)       14454000 (2747400.00)     14454392.073 (2747404.362)    14582465.369 (3178620.957)
f4        0               20983.000 (2100.700)        20160.000 (1868.500)      20145.900 (1850.092)          19569.767 (2465.698)
f5        0               204240.00 (28128.000)       203420.00 (27895.000)     203400.956 (27877.029)        214176.334 (20067.399)
f6        0               8250.700 (878.600)          7427.800 (646.350)        7413.598 (627.991)            7109.127 (813.127)
f7        0               1150.500 (272.610)          327.610 (40.364)          313.402 (22.002)              303.182 (25.858)
f8        0               850.540 (250.940)           27.658 (18.693)           13.455 (0.331)                13.232 (0.323)
f9        0               1021.500 (271.220)          198.640 (38.975)          184.435 (20.613)              177.408 (17.744)
f11       -450            22909.000 (3431.500)        22087.000 (3199.300)      22072.331 (3180.894)          22190.071 (2848.582)
f12       -450            283900.00 (35068.000)       283070.00 (34836.000)     283059.962 (34817.210)        276530.616 (33757.759)
f13       390             2273700000 (402930000.000)  2273700000 (402930000)    2273697063.885 (402930694.81) 2179917319.634 (467726888.663)

Table 3.7 Number of successful runs and mean number of iterations (n=30)

Function  BEV    HSA SRN  HSA AIN   IHSA SRN  IHSA AIN
f1        10^-5  0        --        50        28866.56
f2        10^-5  0        --        0         --
f3        30     0        --        27        7624.16
f4        10^-5  0        --        50        12630.82
f5        10^-5  0        --        0         --
f6        10^-3  0        --        49        21443.9
f7        10^-5  0        --        11        7553.06
f8        10^-5  0        --        50        45877.82
f9        10^-5  0        --        29        15657.92
f10       10^-5  50       765.64    50        822.02
f11       10^-4  0        --        50        25299.84
f12       10     0        --        0         --
f13       10^2   0        --        14        4428.2
Table 3.8 The effect of HMS (n=30)

Function  HMS = 5                     HMS = 10                    HMS = 20                    HMS = 50
f1        0.000000 (0.000000)         0.000000 (0.000000)         0.000000 (0.000000)         0.000000 (0.000000)
f2        0.000009 (0.000001)         0.000009 (0.000001)         0.000009 (0.000001)         0.000009 (0.000001)
f3        93.636178 (80.553169)       42.435178 (70.513148)       85.51278 (82.513158)        75.4268 (62.44581)
f4        0.033333 (0.182574)         0.05347 (0.172465)          0.04447 (0.163565)          0.05477 (0.1545665)
f5        1841.741864 (711.620590)    1645.641764 (612.520570)    1754.551764 (512.420570)    1784.54545 (554.455670)
f6        0.000382 (0.000000)         0.000174 (0.000000)         0.000278 (0.000000)         0.000379 (0.000000)
f7        0.230684 (0.442564)         0.170841 (0.442564)         0.2270952 (0.543564)        0.288502 (0.5335461)
f8        0.000002 (0.000000)         0.000013 (0.000000)         0.000009 (0.000027)         0.0000012 (0.000022)
f9        0.002957 (0.004816)         0.005937 (0.005854)         0.003973 (0.006844)         0.003071 (0.005841)
f10       -1.031628 (0.000000)        -1.031628 (0.000000)        -1.031628 (0.000000)        -1.031628 (0.000000)
f11       -450.000000 (0.000000)      -448.000000 (0.000035)      -449.000000 (0.000021)      -450.000000 (0.000000)
f12       2325.791439 (1170.418260)   2125.871459 (1260.42750)    2355.681438 (1570.32840)    2335.78328 (1650.34850)
f13       1852.716453 (3372.362369)   1564.615452 (3572.351367)   1664.635552 (2572.551465)   1769.635552 (5662.541435)
Table 3.9 The effect of HMCR (n=30)

Function  HMCR = 0.95                 HMCR = 0.9                  HMCR = 0.7                          HMCR = 0.5
f1        0.000000 (0.000000)         0.000000 (0.000000)         3744.536350 (906.599078)            13556.322820 (2550.971246)
f2        0.000009 (0.000001)         0.000000 (0.000000)         11.993721 (2.974125)                40.682989 (3.830062)
f3        93.636178 (80.553169)       63.784702 (105.869110)      842763.629220 (290588.317816)       13687057.702685 (4174373.254273)
f4        0.033333 (0.182574)         0.000000 (0.000000)         3552.200000 (716.300861)            13718.133333 (1987.669919)
f5        1841.741864 (711.620590)    1798.206605 (696.303869)    17032.819945 (4825.347102)          30760.104344 (4567.842188)
f6        0.000382 (0.000000)         0.605431 (0.959375)         2974.934874 (412.832043)            4500.327163 (278.796555)
f7        0.230684 (0.442564)         0.000000 (0.000000)         63.902293 (15.660358)               181.501136 (14.605500)
f8        0.000002 (0.000000)         0.000000 (0.000000)         15.209930 (0.857457)                16.636931 (0.419122)
f9        0.002957 (0.004816)         0.100886 (0.061028)         35.950873 (7.402633)                121.013714 (22.226050)
f10       -1.031628 (0.000000)        -1.031628 (0.000000)        -1.031628 (0.000000)                -1.031628 (0.000000)
f11       -450.000000 (0.000000)      -489.952358 (0.059210)      6448.659140 (651.767740)            77002.054351 (4336.061807)
f12       2325.791439 (1170.418260)   1852.320437 (607.073944)    59570.780646 (4329.426082)          47614.941048 (6902.848968)
f13       1852.716453 (3372.362369)   2876.950353 (3079.642256)   551749175.716647 (67649026.287723)  3666744057.810880 (714559776.999992)
Table 3.10 The effect of PAR (n=30)

Function  PAR = 0.01                 PAR = 0.1                  PAR = 0.3                  PAR = 0.5                  PAR = 0.7                  PAR = 0.9
f1        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)
f2        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)
f3        282.653364 (489.187424)    107.440357 (106.216073)    32.833197 (15.307607)      30.660391 (13.342002)      25.328236 (0.313470)       29.568903 (0.233126)
f4        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)
f5        986.842377 (407.467545)    0.096724 (0.053309)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)
f6        0.000747 (0.000123)        0.000974 (0.000219)        0.026382 (0.006004)        0.027708 (0.006397)        0.009598 (0.004128)        0.0023337 (0.001134)
f7        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)
f8        0.000000 (0.000000)        0.000000 (0.000000)        0.000000 (0.000000)        0.135848 (0.493023)        0.073089 (0.415800)        0.000000 (0.000000)
f9        0.035402 (0.036172)        0.041795 (0.035617)        0.010084 (0.017694)        0.003705 (0.007580)        0.001881 (0.006472)        0.000000 (0.000000)
f10       -1.031628 (0.000000)       -1.031628 (0.000000)       -1.031628 (0.000000)       -1.031628 (0.000000)       -1.031728 (0.000000)       -1.031628 (0.000000)
f11       -459.999882 (0.000055)     -449.899808 (0.000066)     -449.782512 (0.001794)     -449.922743 (0.002824)     -449.08060 (0.000669)      -448.999535 (0.000298)
f12       2974.495515 (1454.665369)  -443.731060 (19.753425)    -449.435561 (0.716192)     -448.663221 (2.989611)     -447.142620 (5.355981)     178.454056 (369.538765)
f13       3481.013351 (3795.853754)  2846.632138 (3649.082203)  2103.750289 (2914.275497)  1726.793384 (2297.673258)  2208.996622 (2526.091488)  3565.344735 (4504.438601)

3.4 CONCLUSION

In this chapter, optimization algorithms, namely the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), the Harmony Search Algorithm (HSA) and the Improved Harmony Search Algorithm (IHSA), were presented to solve thirteen benchmark continuous optimization problems. A performance analysis of the harmony search algorithms against GA and PSO was presented, and the effects of HMS, HMCR and PAR on the optimization results were also discussed. Simulation results show that the harmony search algorithms were more effective in finding better solutions than GA and PSO.
