
Particle Swarm optimisation: A mini tutorial

Maurice Clerc (Maurice.Clerc@WriteMe.com)
2004-12-15

The inventors (1)


Russell Eberhart
eberhart@engr.iupui.edu


The inventors (2)

[Photo: Jim at work]

James Kennedy
Kennedy_Jim@bls.gov

Part 1: United we stand


Cooperation example


Memory and informers


"It is thanks to these eccentrics, whose behaviour does not conform to that of the other bees, that all the fruit sources around the colony are found so quickly." (Karl von Frisch, 1927)


Initialisation. Positions and velocities


Neighbourhoods

geographical

social

Psychosocial compromise
[Figure: the particle at x ("Here I am!") combines its velocity v, its own best performance p ("My best perf.") and the best performance g of its neighbours ("The best perf. of my neighbours"); the next position is drawn from the p-proximity and g-proximity regions.]

The historical algorithm


At each time step t
  for each particle
    for each component d
      update the velocity:
        v_d(t+1) = v_d(t) + rand(0,2)·(p_d − x_d(t)) + rand(0,2)·(g_d − x_d(t))
      then move:
        x(t+1) = x(t) + v(t+1)

(Randomness inside the loop: a new rand(0,2) is drawn for every component, at every time step.)
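As a concrete illustration (added here, not part of the original slides), a minimal C sketch of this historical update for one particle; the dimension D, the array layout and the rand0c helper are assumptions made for the example.

#include <stdlib.h>

#define D 2   // number of dimensions (assumption for the example)

// rand(0,c): uniform random number in [0,c]
static double rand0c(double c) { return c * ((double)rand() / RAND_MAX); }

// One historical update for a single particle.
// x, v: current position and velocity; p: its best position so far; g: best position of its informers.
void historical_update(double x[D], double v[D], const double p[D], const double g[D])
{
    for (int d = 0; d < D; d++) {
        v[d] += rand0c(2.0) * (p[d] - x[d]) + rand0c(2.0) * (g[d] - x[d]); // update the velocity
        x[d] += v[d];                                                      // then move
    }
}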

Oscillations
[Figure: a typical trajectory in the search space, starting from an initial velocity v; the particle oscillates around the optimum, its fitness sampled at successive positions 1, 2, 3, 4.]


The circular neighbourhood


Particle 1's 3-neighbourhood
[Figure: particles 1 to 8 placed on a virtual circle; each particle's neighbourhood is made of its nearest particles on that circle.]

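For illustration (a sketch added here, not from the original slides), a small C function for such a ring topology: it lists, for particle i, the K particle indices centred on i along the virtual circle. The 0-based indexing and the parameters S and K are assumptions.

// Ring (circular) neighbourhood: for particle i in a swarm of size S,
// fill neigh[] with the K particle indices centred on i along the virtual circle.
void ring_neighbourhood(int i, int S, int K, int neigh[])
{
    for (int k = 0; k < K; k++)
        neigh[k] = ((i - K / 2 + k) % S + S) % S; // wrap around the circle
}

For example, with S = 8 and K = 3, particle 0 gets the neighbours {7, 0, 1}, i.e. itself and its two immediate neighbours on the circle.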

Random proximity
Hyperparallelepiped => biased ("Mayan pyramid" = DPNP)
[Figure: the p-proximity and g-proximity hyperparallelepipeds built around p, x and g, with velocity v.]

Animated illustration
Global optimum


Maths and parameters


The right way

This way

Or this way


Neighbourhoods (topologies)

[Figure: which topology for the information links (informers)? For example a small-world graph, or ... ?]

Global constriction coefficient

v(t+1) = χ( v(t) + φ (q − x(t)) )
x(t+1) = x(t) + v(t+1)

with
φ = rand(0, φ'1) + rand(0, φ'2),  φ' = φ'1 + φ'2
q = (φ'1 p + φ'2 g) / (φ'1 + φ'2)

Non-divergence criterion:
χ = 2 / (φ' − 2 + √(φ'² − 4φ'))  for φ' > 4,  else χ = 1

Usual values: φ' = 4.1 => χ ≈ 0.73, swarm size = 20, neighbourhood size = 3

Type 1 form
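As a quick check (added here, not on the original slide), plugging the usual value φ' = 4.1 into the formula above indeed gives the quoted 0.73:

\chi = \frac{2}{\varphi' - 2 + \sqrt{\varphi'^2 - 4\varphi'}}
     = \frac{2}{2.1 + \sqrt{16.81 - 16.4}}
     \approx \frac{2}{2.740} \approx 0.73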


5D complex space
A 3D section

[Figure: a 3D section of the 5D parameter space, with axes Re(v) and Re(y); the convergence region sits inside the larger non-divergence region.]


Move in a 2D section (attractor)


[Figure: the particle's motion in a 2D section, plotted in the (Re(v), Im(v)) plane: it spirals towards the attractor.]


Beyond real numbers


[Figure: example on a non-real (discrete) search space; the particles reach the target value (8): Bingo!]

Minimum requirements

Comparing positions in the search space H:
∀ (x, x') ∈ H × H, one can decide whether f(x) < f(x') or f(x) ≥ f(x')

Algebraic operators:
(coefficient, velocity) → velocity
(velocity, velocity) → velocity
(position, position) → velocity
(position, velocity) → position


Pseudo code form


velocity = pos_minus_pos(position1, position2)
velocity = linear_combin(coeff1, velocity1, coeff2, velocity2)
position = pos_plus_vel(position, velocity)

(algebraic operators)

(position, velocity) = confinement(position(t+1), position(t))
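For the usual case of real-valued positions and velocities these operators reduce to ordinary vector arithmetic. The C sketch below (added here as an illustration; the dimension D, the plain-array encoding and the coefficient names are assumptions) shows one possible concretisation of the three algebraic operators.

#define D 2   // dimension (assumption for the example)

// position (-) position -> velocity
void pos_minus_pos(const double x1[D], const double x2[D], double v[D])
{
    for (int d = 0; d < D; d++) v[d] = x1[d] - x2[d];
}

// linear combination of two velocities: c1*v1 (+) c2*v2 -> velocity
void linear_combin(double c1, const double v1[D], double c2, const double v2[D], double v[D])
{
    for (int d = 0; d < D; d++) v[d] = c1 * v1[d] + c2 * v2[d];
}

// position (+) velocity -> position (the move)
void pos_plus_vel(double x[D], const double v[D])
{
    for (int d = 0; d < D; d++) x[d] += v[d];
}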


Confinements
Frontiers (e.g. an interval) => the particle is brought back inside
Granularity => the position is mapped to the nearest allowed point
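By way of illustration (added here, not in the original slides), a minimal C sketch of these two confinements for a single component, assuming an interval [xmin, xmax] and a grid step granul; resetting the velocity component on the frontier is one common choice, not the only one.

#include <math.h>

// Confine one component to the interval [xmin, xmax], then snap it to the granularity grid.
void confine(double *x, double *v, double xmin, double xmax, double granul)
{
    if (*x < xmin) { *x = xmin; *v = 0; }
    if (*x > xmax) { *x = xmax; *v = 0; }
    if (granul > 0)
        *x = xmin + granul * floor((*x - xmin) / granul + 0.5); // nearest grid point
}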

End of Part 1


Part 2: When the algo mutates


The PSO Family


Timeline: 1995, 1998, 2000, 2002, 2004

Empirical
Canonical
Adaptive
Spherical, Gaussian, Pivot
TRIBES
Weighted, Constraints, Discrete, Combinatorial, Multi-swarm
Hybrid, OEP 0, 1, ...
Multi-objective

Unbiased random proximity


Hyperparallelepiped => biased. Alternative: hypersphere (vs hypercube).
[Chart: volume as a function of the dimension (1 to 29), hypersphere vs hypercube.]
[Figure: p-proximity and g-proximity regions around p, x, g, with velocity v.]
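Since the hypersphere's volume collapses relative to the hypercube as the dimension grows, an unbiased proximity has to be sampled directly in the ball rather than in a bounding box. As an added illustration (not the author's code), one standard way to draw a uniform point in a D-dimensional ball of centre c and radius r: pick a Gaussian direction, then scale the radius as r·u^(1/D).

#include <math.h>
#include <stdlib.h>

#define D  3                  // dimension (assumption for the example)
#define PI 3.14159265358979

static double unif(void) { return (rand() + 1.0) / (RAND_MAX + 2.0); } // uniform in (0,1)

// Standard Gaussian deviate (Box-Muller)
static double gauss_dev(void) { return sqrt(-2.0 * log(unif())) * cos(2.0 * PI * unif()); }

// Uniform random point x in the ball of centre c and radius r (unbiased proximity)
void random_in_ball(const double c[D], double r, double x[D])
{
    double norm = 0;
    for (int d = 0; d < D; d++) { x[d] = gauss_dev(); norm += x[d] * x[d]; }
    double scale = r * pow(unif(), 1.0 / D) / sqrt(norm); // radius distributed as r*u^(1/D)
    for (int d = 0; d < D; d++) x[d] = c[d] + scale * x[d];
}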

The three balls


[Figure: three hyperspheres ("balls"), one around each of x, p and g, used as proximity distributions.]


Think locally, act locally


anarchy


Adaptive coefficients

Crisp or fuzzy rules:
v: "The better I am, the more I follow my own way."
rand(0,b)·(p − x): "The better my best neighbour is, the more I tend to go towards it."


Adaptive swarm size

Crisp or fuzzy rules:
"There has been enough improvement, although I am the worst" => I try to kill myself.
"I am the best, but there has not been enough improvement" => I try to generate a new particle.


TRIBES and strategies


Adaptive information links
Adaptive proximity distributions (DPNP)


Energies: classical process


Rosenbrock 2D. Swarm size = 20, constant coefficients.
[Chart: potential energy, kinetic energy and swarm size vs the number of evaluations.]
Accuracy ε = 0.00001 reached after 2240 evaluations.


Energies: adaptive process


Rosenbrock 2D. Adaptive swarm size, adaptive coefficients.
[Chart: potential energy, kinetic energy and swarm size vs the number of evaluations.]
Accuracy ε = 0.00001 reached after 1414 evaluations.


The simplest PSO


Random informers (K = 3)
Pivot method
[Figure: informers chosen at random on the virtual circle; new positions drawn in hypersphere proximities around the pivots g and x.]

End of Part 2


Part 3: Story of Optimisation


Classical results
Optimum = 0, dimension = 30. Best result after 40,000 evaluations.

30D function       PSO Type 1"    Evolutionary algo. (Angeline 98)
Griewank [300]     0.003944       0.4033
Rastrigin [5]      82.95618       46.4689
Rosenbrock [10]    50.193877      1610.359

Some small problems


Fifty-fifty
Granularity = 1

Find x = (x1, ..., xD), xi ∈ {1, ..., N}, all different (i ≠ j => xi ≠ xj), such that
x1 + ... + x(D/2) = x(D/2+1) + ... + xD

N = 100, D = 20. Search space: [1, N]^D

105 evaluations:
63+90+16+54+71+20+23+60+38+15 = 12+48+13+51+36+42+86+26+57+79 (= 450)
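As an added illustration of how such a constraint problem can be handed to the swarm, here is one possible error function in C; the integer-array encoding of the position and the penalty weight are assumptions made for the sketch, not the original author's code. The swarm simply minimises this error down to 0.

#include <stdlib.h>

#define D 20

// Error to minimise: |sum of first half - sum of second half| + penalty for duplicated values.
double fifty_fifty_error(const int x[D])
{
    long diff = 0;
    int dupl = 0;
    for (int i = 0; i < D; i++) {
        diff += (i < D / 2) ? x[i] : -x[i];
        for (int j = i + 1; j < D; j++)
            if (x[i] == x[j]) dupl++;      // count duplicated values
    }
    return labs(diff) + 100.0 * dupl;      // 100 is an arbitrary penalty weight (assumption)
}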



Knapsack
Granularity = 1

Find I ⊂ {1, ..., N} with |I| = D, i.e. D values xi ∈ {1, ..., N}, all different (i ≠ j => xi ≠ xj), such that
x1 + x2 + ... + xD = S

N = 100, D = 10, S = 100

870 evaluations:
run 1 => (9, 14, 18, 1, 16, 5, 6, 2, 12, 17)
run 2 => (29, 3, 16, 4, 1, 2, 6, 8, 26, 5)

Apple trees

Swarm size = 3. Function to minimise: f = (n1 − n2)² + (n2 − n3)²
[Table: best position (n1, n2, n3) found at successive evaluations.]

Graph Coloring Problem


pos plus vel
[Figure: worked example of position (+) velocity = new position on a graph-colouring instance.]

The Tireless Traveller


Example of position: X = (5, 3, 4, 1, 2, 6)
Example of velocity: v = ((5,3), (2,5), (3,1))  (a list of transpositions)
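To make the idea concrete, a small C sketch is added below; it interprets the velocity as a list of city transpositions applied in order, which is one reading of the notation above, and the fixed sizes are assumptions.

#define NB_CITIES 6
#define NB_SWAPS  3

// Apply a velocity (a list of transpositions (a,b)) to a position (a tour):
// exchange cities a and b in the tour, transposition after transposition.
void pos_plus_vel_tsp(int tour[NB_CITIES], const int swaps[NB_SWAPS][2])
{
    for (int k = 0; k < NB_SWAPS; k++) {
        for (int i = 0; i < NB_CITIES; i++) {
            if (tour[i] == swaps[k][0])      tour[i] = swaps[k][1];
            else if (tour[i] == swaps[k][1]) tour[i] = swaps[k][0];
        }
    }
}

With the example above, applying v to X would exchange cities 5 and 3, then 2 and 5, then 3 and 1 in the tour.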


BR17, the movie

Structured search space


Ecological niche
Fuzzy data
Non-linear
Multiobjective
Dynamic, real time
Heterogeneous
[Figure: the problem changes over time steps t1 to t7.]


End of Part 3


Part 4: Real applications


Medical diagnosis
Industrial mixer
Electrical generator
Electrical vehicle


Applications (1)
Salerno J., "Using the particle swarm optimization technique to train a recurrent neural model", IEEE International Conference on Tools with Artificial Intelligence, p. 45-49, 1997.
He Z., Wei C., Yang L., Gao X., Yao S., Eberhart R. C., Shi Y., "Extracting Rules from Fuzzy Neural Network by PSO", IEEE IEC, Anchorage, Alaska, USA, 1998.
Secrest B. R., "Traveling Salesman Problem for Surveillance Mission using PSO", AFIT/GCE/ENG/01M-03, Air Force Institute of Technology, 2001.
Yoshida H., Kawata K., Fukuyama Y., "A PSO for Reactive Power and Voltage Control considering Voltage Security Assessment", IEEE TPS, vol. 15, p. 1232-1239, 2001.
Krohling R. A., Knidel H., Shi Y., "Solving numerical equations of hydraulic problems using PSO", Proceedings of the IEEE CEC, Honolulu, Hawaii, USA, 2002.


Applications (2)
Kadrovach B. A., Lamont G., "A particle swarm model for swarm-based networked sensor systems", ACM Symposium on Applied Computing, Madrid, Spain, p. 918-924, 2002.
Omran M., Salman A., Engelbrecht A. P., "Image classification using PSO", Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning (SEAL 2002), Singapore, p. 370-374, 2002.
Coello Coello C. A., Luna E. H., Aguirre A. H., "Use of PSO to design combinational logic circuits", LNCS No. 2606, p. 398-409, 2003.
Onwubolu G. C., Clerc M., "Optimal path for automated drilling operations by a new heuristic approach using particle swarm optimization", International Journal of Production Research, vol. 4, p. 473-491, 2004.
Onwubolu G. C., "TRIBES application to the flowshop scheduling problem", New Optimization Techniques in Engineering, Heidelberg, Germany, Springer, p. 517-536, 2004.

Neural network

Test input: Ei = (e_i,1, e_i,2, ..., e_i,N)
Wanted output: Si = (s_i,1, ..., s_i,P)
Real output: S'i(t) = (s'_i,1(t), ..., s'_i,P(t))

Transfer functions: σ_k,m(entry) = 1 / (1 + exp(−x_k,m · entry))

E = (E1, ..., E_nb_tests)
S = (S1, ..., S_nb_tests)
S'(t) = (S'1(t), ..., S'_nb_tests(t))
X(t) = (x_k,m(t))   (the weights, i.e. the position of a particle)

Function to minimise: f(X) = || S − S' ||
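To show how the swarm "sees" such a network, here is a deliberately tiny C sketch of the error f(X) = ||S − S'||, added for illustration only: it uses a simplified single-layer reading of the transfer functions (weighted sum followed by the sigmoid), and the network shape and data arrays are assumptions, not the original author's code.

#include <math.h>

#define N_IN     2   // inputs  (assumption)
#define P_OUT    1   // outputs (assumption)
#define NB_TESTS 4   // test cases (assumption)

// Error to be minimised by PSO: Euclidean distance between wanted and real outputs
// over all test cases. The particle position X holds the P_OUT*N_IN weights x[k][m].
double nn_error(const double X[P_OUT][N_IN],
                const double E[NB_TESTS][N_IN],
                const double S[NB_TESTS][P_OUT])
{
    double err2 = 0;
    for (int i = 0; i < NB_TESTS; i++) {
        for (int k = 0; k < P_OUT; k++) {
            double entry = 0;
            for (int m = 0; m < N_IN; m++) entry += X[k][m] * E[i][m];
            double out = 1.0 / (1.0 + exp(-entry));   // sigmoid transfer
            double d = S[i][k] - out;
            err2 += d * d;
        }
    }
    return sqrt(err2);                                // f(X) = ||S - S'||
}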


To know more
THE site: Particle Swarm Central, http://www.particleswarm.info

Kennedy J., Eberhart R., et al., Swarm Intelligence, Morgan Kaufmann Academic Press, 2001.

Self-advert (2005 IEEE TEC award):
Clerc M., Kennedy J., "The Particle Swarm: Explosion, Stability, and Convergence in a Multidimensional Complex Space", IEEE Transactions on Evolutionary Computation, vol. 6, p. 58-73, 2002.

More self ad.


My PSO site: http://clerc.maurice.free.fr/pso/index.htm

If you read French:

Clerc M., "L'optimisation par essaim particulaire. Principes et pratique", Hermès, Techniques et Science de l'Informatique, 2002 (25-page article).
Clerc M., L'optimisation par essaims particulaires, http://www.editions-hermes.fr/fr/ (book, to be published February 2005).

PSO in the world

eXtended Particle Swarms (XPS) project



Some open questions


New mathematical ideas to model particle interaction
Adaptive weighted relationships
Better combinatorial approaches
Meta-model

Beat the swarm!


[Interactive game: your current position, your best perf., and the best perf. of the swarm.]

APPENDIX


Canonical form
v(t+1) = v(t) + φ (q − x(t))
x(t+1) = x(t) + v(t+1)

Let y(t) = q − x(t). Then:
v(t+1) = v(t) + φ y(t)
y(t+1) = −v(t) + (1 − φ) y(t)

i.e., in matrix form:
| v(t+1) |   |  1    φ    | | v(t) |
| y(t+1) | = | −1  1 − φ  | | y(t) |

Eigenvalues e1 and e2.

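Making the eigenvalues explicit (a short derivation added here, following directly from the matrix above):

\det\begin{pmatrix} 1-\lambda & \varphi \\ -1 & 1-\varphi-\lambda \end{pmatrix}
 = \lambda^{2} - (2-\varphi)\lambda + 1 = 0
 \quad\Rightarrow\quad
 e_{1,2} = \frac{(2-\varphi) \pm \sqrt{\varphi^{2}-4\varphi}}{2}

They are complex (of modulus 1) for 0 < φ < 4 and real for φ > 4, which is where the non-divergence criterion used on the constriction slides comes from.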

Constriction
Constriction coefficients χ1, χ2: chosen so that the eigenvalues χ1·e1 and χ2·e2 of the constricted system stay inside the unit circle (see the convergence criterion below); in the usual single-coefficient case, χ1 = χ2 = χ = 2 / (φ − 2 + √(φ² − 4φ)) for φ > 4.


Convergence criterion
|χ1 e1| < 1  <=>  χ1 < 1 / |e1|
|χ2 e2| < 1  <=>  χ2 < 1 / |e2|
[Plot: the resulting coefficient as a function of φ.]


Robustness
Performance map: NeededIterations as a function of the two parameters.
[Surface plot: number of iterations needed.]

f(x1, x2) = 100 (x2 − x1²)² + (1 − x1)²
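For reference, the benchmark used for this map is the 2D Rosenbrock function; a direct C transcription (added here):

// 2D Rosenbrock function: minimum 0 at (1, 1)
double rosenbrock2D(double x1, double x2)
{
    return 100.0 * (x2 - x1 * x1) * (x2 - x1 * x1) + (1.0 - x1) * (1.0 - x1);
}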

Clusters and queens


Each particle is weighted by its perf.
Dynamic clustering
Centroids = queens = temporary new particles


Magic Square (1)


Position: a D×D matrix M = (m_i,j), with m_i,j = x_(j + (i−1)·D), m_i,j ∈ {1, ..., N}, all different (m_i,j ≠ m_k,l).

Constraint: all row sums equal and all column sums equal, i.e.
Σ_{i=1..D−1} | Σ_j m_i,j − Σ_j m_i+1,j |  +  Σ_{j=1..D−1} | Σ_i m_i,j − Σ_i m_i,j+1 |  = 0

Magic Square (2)


D = 3×3, N = 100
10 runs, 13430 evaluations, 10 solutions:

55 30 68    30 61 53    43 51 78    80  3 30    50 43 67
42 49 62    89 32 23    75 33 64    22 72 19    58 55 47
56 74 23    25 51 68    54 88 30    11 38 64    52 62 46

27 96 39    22 70 58    18 25 59    65 28 64    50 65 68
73 40 49    40 75 35    32 53 17    63 55 39    69 42 72
62 26 74    88  5 57    52 24 26    29 74 54    64 76 43


Non linear system


x1² + x2² − 1 = 0
sin(10 x1) − x2 = 0

Search space: [0, 1]²

1 run: 143 evaluations, 1 solution found
10 runs: 1430 evaluations, 3 distinct solutions found


Model fitting (ARMA +AIC)


Autoregressive Moving Average + Akaike's Information Criterion

Model (heterogeneous parameters):
Σ_{i=0..n} α_i y_(t−i) = Σ_{j=0..m} β_j a_(t−j)    (y_t: the data)

σ² = (1/N) Σ_t ( y_t − y_t^ARMA )²

Function to minimise (AIC): f = n log(σ²) + 2 (n + m)

A binary PSO C code


Information links are modified at random if there has been no improvement
// Pivot method -------------------------------------------------
// Works pretty well on some problems... and pretty badly on some others.
// P[s][d] is the current (binary) position of particle s, P_m[g][d] the best
// known position among its informers; alea() and alea_integer() are the
// program's uniform random helpers.
for (d = 0; d < D; d++) P[s][d] = P_m[g][d]; // Initialise the new position of particle s
                                             // at the best known position around it
dist = log(D);        // We suppose here D >= 2
r = alea(1, dist);    // Radius for the DPNP
for (k = 0; k < r; k++) // Switch some bits at random
{
    d = alea_integer(0, D - 1);
    P[s][d] = 1 - P[s][d]; // Around g
}


End of APPENDIX

