- In contrast to other methods, where the term iteration generally refers to the number of fitness evaluations, which is proportional to the total number of individuals/solutions in the population/swarm, MVMO requires only one fitness evaluation per iteration, independently of the number of individuals saved in the solution archive.
- A single parent-offspring pair concept is adopted.
- It possesses a novel mapping function that is used for mutating genes in the offspring based on the mean and variance of the solution archive's dynamic population.

Fig. 1. Overview of MVMO.

A. Initialization stage
In MVMO, a few settings have to be defined beforehand:
- Size of the solution archive.
- Number of problem variables (dimensions), m, to be selected for mutation.
- Selection method (several alternative strategies are described in [12, 13]).
- Shape scaling factor, f_s.
- Initial smoothing factor, d_i.
- Smoothing factor variation range, \Delta d_0.

The real min/max bounds of every variable have to be normalized to 0 and 1, since the search space for all optimization variables within MVMO is [0, 1]. Thus, the k problem variables to be optimized have to be initialized within these limits, which can be done by randomly sampling from the space of possible solutions. Reference values for choosing the archive size, f_s, d_i, and \Delta d_0 are given in the subsequent subsections.

B. Fitness evaluation and termination criteria
Every candidate solution is evaluated in light of its fitness measure. This involves calculation of the objective function and analysis of constraint fulfillment, which is performed using the actual values in the problem space (i.e. de-normalization is carried out in every single iteration). If the problem is unconstrained, the fitness corresponds to the objective function. For problems involving constraints, however, the feasibility of the solution has to be checked and the fitness value is modified through a constraint-handling technique (e.g. penalization). MVMO is versatile enough to accommodate any of these techniques. By convention, an individual is considered better if its fitness is smaller.
As with other heuristic optimization algorithms, the MVMO search process can be terminated based on the completion of a specified number of iterations, the attainment of a fitness threshold, or the absence of fitness improvement over the last iterations.

C. Solution archive
Although MVMO adopts a single parent-offspring pair concept, it accounts for the performance (in terms of mean and variance) of the n best individuals stored in the solution archive. The archive size is fixed for the entire process; a size of 2-5 is usually sufficient. A larger archive results in a rather conservative search oriented towards the saved best populations. The archive is filled progressively over the iterations, sorted by fitness so that the first-ranked individual is always the best found so far. Once the archive is full, an update is performed only if the fitness of a new individual is better than those already in the archive. As the fitness improves over the iterations, the population members keep changing.
Mean and shape variables are calculated after every update of the archive for each variable x_i using (3) and (4), respectively:

\bar{x}_i = \frac{1}{n} \sum_{j=1}^{n} x_i(j)    (3)

s_i = -\ln(v_i) \cdot f_s    (4)

with the variance

v_i = \frac{1}{n} \sum_{j=1}^{n} \left( x_i(j) - \bar{x}_i \right)^2    (5)

At the beginning, \bar{x}_i corresponds to the initialized value of x_i, and v_i is set to one.

D. Offspring generation
At every iteration, the individual with the best fitness found so far in the archive (first position) is used to generate a new descendant (i.e. it is assigned as parent). Then, m of the k dimensions of the optimization problem are selected for the mutation operation via the mapping function, while the remaining dimensions inherit the corresponding values from the parent. Alternative selection methods are described in [16, 17].
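The archive statistics in (3)-(5) are simple to compute. A minimal Python sketch follows; the function and variable names are ours, for illustration only:

```python
import math

def archive_statistics(values, f_s):
    """Mean, variance and shape factor of one optimization variable,
    computed over its n values stored in the solution archive
    (eqs. (3)-(5)). Names are illustrative, not from the paper."""
    n = len(values)
    mean = sum(values) / n                          # eq. (3)
    var = sum((x - mean) ** 2 for x in values) / n  # eq. (5)
    # v_i is initialized to one, so var is assumed positive here;
    # a full implementation would guard against var == 0.
    shape = -math.log(var) * f_s                    # eq. (4)
    return mean, var, shape
```

Note that a smaller archive variance yields a larger shape factor s_i, which concentrates the sampling around the archive mean.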
The new value of each selected dimension x_i is given by

x_i = h_x + (1 - h_1 + h_0) \cdot x_i^* - h_0    (6)

where x_i^* is a random number with uniform distribution in [0, 1], and h denotes the transformation mapping function, which is defined as

h(\bar{x}, s_1, s_2, x) = \bar{x} \cdot (1 - e^{-x \cdot s_1}) + (1 - \bar{x}) \cdot e^{-(1 - x) \cdot s_2}    (7)

h_x, h_1 and h_0 are the outputs of the mapping function for different inputs, given by

h_x = h(x = x_i^*), \quad h_0 = h(x = 0), \quad h_1 = h(x = 1)    (8)

Both the input and the output of the mapping function cover the range [0, 1]. Note that the shape of the mapping function is determined by the mean \bar{x}_i and the shape factors s_1 and s_2. Recalling (4), the factor f_s can be used to change the shape of the function. A small value (e.g. between 0.5 and 1.0) increases the slope of the mapping curve and thus enables better exploration, whereas values above 1.0 result in a flatter curve and thus lead to improved exploitation. It is therefore recommended to start the search process with a smaller f_s and to increase it as the optimization progresses [18]. Furthermore, the shape factors s_{i1} and s_{i2} of the variable x_i are assigned using the following procedure:

s_{i1} = s_{i2} = s_i
if s_i > 0 then
    if s_i > d_i then
        d_i = d_i \cdot \Delta d
    else
        d_i = d_i / \Delta d
    end if
    if rand() \geq 0.5 then
        s_{i1} = s_i;  s_{i2} = d_i
    else
        s_{i1} = d_i;  s_{i2} = s_i
    end if
end if    (9)

The initial values of d_i are set for all variables at the beginning of the optimization; values around 1-5 ensure good initial performance. At every iteration, each d_i is scaled by \Delta d, which can take any value between 0 and 0.8.

IV. COLLABORATIVE VARIANT OF MVMO (CMVMO)

In order to improve the computational effectiveness of MVMO and to fully exploit the computing potential available in modern personal computers, a collaborative MVMO approach is presented. This version of MVMO is called collaborative because each search process shares information with the other search processes according to criteria established by the programmer. The flowchart of CMVMO is given in Fig. 2.

The basic operating principle of this MVMO variant is to distribute the optimization process among the different processing cores that modern computers provide. The variant can be viewed as a set of particles performing a parallel search for the optimum through classic MVMO; under certain circumstances, CMVMO allows the information held by each particle or processing core (i.e. partial best fitness, best individual and solution archive) to be interchanged. To avoid the search in each particle being biased by information supplied by the other particles, communication among them is arranged randomly.

Once the information is shared, the solution archive of each MVMO process is updated. After exchanging the information, the values of each solution archive have to be recalculated before control returns to the individual MVMO processes. The optimization then continues until the termination criterion is reached; in this case the criterion is a specified number of iterations.

The CMVMO algorithm takes advantage of the full potential of modern computers. The goal is to increase the search success rate by performing several interrelated and simultaneous search processes instead of only one, as in the case of MVMO.

A. Initialization stage
In CMVMO, some settings have to be defined in addition to those of MVMO:
- Number of processes or particles.
- Number of times that the processes communicate or collaborate with each other.
- Communication factor, f_C.

The number of processes or particles can be determined according to the number of computing cores available; the algorithm assigns one MVMO process to each core. If a computer cluster is available, the number of processes can be increased according to the total number of cores in the cluster. The frequency of communication between processes can be fixed deterministically; in this paper, the total number of iterations to be evaluated was divided by the number of times that the processes communicate. This communication criterion could also be made adaptive, so that communication is infrequent at the beginning and the frequency of information exchange then increases, or vice versa. The communication factor enables, through a stochastic draw, the communication among the MVMO processes.

B. Communication among processes
Communication between processes allows the best solution (e.g. fitness and individuals) found so far by one process to be transferred to the other search processes. In order to ensure some independence in the individual searches for the optimum and to avoid premature convergence to a local optimum, the communication between processes is carried out stochastically.
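The collaboration scheme described above could be sketched as follows. This is a toy illustration only: a plain hill-climbing search stands in for the full MVMO kernel, the processes run sequentially here rather than on separate cores, and all names and parameter values are ours:

```python
import random

def collaborative_search(fitness, dim, n_processes=4, n_rounds=5,
                         iters_per_round=200, f_c=0.5, seed=0):
    """Several searches run independently; between rounds each one may,
    with probability f_c (the communication factor), adopt the best
    solution found by any process, drawn stochastically as in CMVMO."""
    rng = random.Random(seed)
    # per-process state: (best solution, best fitness)
    states = []
    for _ in range(n_processes):
        x = [rng.random() for _ in range(dim)]
        states.append((x, fitness(x)))
    for _ in range(n_rounds):
        # independent search phase (stand-in for one MVMO run segment)
        for p, (best_x, best_f) in enumerate(states):
            for _ in range(iters_per_round):
                cand = [min(1.0, max(0.0, xi + rng.gauss(0.0, 0.1)))
                        for xi in best_x]
                f = fitness(cand)
                if f < best_f:
                    best_x, best_f = cand, f
            states[p] = (best_x, best_f)
        # stochastic communication phase
        best = min(states, key=lambda s: s[1])
        states = [best if rng.random() < f_c else s for s in states]
    return min(states, key=lambda s: s[1])

# minimize the sphere function over the normalized space [0, 1]^3
best_x, best_f = collaborative_search(lambda x: sum(v * v for v in x), dim=3)
```

Replacing the sequential loop with one process per core (e.g. via Python's multiprocessing) recovers the parallel layout described above without changing the communication logic.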
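For reference, the per-variable mutation of Section III.D (eqs. (6)-(8)) can be sketched in a few lines of Python. The helper names are ours, and the shape factors are assumed to have already been assigned by procedure (9):

```python
import math
import random

def h(mean, s1, s2, u):
    # Transformation mapping function, eq. (7)
    return (mean * (1.0 - math.exp(-u * s1))
            + (1.0 - mean) * math.exp(-(1.0 - u) * s2))

def mutate_gene(mean, s1, s2, rng=random):
    # New value of a selected dimension, eqs. (6) and (8)
    x_star = rng.random()          # uniformly distributed random input
    h_x = h(mean, s1, s2, x_star)
    h_0 = h(mean, s1, s2, 0.0)
    h_1 = h(mean, s1, s2, 1.0)
    return h_x + (1.0 - h_1 + h_0) * x_star - h_0
```

By construction the output stays in [0, 1]: x* = 0 maps to 0 and x* = 1 maps to 1, consistent with the normalized search space.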
Again, the robustness of MVMO and CMVMO was investigated by performing 100 independent optimization trials per algorithm. The results are summarized in Table 2, where it can be seen that CMVMO has a higher success rate than classical MVMO. As in the previous example, the superiority of CMVMO is due to its multiple, simultaneous and interrelated search processes.

Table 2. Performance of MVMO and CMVMO on the 46-bus system

                               MVMO    CMVMO
Max. iterations                10000   10000
Cores                          1       8
Computation time (s)           94      240
Test runs                      100     100
Success rate (%)               57      100
Average iterations             6567    6092
Std. deviation of iterations   852     852

Fig. 7. Initial topology of the 46-bus network.

VI. CONCLUSIONS

MVMO is a novel heuristic algorithm that has recently shown great promise in dealing with real-world complex optimization problems. Besides its powerful search capability, the algorithm is also easy to implement. In this paper, a new application of MVMO to power systems is presented, more precisely, to solve the transmission system expansion problem. The most salient feature of MVMO resides in the unique transformation used for mutating genes in the offspring based on the mean and variance of a dynamically stored and updated population.

To make optimal use of modern computational resources, a collaborative variant of MVMO was implemented and presented in this paper. Basically, this variant, termed CMVMO, exploits the multicore technology of modern computers as well as distributed computing to arrive at an enhanced search performance. Hence, CMVMO performs several simultaneous searches, which exchange information through simple predefined communication rules so that bias and adverse implications are avoided.

Three test systems of different size and complexity were employed for testing purposes. Remarkably, both MVMO and CMVMO performed successfully when the obtained results were compared with those from the literature, and provided reliable optimization results as well. Furthermore, CMVMO exhibited a higher success rate than MVMO due to its enhanced exploration and exploitation search features.

CMVMO thus constitutes a sophisticated optimization tool which could be used to solve other complex power system optimization problems.

VII. REFERENCES

[1] E. L. Da Silva, H. A. Gil, and J. M. Areiza, "Transmission Network Expansion Planning Under an Improved Genetic Algorithm," IEEE Transactions on Power Systems, vol. 15, no. 3, pp. 1168-1175, Aug. 2000.
[2] R. Romero, R. A. Gallego, and A. Monticelli, "Transmission System Expansion Planning By Simulated Annealing," IEEE Transactions on Power Systems, vol. 11, no. 1, pp. 364-369, Feb. 1996.
[3] R. A. Gallego, A. Monticelli, and R. Romero, "Transmission System Expansion Planning by an Extended Genetic Algorithm," IEE Proceedings - Generation, Transmission and Distribution, vol. 145, no. 3, pp. 329-335, May 1998.
[4] P. Maghouli, S. H. Hosseini, M. O. Buygi, and M. Shahidehpour, "A Multi-Objective Framework for Transmission Expansion Planning in Deregulated Environments," IEEE Transactions on Power Systems, vol. 24, no. 2, pp. 1051-1061, May 2009.
[5] Ping Ren, Li-Qun Gao, Nan Li, Yang Li, and Zhi-Ling Lin, "Transmission network optimal planning using the particle swarm optimization method," in Proc. 2005 International Conference on Machine Learning and Cybernetics, vol. 7, pp. 4006-4011, 18-21 Aug. 2005.
[6] V. Miranda, "Evolutionary Algorithms with Particle Swarm Movements," in Proc. 13th International Conference on Intelligent Systems Application to Power Systems, pp. 6-21, 6-10 Nov. 2005.
[7] S. P. Torres, C. A. Castro, R. M. Pringles, and W. Guaman, "Comparison of particle swarm based meta-heuristics for the electric transmission network expansion planning problem," in Proc. 2011 IEEE Power and Energy Society General Meeting, pp. 1-7, 24-29 July 2011.
[8] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Žumer, "Self-Adapting Control Parameters in Differential Evolution: A Comparative Study on Numerical Benchmark Problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646-657, Dec. 2006.
[9] T. Sum-Im, G. A. Taylor, M. R. Irving, and Y. H. Song, "Differential evolution algorithm for static and multistage transmission expansion planning," IET Generation, Transmission & Distribution, vol. 3, no. 4, pp. 365-384, April 2009.
[10] J. L. Ceciliano Meza, M. B. Yildirim, and A. S. M. Masud, "A multiobjective evolutionary programming algorithm and its applications to power generation expansion planning," IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, vol. 39, no. 5, pp. 1086-1096, 2009.
[11] A. M. Leite Da Silva, L. S. Rezende, L. A. Da Fonseca Manso, and De Resende, "Reliability worth applied to transmission expansion planning based on ant colony system," International Journal of Electrical Power and Energy Systems, vol. 32, no. 10, pp. 1077-1084, 2010.
[12] I. Erlich, G. K. Venayagamoorthy, and W. Nakawiro, "A mean-variance optimization algorithm," in Proc. 2010 IEEE Congress on Evolutionary Computation, pp. 1-6, Barcelona, Spain, July 2010.
[13] I. Erlich, W. Nakawiro, and M. Martinez, "Optimal Dispatch of Reactive Sources in Wind Farms," in Proc. 2011 IEEE PES General Meeting, pp. 1-7, Detroit, USA, July 2011.
[14] W. Nakawiro, I. Erlich, and J. L. Rueda, "A novel optimization algorithm for optimal reactive power dispatch: A comparative study," in Proc. 4th International Conference on Electric Utility Deregulation and Restructuring and Power Technologies, Weihai, Shandong, China, July 2011.
[15] M. S. Chamba and O. A., "Despacho óptimo de energía y reserva en mercados competitivos empleando algoritmos meta-heurísticos," in Proc. 2012 IEEE Argencon, Córdoba, Argentina, June 2012.
[16] J. L. Rueda, J. C. Cepeda, and I. Erlich, "Estimation of Location and Coordinated Tuning of PSS based on Mean-Variance Mapping Optimization," in Proc. 2012 IEEE PES General Meeting, San Diego, USA, July 2012.
[17] P. Chakravarty and G. K. Venayagamoorthy, "Development of optimal controllers for a DFIG based wind farm in a smart grid under variable wind speed conditions," in Proc. 2011 IEEE International Electric Machines & Drives Conference, pp. 723-728, Niagara Falls, Canada, May 2011.
[18] J. C. Cepeda, J. L. Rueda, and I. Erlich, "Identification of Dynamic Equivalents based on Heuristic Optimization for Smart Grid Applications," in Proc. 2012 IEEE World Congress on Computational Intelligence, Brisbane, Australia, June 2012.
[19] R. Romero, A. Monticelli, A. Garcia, and S. Haffner, "Test Systems and Mathematical Models for Transmission Network Expansion Planning," IEE Proceedings - Generation, Transmission and Distribution, vol. 149, pp. 27-36, Jan. 2002.
[20] R. Romero, C. Rocha, J. R. S. Mantovani, and I. G. Sanchez, "Constructive heuristic algorithm for the DC model in network transmission expansion planning," IEE Proceedings - Generation, Transmission and Distribution, vol. 152, no. 2, pp. 277-282, March 2005.
[21] S. Haffner, A. Monticelli, A. Garcia, J. Mantovani, and R. Romero, "Branch and Bound Algorithm for Transmission System Expansion Planning Using a Transportation Model," IEE Proceedings - Generation, Transmission and Distribution, vol. 147, no. 3, pp. 149-156, 2000.
[22] R. D. Zimmerman, C. E. Murillo-Sánchez, and R. J. Thomas, "MATPOWER: Steady-State Operations, Planning and Analysis Tools for Power Systems Research and Education," IEEE Transactions on Power Systems, vol. 26, no. 1, pp. 12-19, Feb. 2011.

VIII. BIOGRAPHIES

Rolando M. Pringles (M'08) was born in 1977. He received the Electrical Engineer degree and the Ph.D. degree in Electrical Engineering from the Universidad Nacional de San Juan, San Juan, Argentina, in 2003 and 2011, respectively. Currently, he is a research fellow at the Instituto de Energía Eléctrica of the Universidad Nacional de San Juan. His research interests are power system expansion, methodologies for economic evaluation, power investment under uncertainty, reliability and risk management.

José L. Rueda (M'07) was born in 1980. He received the Electrical Engineer diploma from the Escuela Politécnica Nacional, Quito, Ecuador, in 2004, and the Ph.D. degree in electrical engineering from the Universidad Nacional de San Juan, San Juan, Argentina, in 2009. From September 2003 to February 2005, he worked in Ecuador in the fields of industrial control systems and electrical distribution network operation and planning. Currently, he is a research associate at the Institute of Electrical Power Systems, University of Duisburg-Essen. His current research interests include power system stability and control, system identification, power system planning, probabilistic and artificial intelligence methods, smart grids, heuristic optimization, FACTS devices and wind power.