
AUTO-TUNING OF CLASSICAL PID CONTROLLERS USING AN ADVANCED GENETIC ALGORITHM

P. Wang and D. P. Kwok
Department of Electronic Engineering
Hong Kong Polytechnic
Hung Hom, Kowloon, Hong Kong

In this paper, advanced Genetic Algorithms (GAs) are utilized to carry out automatically the fine-tuning of the parameter settings of classical PID controllers. The basic concept and working principle of GAs are introduced and compared with those of traditional optimization techniques. An advanced genetic algorithm which can rapidly optimize the parameter auto-tuning process of classical PID controllers is designed on the basis of GA theory, the authors' experience and different performance indices, including in particular nonlinear or multiple criteria. The computer implementation of this genetic algorithm is accomplished and tested against some benchmark examples. The simulation results obtained demonstrate that this GA-based approach is a novel and promising method for the optimal-tuning problem.
Introduction

In the control literature, there exist a number of methods to fine-tune the parameter settings of classical PID controllers. Among them, the most well-known is the Ziegler-Nichols (ZN) ultimate-cycle tuning method [1]. For a wide range of practical processes, this fine-tuning approach works quite well and produces adequate settings of classical PID controllers. However, the method is sometimes laborious and time-consuming, particularly for processes with large time constants or delays. Consequently, some new techniques for automatic fine-tuning or adaptive control have been developed. The relay feedback technique [2] uses a relay with hysteresis instead of the PID controller to drive the open-loop system into controlled self-oscillations, and then measures the amplitude and frequency of the self-oscillations, which are used to fine-tune the PID controller. The system identification technique [3] exploits a pattern recognition method to match the output or error signal with the user's specifications and to adjust the parameter settings of the PID controller to their expected values. The cross-correlation technique [4] employs a small pseudo-random binary sequence test signal, together with the reference or setpoint signal, to generate the impulse response of the process through the computation of the correlation between the test signal and the output of the process; from this impulse response the ultimate gain and frequency are estimated for use in a refinement of the original ZN formula.

As a matter of fact, the PID parameters obtained through the above-mentioned approaches frequently need manual retuning before being transferred to the process under control. For this reason, a number of classical optimization techniques, such as gradient methods, are often used to find optimal values. Equally, it is attractive to incorporate human knowledge into such a fine-tuning action. In recent years, interest in GAs has been growing due to their intrinsic difference from ordinary optimization tools. GAs are algorithms which use operations similar to those found in natural genetics to guide their trek through a search space. Since Bagley finished his pioneering dissertation based on Holland's earlier work in 1967, GAs have become more and more popular as important mathematical

means for nonlinear multi-objective optimization problems. Their applications have spread across various fields: adaptive game-playing, biological cell simulation, pattern recognition, machine learning, VLSI microchip layout, job-shop scheduling and so on [5]. In the automatic control field, GAs have also found applications. Hollstien [6] first applied artificial genetic adaptation to computer control systems, in which GAs are used to perform function optimization. Karr [8][9] suggested the use of GAs for the automatic design of fuzzy controllers, in which GAs are employed to select optimal membership functions. Unfortunately, these application instances of GAs in control engineering seem thinly scattered in comparison with the applications in other disciplines. Therefore, more effort should be made on this issue, especially for industrial control applications.

This paper describes the application of advanced GAs to the auto-tuning of the three-term parameters of classical PID controllers. Such a simple but general approach, having the ability for global optimization and good robustness, is expected to overcome some weaknesses of conventional approaches and to be more acceptable for industrial practice. The remainder of the paper is organized into five sections. Section 2 introduces the basic concept and working principle of GAs. Section 3 develops some design techniques of an advanced GA for the fine-tuning of classical PID controllers. Section 4 deals with the computer implementation of the developed GA. Section 5 gives a number of numerical examples to show the efficiency and effectiveness of this novel fine-tuning approach. Finally, Section 6 draws some conclusions from the procedures proposed.

Genetic Algorithms

GAs are search algorithms based on the mechanics of natural selection and natural genetics. The searching process is similar to the natural evolution of biological creatures, in which successive generations of organisms are given birth and raised until they themselves are able to breed. In such algorithms, the fittest among a group of artificial creatures with string structures survive and form a new generation together with those which are produced through structured yet randomized information or gene exchange. In every new generation, a new set of strings (offspring) is created using bits and pieces of the fittest of the old generation according to a number of specified performance indices. By imitating the innovative flair of human search, GAs efficiently exploit historical information to speculate on new search populations with gradually improved behaviour.

Generally, GAs consist of three fundamental operators: reproduction, crossover and mutation. Given an optimization problem, simple GAs encode the parameters concerned into finite bit strings, and then run iteratively, using the three operators in a random way but guided by the fitness function evaluation, to perform the basic tasks of copying strings, exchanging portions of strings and changing some bits of strings, and finally to find and decode the solutions to the problem from the last pool of mature strings. Advanced GAs exploit advanced genetic operators to reinforce their efficiency and efficacy. For the purpose of developing advanced GAs, this paper selectively adopts a micro-level operator and a macro-level operator, i.e., the inversion operator and the preselection operator [2]. The former carries out the exchange of selected string bits so as to reorder bit positions and find adequate string orderings. The latter uses good offspring instead of their parents in order to maintain string diversity in the current population. These two advanced operators help speed up the convergence of GAs by virtue of their prominent ability to generate appropriate and dissimilar offspring. Since the overall processing of a GA involves more details, the following paragraphs further explain its working principle.

(a) Coding of parameters: It has become standard to translate the parameters into binary bit strings. Several parameters are coded into one long string. Such strings can be lengthened to provide more resolution, or shortened to provide less resolution, in the representation of the parameters.

(b) Initial generation: A GA always begins by randomly generating an initial population of N strings, each of length m. The population size N is a compromise: a large N increases the possibility of including the solution in the first few generations but decreases the running speed of the GA. As mentioned above, the string length m determines the resolution.

(c) Fitness evaluation: In the current generation, each of the strings is decoded into its corresponding actual parameters. These parameters are then sent to a judgement machine which yields a measure of the solution's quality, evaluated with some objective functions, and each string is assigned a fitness value. These fitness values are simply non-negative numbers assessing the strings' relative merits.

(d) Reproduction: Reproduction is a process by which strings with larger fitness values produce, with correspondingly higher probabilities, larger numbers of copies in the new generation. In a policy called elitist reproduction, the current best string is guaranteed a long life from generation to generation by passing one of its copies directly into the next generation. The reproduced or copied strings for possible use in the next generation are placed in a mating pool, where crossover and mutation may act on them.

(e) Crossover: Crossover is a process by which systematic information exchange between two strings is implemented using probabilistic decisions. In a crossover process, two newly reproduced strings are chosen from the mating pool and arranged to exchange their corresponding portions of binary code at a randomly selected partitioning position along them. This process can combine the better qualities of the preferred good strings.

(f) Mutation: Mutation is a process by which the chance for the GA to reach the optimal point is reinforced through an occasional alteration of the value at a randomly selected bit position. The mutation process may quickly generate strings which might not be conveniently produced by the preceding reproduction and crossover processes. Although mutation is necessary, it may sometimes spoil the opportunity of the current appropriate generation.
So, this process usually occurs with a small probability and is complementary to reproduction and crossover.

(g) Inversion: Inversion is a process by which the ordering within strings is rearranged, so that the GA can search for good string arrangements while searching for good allele sets with the other operators. In an inversion process, a string is randomly picked from the mating pool and forced to exchange two selected bits, with the bit positions randomly determined along the string. The inversion probability cannot be too high, for the same reason as mutation. Inversion helps to ensure multi-directional search.

(h) Preselection: Preselection is a process by which an offspring possibly replaces the inferior parent if the offspring's fitness exceeds that of the inferior parent. In this way, the quality and diversity of the population are improved, because strings tend to replace strings that are inferior and similar to themselves. In a preselection process, offspring are identified and coloured so as to replace some sterile parents randomly according to the preselection probability. Preselection helps to guarantee that unfit or unlucky strings die out as soon as possible. (A code sketch of the inversion and preselection operators is given at the end of this section.)

(i) Iteration: The GA runs iteratively, repeating processes (c)-(h) until it arrives at a predetermined ending condition. The speed of iteration depends not only on the population size N and the string length m but also on the selection of the probabilities. Of course, fast computers can speed up this process. Finally, the acceptable solution is obtained and decoded into its original pattern from the resulting binary strings.

Inherently, GAs are very different from conventional optimization techniques. GAs search with a population of points, not a single point, so that they can arrive at the globally optimal point rapidly and meanwhile avoid locking onto local optima. They work with a coding of the parameter set, not the parameters themselves, so that they can get rid of the analytical limitations of search spaces. They only require objective information, not derivatives, so that they can utilize various kinds of objective functions, even multiple, nonlinear or knowledge-based ones. They exploit probabilistic transition rules, not deterministic ones, so that they can efficiently walk to the neighbourhood of the optimal solution. In one word, the robust features and simple structure of GAs make them very suitable for a lot of complicated optimization problems.
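To make the two advanced operators concrete, the following C sketch shows one possible implementation of inversion and preselection on binary strings stored as character arrays of '0'/'1'. It is illustrative only: the function names, the string representation and the two-parent preselection rule shown here are assumptions chosen for illustration, not code taken from the paper.

```c
#include <stdlib.h>
#include <string.h>

/* Inversion: with probability p_inv, exchange the bit values at two
   randomly chosen positions of an nbits-long binary string. */
void inversion(char *bits, int nbits, double p_inv)
{
    if ((double)rand() / RAND_MAX >= p_inv)
        return;                            /* operator not triggered */
    int i = rand() % nbits;
    int j = rand() % nbits;
    char tmp = bits[i];                    /* swap the two selected bits */
    bits[i] = bits[j];
    bits[j] = tmp;
}

/* Preselection: the offspring replaces the inferior of its two parents
   only if its fitness exceeds that parent's fitness, which preserves
   diversity because similar strings compete with one another. */
void preselection(char *parent1, double f1, char *parent2, double f2,
                  const char *child, double f_child, int nbits)
{
    if (f1 <= f2) {                        /* parent1 is the inferior parent */
        if (f_child > f1) memcpy(parent1, child, (size_t)nbits);
    } else {
        if (f_child > f2) memcpy(parent2, child, (size_t)nbits);
    }
}
```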

PID Optimal-Tuning

As a mathematical means for optimization, GAs can naturally be applied to the optimal tuning of classical PID controllers. The closed-loop system, which comprises a classical PID controller and a controlled plant, is illustrated in Figure 1.

Figure 1. Classical PID Control System

With reference to a step input signal, the entire system generates an output step response. The role of the PID controller is to drive this output response to within the user's specifications. Obviously, the parameter settings of the PID controller should be fine-tuned so as to meet requirements that are as demanding as possible. Usually, the PID control law can be expressed in transfer-function form as below:

$C(s) = K_p \left( 1 + \frac{1}{T_i s} + T_d s \right)$

where Kp, Ti and Td are the proportional gain, integral time constant and derivative time constant respectively. Let Ki = Kp/Ti and Kd = Kp*Td, namely the integral gain and the derivative gain. The widely used fine-tuning technique is the ZN ultimate-cycle tuning method, which is given by the following formula:

$K_p = 0.6 K_u, \qquad T_i = 0.5 T_u, \qquad T_d = 0.125 T_u$

where Ku and Tu are the ultimate gain and ultimate period respectively. In order to achieve better closed-loop performance, this tuning formula is often replaced with the adaptive schemes reported in [2]-[4]. Under some defined performance indices or cost functions, the problem of finding the optimal parameter settings of a classical PID controller which is regulating a certain plant towards the setpoint is a nonlinear optimization problem and can thus be solved using GAs.
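For concreteness, the ZN rule and the gain conversions above can be collected into a small C helper. This is a sketch only; the struct and function names are chosen here for illustration and are not part of the paper's program.

```c
/* Ziegler-Nichols ultimate-cycle settings from the ultimate gain Ku and
   ultimate period Tu, together with the equivalent Ki and Kd gains. */
typedef struct { double Kp, Ti, Td, Ki, Kd; } PidSettings;

PidSettings zn_tune(double Ku, double Tu)
{
    PidSettings s;
    s.Kp = 0.6   * Ku;
    s.Ti = 0.5   * Tu;
    s.Td = 0.125 * Tu;
    s.Ki = s.Kp / s.Ti;        /* integral gain   Ki = Kp / Ti */
    s.Kd = s.Kp * s.Td;        /* derivative gain Kd = Kp * Td */
    return s;
}
```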
In this paper, the performance indices called time-weighted integrals of squared errors are adopted:

$J_k = \int_0^{t_p} t^k e^2(t)\, dt, \qquad k = 0, 1, \dots, n$   (3)

where n is a non-negative integer and t_p is the integration period. In addition, standard terms like overshoot, rise time and settling time may also be combined into a compound performance index as follows:

$J_c = (1 + os)(c_r t_r + c_s t_s)$   (4)

where os, t_r and t_s stand for overshoot, rise time and settling time respectively, and c_r and c_s are two coefficients for users to define or decide. The minimization of this performance index results in as small an overshoot and as short a rise time and settling time as expected. By the way, the concept of Pareto optimality [2] can be applied in case users want to make specifications on each individual performance term; however, in this situation a particular decision like (4) must still be made to select a single alternative from the Pareto-optimal set.

The three PID parameters are transformed into a finite bit string using a common method named concatenated, mapped, unsigned binary coding. For example, Kp, Ki and Kd may each be translated into 8 bits and concatenated to form a binary string such as:

S:  10101011 10101000 11100111
    Kp       Ki       Kd

where S denotes such a binary string for further use. It is supposed that Kp, Ki and Kd are bounded in the closed intervals [0, Kpm], [0, Kim] and [0, Kdm] respectively. The decimal values of the corresponding binary segments are linearly related to the range boundaries Kpm, Kim and Kdm. Obviously, more bits provide higher resolution.

The fitness value F of each possible solution S to the problem is a real number assigned according to its contribution to the performance index. So, fitness functions are selected in relation to the performance index, for instance:

$F = F(J) = 1/J, \qquad J > 0$   (5)

where in fact F(J) can be any nonlinear function, provided that it offers a meaningful mapping from J to F. Sharpened fitness functions, such as exponential ones, give rise to more sensitive string selection. The population size is designated mainly on the basis of experience or preference, as a compromise between speed and complexity. The initial generation is produced randomly. Assume that it has brought forth the following first string population:

S1: 001101010110001001001010
S2: 111000101010100101010101
S3: 101010110110110010101110
S4: 110101111001010111110000

where it is assumed that the fitness values of S1, S2, S3 and S4 are respectively F1, F2, F3 and F4, which satisfy the condition F1 > F2 > F3 > F4. Reproduction first copies the fittest string directly into the mating pool and then reproduces the others into the mating pool using a linear search like a roulette wheel with slots weighted in proportion to the fitness values. That is, S1 is copied directly and S4 is reproduced with the lowest probability. In the mating pool, crossover sequentially picks up two strings and randomly decides whether or not to cross them over according to the crossover probability. Usually, the crossover probability is chosen greater than 50% so that enough information is exchanged among strings. Once crossover is called for, a crossing position along the current strings is selected at random. For example, if S1 and S2 both happen to be copied into the mating pool and are required to cross over with an exchanging site at the seventh bit from the left, the two resulting strings are:

S1: 111000110110001001001010
S2: 001101001010100101010101

Also in the mating pool, mutation occasionally works in sequence to choose an individual string and, in accordance with the mutation probability, to change particular bits of the current string. For instance, if S1 is selected for mutation with a mutating position at the eleventh bit from the left, the resulting new string has the following binary pattern:

S1: 111000110100001001001010

The mutation probability is habitually designated to be quite low, sometimes equal to the reciprocal of the population size. Successively, according to the inversion probability, inversion randomly operates to choose an individual string from the mating pool and exchange the binary values of two selected bits, with the bit positions randomly determined along the string. For example, if S1 above is selected for inversion with two bit positions at the seventh bit from the left and the third bit from the right respectively, the inversion results in the string:

S1: 111000010100001001001110
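The coding and fitness machinery just described can be summarised in a short C sketch. It follows the 8-bit-per-parameter example above (the program described later uses 16 bits per parameter), assumes a linear mapping of each segment onto [0, Kpm], [0, Kim] or [0, Kdm], and uses the fitness of equation (5); the identifiers are chosen here for illustration only.

```c
/* Decode one 8-bit unsigned segment linearly into the range [0, pmax]. */
static double decode8(const char *bits, double pmax)
{
    unsigned v = 0;
    for (int i = 0; i < 8; i++)
        v = (v << 1) | (unsigned)(bits[i] == '1');
    return pmax * (double)v / 255.0;        /* 255 = 2^8 - 1 */
}

/* Split a 24-bit string "Kp|Ki|Kd" into the three controller gains,
   each bounded by its own upper limit Kpm, Kim, Kdm. */
void decode_string(const char bits[24], double Kpm, double Kim, double Kdm,
                   double *Kp, double *Ki, double *Kd)
{
    *Kp = decode8(bits,      Kpm);
    *Ki = decode8(bits + 8,  Kim);
    *Kd = decode8(bits + 16, Kdm);
}

/* Fitness assignment of equation (5): F = 1/J for a positive index J. */
double fitness(double J)
{
    return 1.0 / J;
}
```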



The inversion probability is normally designed not to be too large, in order to maintain the built-up quality of the population.

After these four micro-level operators have finished, preselection, as a macro-level operator, intermittently functions, according to the preselection probability, to preselect the mating pool as a whole for the next population. It randomly uses superior offspring to replace some inferior parents based on the fitness values. These operations proceed iteratively until the GA works out a satisfactory solution which meets the specified ending conditions. One of the ending conditions may be set up in such a way that the change of the maximum fitness value remains within a tolerably small interval even though a considerably long iteration process is observed.

There are some advantages in using GAs instead of conventional optimization techniques to fine-tune classical PID controllers. Due to their robustness, the efficiency of GAs does not rely on the characteristics of the plant under control, so GAs are applicable to a large range of practical plants, including nonlinear ones. GAs do not need analytical performance evaluation, so they can use multiple objective measures, such as mixing overshoot, rise time and settling time together, as well as knowledge-based performance indices. The information-driven feature of GAs distinguishes them from conventional derivative-driven search techniques, and they rarely stall or become locked at local optima. Thus, in this PID tuning case, GAs can possibly work out the real optimal solution.

Computer Implementation

The proposed GA for optimal PID controller settings has been implemented in a 386/387 computer environment. The program is written in the C language and compiled with TURBO C. Random number generation is important in performing reproduction, crossover and mutation. In the program, a function called random( ) provided by TURBO C is used to obtain random integers which are further scaled into numbers of the desired range. The program consists of two parts: one simulates the closed-loop step response of the conventional PID control system, and the other implements the GA. Simulation is performed on every member of the generation, while the GA works on the generation as a whole. After the first generation of strings is generated randomly, the process is repeated until the generation number reaches the selected number. The step size, ranges of the PID parameters, performance indices, fitness functions and method of handling time delay are read from a text file, while parameters for the GA, such as the number of generations and the probabilities of crossover, mutation, inversion and preselection, are selected through a menu. For time-delayed plants, Padé approximations of first to fourth order or the computer delay method are selectable. The simulation of the overall closed-loop system is accomplished using either the fourth-order Runge-Kutta numerical method or direct time-domain computation.

The GA processes populations of strings; hence the Kp, Ki and Kd of the conventional PID controller are represented by three 16-bit strings concatenated together to form a 48-bit string. For simplicity, unsigned binary coding is used and linearly scaled into the selected ranges of the three parameters. In the program, reproduction is implemented as a linear search through a roulette wheel with slots weighted in proportion to the fitness values of the members in the old generation. Crossover operates on every pair of members generated by reproduction: within the crossover operation, a random number between 0 and 1 is compared with the probability of crossover to determine whether a crossover is needed, and if a cross is called for, a crossing site is selected randomly between 1 and 47. Mutation, which operates on every member of the new generation, compares a randomly generated number between 0 and 1 with the probability of mutation to determine whether or not to alter a bit of the string. Inversion runs on every member of the current generation: a random number between 0 and 1 is compared with the probability of inversion to decide whether an inversion is required, and once it is required, two inversion positions are selected randomly between 1 and 48.
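The operator descriptions above can be mirrored in C roughly as follows for the 48-bit representation: roulette-wheel reproduction, single-point crossover at a site between 1 and 47, and bit-wise mutation. This is an illustrative reconstruction rather than the authors' TURBO C code, and the standard rand() stands in for TURBO C's random().

```c
#include <stdlib.h>

#define LBITS 48                           /* 3 x 16-bit parameters */

static double urand(void) { return (double)rand() / RAND_MAX; }

/* Roulette-wheel reproduction: select an index with probability
   proportional to its fitness value. */
int roulette(const double fit[], int n)
{
    double total = 0.0, acc = 0.0, spin;
    for (int i = 0; i < n; i++) total += fit[i];
    spin = urand() * total;
    for (int i = 0; i < n; i++) {
        acc += fit[i];
        if (acc >= spin) return i;
    }
    return n - 1;
}

/* Single-point crossover: with probability pc, swap the tails of two
   strings after a crossing site chosen randomly between 1 and 47. */
void crossover(char *a, char *b, double pc)
{
    if (urand() >= pc) return;
    int site = 1 + rand() % (LBITS - 1);
    for (int i = site; i < LBITS; i++) {
        char tmp = a[i]; a[i] = b[i]; b[i] = tmp;
    }
}

/* Mutation: flip each bit of the string with the small probability pm. */
void mutate(char *s, double pm)
{
    for (int i = 0; i < LBITS; i++)
        if (urand() < pm) s[i] = (s[i] == '1') ? '0' : '1';
}
```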
Also, according to the probability of preselection, preselection works randomly to use good offspring to replace some unfit parents. Besides these basic operations, a procedure called reserve is also implemented in the program, which reserves the member with the greatest fitness value in the old generation into the new generation. The procedure for the implementation is outlined below:

<1> Code the three parameters Kp, Ki and Kd into binary strings.
<2> Produce an initial generation of strings.
<3> Decode these strings into the three parameters Kp, Ki and Kd.
<4> Calculate the overall transfer function.
<5> Simulate the step response of the closed-loop system using the fourth-order Runge-Kutta method.
<6> Calculate the fitness function, which is a function of the chosen performance indices.
<7> Reproduce a new generation of 48-bit strings by roulette-wheel selection.
<8> Cross over pairs of members in the new generation; the site of crossover is generated randomly.
<9> Mutate every member of the new generation according to the probability of mutation.
<10> Invert every member of the new generation according to the probability of inversion.
<11> Preselect the current generation according to the probability of preselection.
<12> Reserve the member with the largest fitness value in the old generation into the new generation.
<13> Repeat <3>-<12> iteratively until the solution is acceptable (a loop skeleton is sketched at the end of this section).

In this program, different performance indices can be chosen, such as the integrals of different time-weighted squared errors. Different system requirements can be met easily by altering the combination of performance indices or by adding a few lines to change the fitness values of those members which do not fulfil the requirements. For the PC implementation, this GA program occupies around 20K of memory and takes, depending on the problem, from several minutes to a number of hours to run.
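The outlined procedure can be tied together in a main loop of roughly the following shape. This is a skeleton only: the population size, generation count and the helper routines decode48() and evaluate() are hypothetical placeholders for the parts described in the text, and the operator functions are those sketched above.

```c
#include <string.h>

#define NPOP  40                 /* assumed population size       */
#define NGEN  100                /* assumed number of generations */
#define LBITS 48

/* Hypothetical helpers standing in for steps <3>-<6> of the outline. */
extern void   decode48(const char *s, double *Kp, double *Ki, double *Kd);
extern double evaluate(double Kp, double Ki, double Kd);   /* returns J */
extern int    roulette(const double fit[], int n);
extern void   crossover(char *a, char *b, double pc);
extern void   mutate(char *s, double pm);
extern void   inversion(char *s, int nbits, double pv);

void ga_run(char pop[NPOP][LBITS], double pc, double pm, double pv)
{
    double fit[NPOP];
    char   best[LBITS], next[NPOP][LBITS];

    for (int gen = 0; gen < NGEN; gen++) {
        /* <3>-<6>: decode every string, simulate the loop, assign fitness */
        int ibest = 0;
        for (int i = 0; i < NPOP; i++) {
            double Kp, Ki, Kd;
            decode48(pop[i], &Kp, &Ki, &Kd);
            fit[i] = 1.0 / evaluate(Kp, Ki, Kd);
            if (fit[i] > fit[ibest]) ibest = i;
        }
        memcpy(best, pop[ibest], LBITS);          /* <12>: reserve the elite */

        /* <7>-<10>: reproduction, crossover, mutation, inversion */
        for (int i = 0; i < NPOP; i += 2) {
            memcpy(next[i],     pop[roulette(fit, NPOP)], LBITS);
            memcpy(next[i + 1], pop[roulette(fit, NPOP)], LBITS);
            crossover(next[i], next[i + 1], pc);
            mutate(next[i], pm);
            mutate(next[i + 1], pm);
            inversion(next[i], LBITS, pv);
            inversion(next[i + 1], LBITS, pv);
        }
        /* <11>: preselection against the parents is omitted here for brevity */
        memcpy(next[0], best, LBITS);             /* elite survives unchanged */
        memcpy(pop, next, sizeof(next));
    }
}
```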

Numerical Examples

In the computer environment stated above, a number of benchmark examples are simulated under different performance indices using the advanced GA program. The results are compared with those obtained using the ZN approach. The performance indices adopted are J0 (ISE), J1 (ITSE) and Jc with c_r = 1.0 and c_s = 1.0. The last one is used to compress excessive overshoot while retaining a short rise time and settling time. The fitness function (5) is employed for the GA. The following four examples are selected for the simulations.

Example 1

The first plant [10] is a fourth-order linear system with the transfer function:

$P_1(s) = 1/[(1+s)(1+s/6)^3]$   (6)

Relatively speaking, a linear system of high order is somewhat difficult to regulate.
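To illustrate how each candidate setting is evaluated, the C sketch below simulates the unit-step response of the Example 1 loop with the fourth-order Runge-Kutta method and accumulates the ISE index J0. It assumes the plant reconstructed above, 1/[(1+s)(1+s/6)^3], written as a cascade of first-order lags, and an ideal PID law; the function names, the fixed step size handling and the simple rectangular quadrature are illustrative choices, not the authors' program.

```c
/* States: x[0..3] = cascade of plant lags (output y = x[3]),
   x[4] = integral of the error e. */
static void deriv(const double x[5], double Kp, double Ki, double Kd,
                  double r, double dx[5])
{
    double e    = r - x[3];
    double dydt = 6.0 * (x[2] - x[3]);            /* derivative of the output  */
    double u    = Kp * e + Ki * x[4] - Kd * dydt; /* ideal PID; de/dt = -dy/dt */
    dx[0] = u - x[0];                             /* 1/(1+s)                   */
    dx[1] = 6.0 * (x[0] - x[1]);                  /* 1/(1+s/6)                 */
    dx[2] = 6.0 * (x[1] - x[2]);
    dx[3] = 6.0 * (x[2] - x[3]);
    dx[4] = e;                                    /* running integral of e     */
}

/* Unit-step response by fourth-order Runge-Kutta; returns the ISE index J0. */
double ise_example1(double Kp, double Ki, double Kd, double h, double tend)
{
    double x[5] = {0.0}, k1[5], k2[5], k3[5], k4[5], xt[5], J = 0.0;
    for (double t = 0.0; t < tend; t += h) {
        double e = 1.0 - x[3];
        J += e * e * h;                           /* rectangular rule for ISE  */
        deriv(x,  Kp, Ki, Kd, 1.0, k1);
        for (int i = 0; i < 5; i++) xt[i] = x[i] + 0.5 * h * k1[i];
        deriv(xt, Kp, Ki, Kd, 1.0, k2);
        for (int i = 0; i < 5; i++) xt[i] = x[i] + 0.5 * h * k2[i];
        deriv(xt, Kp, Ki, Kd, 1.0, k3);
        for (int i = 0; i < 5; i++) xt[i] = x[i] + h * k3[i];
        deriv(xt, Kp, Ki, Kd, 1.0, k4);
        for (int i = 0; i < 5; i++)
            x[i] += h / 6.0 * (k1[i] + 2.0 * k2[i] + 2.0 * k3[i] + k4[i]);
    }
    return J;
}
```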

Example 2

The second plant [10] is a second-order system with a small time delay. Its transfer function is:

(7)

The time delay is approximated using the fourth-order Padé formula.

Example 3

The third plant [2] is a third-order linear system, but with non-minimum phase. Its transfer function is:

P3(s)=(1-l.k)/(l

+). s3

(8)

The non-minimum-phase feature makes the system more difficult to control, especially during the initial stage of the transient.

Example 4

The fourth plant [10] is a second-order system with a large time delay. Its transfer function is:

$P_4(s) = 0.33\, e^{-18.5 s}/(134 s^2 + 18.5 s + 1)$   (9)

The large time delay makes the system tough to deal with; it is substantially a multi-phase system. The time delay for this plant is realised using the computer delay method.

For these four examples, the simulation results are shown as closed-loop step-response curves in Figure 2 - Figure 5 and as PID settings in Table 1 - Table 4. By comparing the results obtained through the ZN method and the GA based on ISE, it can be seen from the figures that the GA consistently performed better than the ZN method, i.e., its ISE was lower. From Table 1, Table 2 and Table 4, it can be recognized that the GA arrived at almost the same PID values as in [10], where the Quasi-Newton optimization technique was used. Furthermore, by using the mixed performance index, which compounds overshoot, rise time and settling time, the GA successfully suppressed the undesired overshoot and attained a short rise time as well as a satisfactory settling time. In this case, it should be noted that the performance index Jc worked more effectively in Examples 1 and 2 due to their structural simplicity.

Conclusions

This paper has suggested a novel fine-tuning technique for classical PID controllers based on advanced Genetic Algorithms. The design, implementation and testing of such GAs are discussed in detail and compared with those based on traditional optimization methods. The simulation results obtained demonstrate the efficiency and effectiveness of the proposed tuning technique. Further improvements may be made by including more advanced GA operators. The GA-based fine-tuning approach has many advantages, such as good robustness, simple mechanics, global optimization, bounded random exploration, a general information-driven property, large-group searching and mixed multi-objective criteria, as well as an intrinsic similarity to the natural world. One day, in the authors' opinion, GAs may become a favourite optimization technique in control engineering.

References

J. G. Ziegler and N. B. Nichols, "Optimum settings for automatic controllers", Trans. ASME, vol. 65, 1942, pp. 433-444.
C. C. Hang, K. J. Astrom and W. K. Ho, "Refinements of the Ziegler-Nichols tuning formula", IEE Proceedings Part D, vol. 138, no. 2, March 1991, pp. 111-118.
K. J. Astrom and T. Hagglund, "Automatic tuning of simple regulators with specifications on phase and amplitude margins", Automatica, vol. 20, 1984, pp. 645-651.
C. C. Hang, T. H. Lee and T. T. Tay, "The use of recursive parameter estimation as an auto-tuning aid", Proc. ISA Annual Conf., USA, 1984, pp. 387-396.
C. C. Hang, C. C. Lim and K. K. Sin, "On-line autotuning of PID controllers based on cross correlation", Proc. International Conference on Industrial Electronics, Singapore, 1988, pp. 441-446.
D. E. Goldberg, "Some applications of genetic algorithms", in Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, Reading, 1989, pp. 90-145.
R. B. Hollstien, "Artificial genetic adaptation in computer control systems", Ph.D. Thesis, University of Michigan, 1971.
C. Karr, "Genetic algorithms for fuzzy controllers", AI Expert, February 1991, pp. 26-33.
C. Karr, "Applying genetics", AI Expert, March 1991, pp. 38-43.
M. Zhuang and D. P. Atherton, "Tuning PID controllers with integral performance criteria", Proc. IEE International Conference Control '91, Edinburgh, UK, 1991, pp. 481-486.

Table 1. PID values for Example 1
(values: 4.605, 5.904, 6.576, 2.826, 2.907, 0.847, 3.201, 1.072, 1.216, 3.356)

Table 2. PID values for Example 2

Tuning\PID   ZN      GA(J0)   GA(J1)   GA(Jc)
Kp           1.677   1.444    1.366    1.066
Ki           2.043   2.265    1.733    1.232
Kd           0.330   0.579    0.305    0.012

Table 3. PID values for Example 3
(values: 0.785, 0.884, 0.900, 0.334, 0.334, 0.953, 0.727, 0.748)



Table 4. PID values for Example 4

Tuning\PID   ZN      GA(J0)   GA(J1)   GA(Jc)
Kp           2.901   2.612    2.513    2.359
Ki           0.080   0.139    0.125    0.112
Kd           26.43   38.23    25.99    27.25

Figure 5. Closed-loop step responses of Example 4 (output vs. time; curves: ZN, GA(J0), GA(J1), GA(Jc))

Figure 2. Closed-loop step responses of Example 1 (output vs. time; curves: ZN, GA(J0), GA(J1), GA(Jc))

Figure 3. Closed-loop step responses of Example 2 (output vs. time)

Figure 4. Closed-loop step responses of Example 3 (output vs. time; curves: ZN, GA(J0), GA(J1), GA(Jc))


