
DESIGN OPTIMIZATION

CONSTRAINED
MINIMIZATION III
Functions of N Variables
Ranjith Dissanayake
Structures Laboratory
Dept. of Civil Engineering
Faculty of Engineering
University of Peradeniya

VR&D

CONSTRAINED MINIMIZATION

Find the Set of Design Variables that will

Minimize F(X)                                Objective Function

Subject to;

g_j(X) ≤ 0,        j = 1, M                  Inequality Constraints

h_k(X) = 0,        k = 1, L                  Equality Constraints

X_i^L ≤ X_i ≤ X_i^U,   i = 1, N              Side Constraints

Kuhn-Tucker Conditions

1. X* is Feasible

2. λ_j g_j(X*) = 0,   j = 1, M

3. ∇F(X*) + Σ_{j=1}^{M} λ_j ∇g_j(X*) + Σ_{k=M+1}^{M+L} λ_k ∇h_k(X*) = 0

   λ_j ≥ 0,   j = 1, M
   λ_k Unrestricted in Sign,   k = M+1, M+L
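The three conditions can be checked numerically at a candidate optimum. The sketch below uses a small illustrative problem of my own (not from the slides): minimize F = x1² + x2² subject to g = 1 − x1 − x2 ≤ 0, whose optimum is X* = (0.5, 0.5) with λ = 1.

```python
# Numerical check of the Kuhn-Tucker conditions for a small example:
#   minimize  F(X) = x1^2 + x2^2
#   subject to g(X) = 1 - x1 - x2 <= 0
# Optimum: X* = (0.5, 0.5), Lagrange multiplier lambda = 1.

def grad_F(x):          # gradient of the objective
    return [2 * x[0], 2 * x[1]]

def g(x):               # inequality constraint, g(X) <= 0
    return 1.0 - x[0] - x[1]

def grad_g(x):          # constraint gradient (constant here)
    return [-1.0, -1.0]

def kkt_residuals(x, lam, tol=1e-8):
    """Return (stationarity residual, complementary slackness, feasibility)."""
    stat = [dF + lam * dg for dF, dg in zip(grad_F(x), grad_g(x))]
    comp = lam * g(x)
    feas = g(x) <= tol
    return stat, comp, feas

x_star, lam_star = [0.5, 0.5], 1.0
stat, comp, feas = kkt_residuals(x_star, lam_star)
# stat -> [0.0, 0.0], comp -> 0.0, feas -> True: all three conditions hold
```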

EXAMPLE

A Simple Cantilevered Beam

[Figure: cantilevered beam with tip load P = 2250 N and length
L = 500 cm; rectangular cross section of width B and height H]

Problem Statement

Find B and H to Minimize V = BHL
Subject to;

σ = Mc/I ≤ 700

δ = PL^3/3EI ≤ 2.54

H/B ≤ 12

1.0 ≤ B
20.0 ≤ H ≤ 50.0
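A minimal sketch of the analysis for this example, assuming consistent N/cm units and a Young's modulus of E = 2.0e7 N/cm² (200 GPa, the value given later for the tapered-beam example; it is not stated on this slide). Constraints are written in the normalized g(X) ≤ 0 form the optimizers expect.

```python
# Sketch of the analysis for the cantilevered-beam example.
# Assumption not on this slide: E = 2.0e7 N/cm^2 (200 GPa).

P, L, E = 2250.0, 500.0, 2.0e7

def analyze(B, H):
    """Return objective V and constraints written as g(X) <= 0."""
    I = B * H**3 / 12.0               # moment of inertia
    V = B * H * L                     # volume (objective)
    sigma = P * L * (H / 2.0) / I     # bending stress M*c/I at the root
    delta = P * L**3 / (3.0 * E * I)  # tip deflection
    g = [
        sigma / 700.0 - 1.0,          # stress limit, normalized
        delta / 2.54 - 1.0,           # deflection limit, normalized
        H / (12.0 * B) - 1.0,         # aspect-ratio limit H/B <= 12
    ]
    return V, g

V, g = analyze(B=5.0, H=50.0)
# A feasible design has every g[j] <= 0.
```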

Design Space

[Figure: design space in the B-H plane with volume contours V = 5,000,
10,000, 15,000 and 20,000, and constraint boundaries H/B = 12, σ = 700
and H = 50. The optimum lies at the intersection of σ = 700 and H = 50.]

Sequential Linear Programming (SLP)

Algorithm
Linearize the Objective and Constraints
Solve the Linear Approximation Using the Simplex or Other Good Algorithm
Iterate Until Convergence to an Optimum

Linearization

F(X) ≈ F(X^0) + ∇F(X^0)^T δX

g_j(X) ≈ g_j(X^0) + ∇g_j(X^0)^T δX,   j = 1, M

Move Limits are Essential
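A small illustration of why the move limits matter: the first-order Taylor approximation is only trustworthy near X^0, so each LP step must be confined to a box around X^0. The function below is my own toy example, not from the slides.

```python
# The linearization F~(X) = F(X0) + grad_F(X0)'(X - X0) degrades as the
# step from X0 grows -- the reason SLP needs move limits.

def F(x):
    return x[0]**2 + x[1]**2

def grad_F(x):
    return [2 * x[0], 2 * x[1]]

def F_lin(x, x0):
    """First-order Taylor (linearized) approximation about x0."""
    g0 = grad_F(x0)
    return F(x0) + sum(gi * (xi - x0i) for gi, xi, x0i in zip(g0, x, x0))

x0 = [3.0, 4.0]
small = [3.1, 4.1]   # inside a tight move limit
large = [6.0, 8.0]   # far outside any reasonable move limit

err_small = abs(F(small) - F_lin(small, x0))   # tiny
err_large = abs(F(large) - F_lin(large, x0))   # the approximation broke down
```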

One Stage of the Process

[Figure: one SLP stage in the beam design space; starting from X^0, the
linearized problem is solved, giving an approximate optimum that lies
away from the true optimum.]

Why Move Limits are Needed

[Figure: the X1-X2 plane showing F = constant, the boundary g = 0, and
the linearized F and linearized g at X^0; without move limits the
optimum of the linearized problem lies far from the true optimum, while
move limits box the approximate optimum in near X^0.]

Sequential Linear Programming

Features
Easy to Program
Move Limits Must be Reduced as the Optimization Progresses to Ensure
Convergence for Those Cases Where There are Fewer Active Constraints
Than Design Variables at the Optimum (Underconstrained)
SLP is Not Considered to be a Good Method by the Theoreticians
Experience has Shown that SLP is Powerful and Reliable if Coded with Care
Good Method for Parallel Processor Applications
Does Not Use a One-Dimensional Search

The Method of Feasible Directions


Originally Developed by Zoutendijk in 1960
Contained in the CONMIN and ADS Programs
Has the Feature that it will Rapidly Find a Near
Optimum Design
Used for Inequality Constrained Problems Only


Optimization Process
1. Begin with an Initial Candidate Design, X^0. Set the Iteration
   Counter, q = 0
2. Call the Analysis to Evaluate F(X) and g_j(X), j = 1, M
3. Set q = q + 1. Call the Sensitivity Analysis to Evaluate ∇F(X) and
   ∇g_j(X), j ∈ J, where J is the Set of Active and Violated Constraints

   g_j(X) is Active if g_j(X) > CT (Typically CT = -0.05)
   g_j(X) is Violated if g_j(X) > CTMIN (Typically CTMIN = 0.001)

Optimization Process Cont.

4. Calculate the Search Direction, S^q
5. Perform the One-Dimensional Search in Direction S^q
   This will Require Several Analyses
6. Check for Convergence to the Optimum. If Satisfied, Terminate.
   Else go to Step 3

Optimization Process Flow

[Flowchart: INPUT X^0, q = 0 → ANALYSIS: Calculate F(X) and g_j(X),
j = 1, M → Identify Active and Violated Constraints → SENSITIVITY
ANALYSIS → q = q + 1, Calculate Search Direction S^q → Perform the
One-Dimensional Search (ANALYSIS), updating X^q = X^(q-1) + α S^q →
Check for Convergence to the Optimum → if not Satisfied, loop back to
the Sensitivity Analysis; if Satisfied, EXIT]

Active Constraint Strategy

Constraint g_j(X) is Considered Active if
   g_j(X) > CT
Initially, CT = -0.05 to Trap Almost-Active Constraints
CT is Reduced During the Optimization Until CT = -CTMIN

Constraint g_j(X) is Considered Violated if
   g_j(X) > CTMIN
Usually, CTMIN = 0.001
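The bookkeeping above can be sketched in a few lines, using the typical values CT = -0.05 and CTMIN = 0.001:

```python
# Classify constraints as active or violated, per the strategy above.

def classify(g_values, CT=-0.05, CTMIN=0.001):
    """Return index sets of active and violated constraints.

    A constraint is active if g_j > CT (at, or nearly at, its boundary)
    and violated if g_j > CTMIN.
    """
    active = [j for j, g in enumerate(g_values) if g > CT]
    violated = [j for j, g in enumerate(g_values) if g > CTMIN]
    return active, violated

g = [-0.30, -0.02, 0.01]   # well satisfied, nearly active, violated
active, violated = classify(g)
# active -> [1, 2]; violated -> [2]
```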

Active Constraint Strategy

[Figure: bands around the boundary g_j(X) = 0 in the X1-X2 plane; the
feasible region with g_j(X) < CT is inactive, the band
CT < g_j(X) < CTMIN is active, and g_j(X) > CTMIN is violated
(infeasible).]

Gradient (Sensitivity) Calculations

By First Forward Finite Difference

∇F(X) ≈ [ ( F(X + δX_i e_i) - F(X) ) / δX_i ],   i = 1, N

where e_i is the i-th coordinate direction and δX_i is a small step in
design variable i

Central Difference Gradients are More Reliable, but Twice as Expensive
to Calculate
If Analytic Gradients are Available, They Should Always be Used
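The forward/central trade-off above can be seen directly: central differences cost 2N function calls instead of N, but their truncation error is O(δX²) rather than O(δX). The test function is an arbitrary illustration.

```python
# Forward- vs central-difference gradients.

def grad_forward(F, x, dx=1e-6):
    f0 = F(x)
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += dx
        g.append((F(xp) - f0) / dx)     # N extra calls, error O(dx)
    return g

def grad_central(F, x, dx=1e-6):
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += dx
        xm = list(x); xm[i] -= dx
        g.append((F(xp) - F(xm)) / (2 * dx))  # 2N calls, error O(dx^2)
    return g

F = lambda x: x[0]**3 + x[0] * x[1]   # analytic gradient: [3x^2 + y, x]
x = [2.0, 1.0]
gf = grad_forward(F, x)               # close to [13, 2]
gc = grad_central(F, x)               # closer still to [13, 2]
```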

Calculating Search Direction, S^q

If No Constraints are Active or Violated

If q = 1, Use the Steepest Descent Direction

   S^q = -∇F(X^(q-1))

If q > 1, Use the Fletcher-Reeves Conjugate Direction

   S^q = -∇F(X^(q-1)) + β S^(q-1)

where

   β = ‖∇F(X^(q-1))‖² / ‖∇F(X^(q-2))‖²

Restart with Steepest Descent Every N Iterations or When Progress is
Slow
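A minimal sketch of this update on a 2-variable convex quadratic of my own choosing: with exact line searches, the conjugate directions reach the minimum in at most N searches.

```python
# Steepest descent on iteration 1, Fletcher-Reeves afterwards,
# applied to F = x1^2 + 10*x2^2 (Hessian diag(2, 20)).

def grad(x):
    return [2 * x[0], 20 * x[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

x = [10.0, 1.0]
S_prev, g_prev = None, None
for q in range(2):                                  # N = 2 variables
    g = grad(x)
    if S_prev is None:
        S = [-gi for gi in g]                       # steepest descent
    else:
        beta = dot(g, g) / dot(g_prev, g_prev)      # Fletcher-Reeves
        S = [-gi + beta * si for gi, si in zip(g, S_prev)]
    HS = [2 * S[0], 20 * S[1]]                      # Hessian * S
    alpha = -dot(g, S) / dot(S, HS)                 # exact line search
    x = [xi + alpha * si for xi, si in zip(x, S)]
    S_prev, g_prev = S, g
# After N = 2 exact searches, x is at the minimum (0, 0).
```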

Calculating Search Direction, S^q

If There are Active Constraints

Solve a Sub-Problem to Find the Components of S^q and the Value of β
that will

Maximize β
Subject to;

   ∇F(X^(q-1))^T S^q + β ≤ 0                  S^q is Usable

   ∇g_j(X^(q-1))^T S^q + θ_j β ≤ 0,  j ∈ J    S^q is Feasible

   S^q · S^q ≤ 1                              S^q is Bounded

Where J is the Set of Active Constraints and θ_j is Called the
Push-Off Factor

The Push-Off Factor θ_j

As a Constraint Just Becomes Active, Allow the Search to Follow the
Constraint
As the Constraint Becomes More Active or Violated, Push Harder
For g_j(X) > CT

   θ_j = [ 1 - g_j(X^(q-1)) / CT ]²

Thus, θ_j is a Quadratic Function of the Constraint Value
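The behavior described above falls straight out of the formula: θ_j is zero when the constraint just becomes active (g_j = CT), grows to one on the boundary, and keeps growing quadratically as the violation increases.

```python
# The push-off factor theta_j = [1 - g_j/CT]^2 with the typical CT = -0.05.

def push_off(g_j, CT=-0.05):
    return (1.0 - g_j / CT) ** 2

just_active = push_off(-0.05)   # 0.0 -> follow the constraint
on_boundary = push_off(0.0)     # 1.0 -> push away
violated = push_off(0.05)       # 4.0 -> push much harder
```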

Calculating Search Direction, S^q

Note that

   ∇F^T S = ‖∇F‖ ‖S‖ cos ψ

And

   ∇g_j^T S = ‖∇g_j‖ ‖S‖ cos ψ

Where ψ is the Angle Between the Two Vectors

Therefore, for S to be Both Usable and Feasible, ψ Must be Between 90°
and 270°

Solving for S^q is a Sub-Optimization Task
Details are Beyond the Scope of This Discussion

The Effect of θ_j on S^q

[Figure: a point on the boundary g_1 = 0 in the X1-X2 plane, with the
contour F = constant and a second boundary g_2 = 0; with θ = 0 the
direction S is tangent to g_1 = 0, while θ = 1 pushes S away from the
constraint boundary.]

Calculating Search Direction, S^q

If There are Violated Constraints

Solve a Sub-Problem to Find the Components of S^q and the Value of β
that will

Minimize ∇F(X^(q-1))^T S^q - Φ β              S^q is Usable
Subject to;

   ∇g_j(X^(q-1))^T S^q + θ_j β ≤ 0,  j ∈ J    S^q is Feasible

   S^q · S^q ≤ 1                              S^q is Bounded

Where J is the Set of Active Constraints, θ_j is the Push-Off Factor
and Φ is a Large Positive Number

Search Direction at Different Points in the Design Space

[Figure: the X1-X2 plane with contours F(X) = constant and the boundary
g_j ≤ 0; the search direction S^q at an interior feasible point follows
-∇F, at a point on the boundary it moves along the constraint, and at an
infeasible point it points back toward the feasible region and the
optimum.]

The Search Process

[Figure: the beam design space (B-H plane) with volume contours
V = 5,000 to 20,000 and constraints H/B = 12, σ = 700 and H = 50;
successive directions S^1 and S^2 from X^0 lead toward the optimum.]

Modified Method of Feasible Directions

Very Similar to the Method of Feasible Directions
Also Very Similar to the Generalized Reduced Gradient Method
   Does not Push Away From Active Constraints
   Follows Curved Constraints
This Method is Used by the DOT Optimizer from VR&D

Calculating Search Direction, S^q

If There are Active Constraints

Solve a Sub-Problem to Find the Components of S^q that will

Minimize ∇F(X^(q-1))^T S^q                    S^q is Usable
Subject to;

   ∇g_j(X^(q-1))^T S^q ≤ 0,  j ∈ J            S^q is Feasible

   S^q · S^q ≤ 1                              S^q is Bounded

Where J is the Set of Active Constraints

Calculating Search Direction, S^q

If There are Violated Constraints

Solve a Sub-Problem to Find the Components of S^q and the Value of β
that will

Minimize ∇F(X^(q-1))^T S^q - Φ β              S^q is Usable
Subject to;

   ∇g_j(X^(q-1))^T S^q + θ_j β ≤ 0,  j ∈ J    S^q is Feasible

   S^q · S^q ≤ 1                              S^q is Bounded

Where J is the Set of Active Constraints, θ_j is the Push-Off Factor
and Φ is a Large Positive Number

Search Direction at Different Points in the Design Space

[Figure: the same X1-X2 design space as before, with F(X) = constant
contours and the boundary g_j ≤ 0, showing the S^q directions produced
at feasible, boundary and infeasible points.]

The One-Dimensional Search

Following Curved (Nonlinear) Constraints

[Figure: steps α_1 S and α_2 S along the search direction from X^1 in
the X1-X2 plane; each step drifts off the curved boundary g = 0 and
must be corrected back to it.]

The One-Dimensional Search

Following Curved (Nonlinear) Constraints
Move Parallel to the Constraint Gradient Back to the Constraint
Boundary

Minimize δX^T δX
Subject to;

   g_j(X + αS) + ∇g_j(X)^T δX = 0

This is a Simple Sub-Problem
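For a single constraint the sub-problem has a closed form: the minimum-length δX satisfying the linearized condition is δX = -g(X) ∇g / ‖∇g‖². Repeating the step is the Newton-Raphson return to the boundary. The circular constraint below is my own illustration.

```python
# One Newton-Raphson correction back to a curved constraint boundary.

def newton_return(x, g, grad_g):
    gx = g(x)
    dg = grad_g(x)
    nrm2 = sum(d * d for d in dg)
    # delta_X = -g(X) * grad_g / |grad_g|^2, the minimum-length correction
    return [xi - gx * di / nrm2 for xi, di in zip(x, dg)]

# Example: circular constraint g(X) = x1^2 + x2^2 - 1 <= 0
g = lambda x: x[0]**2 + x[1]**2 - 1.0
grad_g = lambda x: [2 * x[0], 2 * x[1]]

x = [1.1, 0.0]               # drifted outside the boundary during the search
for _ in range(3):           # a few corrections converge to g = 0
    x = newton_return(x, g, grad_g)
# x is now very close to the boundary point (1, 0).
```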

The Search Process

[Figure: the beam design space again, with volume contours V = 5,000 to
20,000 and constraints H/B = 12, σ = 700 and H = 50; directions S^1,
S^2 and S^3 from X^0 follow the curved σ = 700 boundary toward the
optimum.]

Modified Method of Feasible Directions

Features
Rapidly Finds a Near-Optimum Design
Deals With Equality Constraints by Using Two Equal and Opposite
Inequality Constraints
Usually Obtains More Precise Constraint Satisfaction than the Method of
Feasible Directions
   Due to the Repeated Newton-Raphson Iterations Back to the Constraint
   Boundaries
Widely Used in the DOT Optimizer

Sequential Quadratic Programming (SQP)

Basic Concept
Create a Quadratic Approximation to the Lagrangian
Create Linear Approximations to the Constraints
Solve the Quadratic Problem for the Search Direction, S
Perform the One-Dimensional Search with Penalty Functions to Avoid
Constraint Violations

The Search Direction, S^q

Minimize

   Q(S) = F(X) + ∇F(X)^T S + (1/2) S^T B S

Subject to;

   ∇g_j(X)^T S + δ g_j(X) ≤ 0,   j = 1, M

   ∇h_k(X)^T S + δ h_k(X) = 0,   k = 1, L

Where, Typically, δ = 0.9 if the Constraint is Violated and δ = 1.0
Otherwise
δ is Used to Overcome Constraint Violations
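A minimal instance of this direction-finding sub-problem, under simplifying assumptions of my own: B = I and a single linearized equality constraint a'S + c = 0. Stationarity of the Lagrangian gives S = -∇F - λa; substituting into the constraint yields λ = (c - a'∇F) / (a'a), so the step is explicit.

```python
# Closed-form solution of the QP sub-problem with B = I and one
# linearized equality constraint (an assumption for illustration):
#   minimize  gF'S + 0.5 S'S   subject to  a'S + c = 0

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def qp_step(gF, a, c):
    lam = (c - dot(a, gF)) / dot(a, a)          # multiplier
    S = [-gi - lam * ai for gi, ai in zip(gF, a)]  # stationary point
    return S, lam

gF = [2.0, 0.0]           # objective gradient at the current point
a, c = [1.0, 1.0], -1.0   # linearized constraint: S1 + S2 - 1 = 0
S, lam = qp_step(gF, a, c)
# S satisfies the linearized constraint exactly: S1 + S2 = 1.
```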

The One-Dimensional Search

Minimize the Exterior Penalty Function

P(X + αS) = F(X + αS)
            + R [ Σ_{j=1}^{M} λ_j Max[0, g_j(X + αS)]
                + Σ_{k=1}^{L} λ_{M+k} |h_k(X + αS)| ]

Where the λ_j are the Lagrange Multipliers from the Quadratic
Sub-Problem and R is a Large Positive Constant
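A sketch of evaluating such a penalty at a trial step, with the large constant R folded into the multipliers for simplicity (an assumption of this sketch). Inside the feasible region the penalty reduces to F, so feasible progress is never penalized.

```python
# Exterior penalty: objective plus multiplier-weighted violations.

def penalty(F_val, g_vals, h_vals, lam_g, lam_h):
    P = F_val
    P += sum(l * max(0.0, g) for l, g in zip(lam_g, g_vals))  # inequalities
    P += sum(l * abs(h) for l, h in zip(lam_h, h_vals))       # equalities
    return P

# Feasible point: penalty equals the objective.
P_feas = penalty(10.0, g_vals=[-0.2], h_vals=[0.0],
                 lam_g=[3.0], lam_h=[5.0])
# Infeasible point: each violation is charged at its multiplier.
P_infeas = penalty(10.0, g_vals=[0.1], h_vals=[-0.02],
                   lam_g=[3.0], lam_h=[5.0])
# P_feas -> 10.0 ; P_infeas -> 10.0 + 3*0.1 + 5*0.02 = 10.4
```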

The Hessian Matrix, B

Initially Set B to the Identity Matrix, I
Update B Using the BFGS Algorithm

   B_new = B - (B p p^T B) / (p^T B p) + (η η^T) / (p^T η)

Where

   p = X^q - X^(q-1)

   η = θ y + (1 - θ) B p

   y = ∇_x P(X^q, λ^q) - ∇_x P(X^(q-1), λ^(q-1))

   θ = 1.0                                if p^T y ≥ 0.2 p^T B p
   θ = 0.8 p^T B p / (p^T B p - p^T y)    if p^T y < 0.2 p^T B p
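The damped update above can be sketched for small dense matrices with plain Python lists. When the damping is not triggered (θ = 1, so η = y), the updated matrix satisfies the secant condition B_new p = y; the damping branch keeps B positive definite when p'y is too small.

```python
# Powell's damped BFGS update for a small dense B.

def mat_vec(B, p):
    return [sum(B[i][j] * p[j] for j in range(len(p))) for i in range(len(B))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def bfgs_update(B, p, y):
    Bp = mat_vec(B, p)
    pBp, py = dot(p, Bp), dot(p, y)
    # damping: keep B positive definite when p'y is too small
    theta = 1.0 if py >= 0.2 * pBp else 0.8 * pBp / (pBp - py)
    eta = [theta * yi + (1.0 - theta) * Bpi for yi, Bpi in zip(y, Bp)]
    peta = dot(p, eta)
    n = len(p)
    return [[B[i][j] - Bp[i] * Bp[j] / pBp + eta[i] * eta[j] / peta
             for j in range(n)] for i in range(n)]

B = [[1.0, 0.0], [0.0, 1.0]]   # start from the identity
p = [1.0, 0.5]                 # step X^q - X^(q-1)
y = [2.0, 10.0]                # change in the Lagrangian gradient
B_new = bfgs_update(B, p, y)
# No damping triggers here, so B_new satisfies the secant condition
# B_new * p = y.
```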

The Algorithm
1. Initialize B = I
2. Calculate Gradients of the Objective and all
Constraints
3. Solve the Quadratic Programming Sub-Problem
4. Calculate the Lagrange Multipliers
5. Search Using the Exterior Penalty Function
6. Update the B Matrix
7. Check for Convergence. If Satisfied, Exit. Else
Repeat from Step 2

The Search Process

[Figure: the beam design space once more, with volume contours
V = 5,000 to 20,000 and constraints H/B = 12, σ = 700 and H = 50; SQP
directions S^1 and S^2 from X^0 reach the neighborhood of the optimum
in two steps.]

Sequential Quadratic Programming

Features
Strong Theoretical Basis in the Kuhn-Tucker Conditions
Considered Best by the Theoreticians
May Cut Off the Feasible Region
   Modifications Required
Several Modifications Have Been Made to Improve Reliability for
Engineering Problems

Testing For Convergence

Termination Criteria
Maximum Number of Iterations, ITMAX
   Any Iterative Process Must Have this Test
Satisfaction of the Kuhn-Tucker Conditions
   No Usable-Feasible Search Direction can be Found
Diminishing Returns
   Absolute Change in the Objective for ITRM Iterations

      |F(X^q) - F(X^(q-1))| ≤ DABOBJ

   Relative Change in the Objective for ITRM Iterations

      |F(X^q) - F(X^(q-1))| / |F(X^(q-1))| ≤ DELOBJ

No Feasible Solution can be Found
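The diminishing-returns test can be sketched as below: stop when the change in the objective stays below tolerance for ITRM consecutive iterations. Requiring both tests to fail before counting an iteration as "progress" is one reasonable reading of the criteria; the parameter names follow the slide.

```python
# Diminishing-returns convergence test on a history of objective values.

def converged(history, DABOBJ=1e-4, DELOBJ=1e-3, ITRM=2):
    """history: objective values F(X^0), F(X^1), ... in iteration order."""
    if len(history) < ITRM + 1:
        return False
    for q in range(len(history) - ITRM, len(history)):
        dF = abs(history[q] - history[q - 1])
        rel = dF / max(abs(history[q - 1]), 1e-10)
        if dF > DABOBJ and rel > DELOBJ:
            return False          # a recent iteration still made progress
    return True

hist = [100.0, 60.0, 55.0, 54.99995, 54.99991]
# The last ITRM = 2 changes are below tolerance -> converged.
```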

Example

Ten-Variable Tapered Beam

[Figure: cantilevered beam of tapered segments with tip load
P = 50,000 N and length L = 500 cm; rectangular cross section of width
B and height H in each segment]

E = 200 GPa

σ ≤ 14,000 N/cm²

δ ≤ 2.54 cm

Optimization Results

Methods
1. Augmented Lagrange Multiplier Method (ALM)
2. Sequential Linear Programming (SLP)
3. Method of Feasible Directions (MFD)
4. Modified Method of Feasible Directions (MMFD)
5. Sequential Quadratic Programming (SQP)

Optimization Results

Method    Optimum    Iterations    Function Evaluations
ALM       65,678     -             533
SLP       65,398     10            110
MFD       65,411     11            140
MMFD      65,399     11            170
SQP       65,410     -             106

Black Box Optimization

Useful for > 90% of Everyday Design Tasks

Approach Used by VisualDOC
Read Input and Write Output From/To ASCII Files
Use VisualScript to Couple Your Code with VisualDOC
Identify the Design Variables, Objective and Constraints
Perform the Design Study
   Gradient or Non-Gradient Based Optimization
   Response Surface Optimization
   Design of Experiments
Post-Process to Study Design Changes

Summary of General Optimization

Black Box Optimization is Useful for Many Everyday Design Tasks
No Special Knowledge is Needed to Use Modern Optimization Tools
Some Theoretical Understanding Helps to Make Effective Use of
Optimization
The Optimum Found is Only as Reliable as the Design Criteria and
Analysis
Numerical Optimization is the Most Powerful Design Assistance Tool
Available Today
