Linear programming:
- Concept
- Feasible point method
- Geometry of LP
- Defining the problem
- The Simplex method
- Network problem

Non-Linear Optimisation:
- Linear equality constraints
- Linear inequality constraints
- Nonlinear constraints

Mixed-integer optimisation

Assessment

10/11/2013
Lecture 1
Dr. Franjo Cecelja

- Motivating example
- Decision making
- Curve fitting
- Optimality & Optimisation
- Convexity
- Optimisation Algorithms
Motivating Example
Decision Making vs Optimisation: a weekly resource allocation problem. The company produces and sells three types of computers: Alpha, Beta and Gamma.

Computer type         Alpha   Beta   Gamma
Profit ($) per unit   350     470    610
Assembly time (hrs)   10      15     20
Assembly line         A       A      B

Testing of each computer (1 hr) is included in the assembly time.
Line A availability (Alpha and Beta computers assembled): 120 hrs/week.
Line B availability (Gamma computers assembled): 48 hrs/week.
Production constraint: 2000 hrs/week of labour for assembling all computers.
The company wishes to allocate these resources to maximise profit from all computers (assuming all produced computers are sold).
Motivating Example
Decision Making: decide how many of each computer (Alpha, Beta and Gamma) to assemble every week, within the limits of the resources, to assure maximum profit (the manager's decision)!
Optimisation: mathematically describe the problem (model the problem), find the solution (solve the problem) and use it for making the decision.

1. Formulate mathematically. Let x1, x2 and x3 be the numbers of Alpha, Beta and Gamma computers assembled per week:

    maximise  z = 350x1 + 470x2 + 610x3
    s.t.      10x1 + 15x2 ≤ 120
              20x3 ≤ 48
              10x1 + 15x2 + 20x3 ≤ 2000
              x1, x2, x3 ≥ 0
Motivating Example
2. Calculate the solution. Using a manual (graphical) approach, or GAMS or a similar solver, the maximum profit of z = $5,664 is obtained by assembling 12 Alpha computers and 2.4 Gamma computers every week (12·350 + 2.4·610 = 5,664).
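The allocation result above can be reproduced with an off-the-shelf LP solver, for example SciPy's linprog. This is a sketch, not part of the lecture: linprog minimises, so the profit coefficients are negated.

```python
from scipy.optimize import linprog

# Profit to maximise: z = 350*x1 + 470*x2 + 610*x3 (linprog minimises, so negate)
c = [-350, -470, -610]
A_ub = [[10, 15,  0],   # line A: 10*x1 + 15*x2 <= 120
        [ 0,  0, 20],   # line B: 20*x3 <= 48
        [10, 15, 20]]   # labour: total assembly hours <= 2000
b_ub = [120, 48, 2000]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x)       # ~ [12.  0.  2.4]
print(-res.fun)    # ~ 5664.0
```

Note that the LP optimum is fractional (2.4 Gamma computers); requiring whole computers would turn this into the mixed-integer problem mentioned in the course outline.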
Decision Making

Decision making is "a process of choosing among alternatives for the purpose of attaining a goal or goals" (Turban et al.).
What do managers do? Planning: What? When? How? Where? By whom?
What do designers do?
Decision process: Intelligence; Design; Choice; Implementation.
The decision process starts from Reality (via simplification and assumptions) and moves through four phases:

Intelligence Phase: a problem occurs because of dissatisfaction with the current situation; define the problem and decompose it.

Design Phase: finding or developing and analysing possible courses of action: formulate a model, validate the model, set the criteria of choice, identify alternatives, predict and measure outcomes. The critical act of decision making: solving the model is not the same as solving the problem. Multiple goal analysis.

Choice Phase: solution of the model, sensitivity analysis, selection of the best (good) alternative, plan for implementation; verification and testing of the proposed solution.

Implementation of the solution: the most difficult phase, the introduction of change, which must be managed towards user expectations. The outcome, SUCCESS or FAILURE, feeds back to the earlier phases.
Curve Fitting

Curve fitting is the process of constructing a curve, or mathematical function, that has the best fit to a series of data points, possibly subject to constraints.
Quadratic fitting:

    b(t) = 21 + 15t − 2t²

(Plot: b(t) versus t.)
The quadratic fitting: the most commonly used fitting method to represent continuous (polynomial) fitting. Fitting b(t) = x1 + x2·t + x3·t² through the points at t = 2, 3, 5 gives the linear system

    | 1  2   4 | | x1 |   | 43 |
    | 1  3   9 | | x2 | = | 48 |
    | 1  5  25 | | x3 |   | 46 |

whose solution is x1 = 21, x2 = 15, x3 = −2.

For a general case with data points (t_i, b_i), i = 1, …, m, there are more equations than unknowns and the system cannot, in general, be satisfied exactly.

Least square method: the residual error for the quadratic function is defined as

    r_i = b_i − (x1 + x2·t_i + x3·t_i²)

The least square method is then defined as minimising the sum of squared residual errors:

    minimise  Σ_i r_i²
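A quick numerical check of the quadratic fit above, sketched with NumPy; the data values are computed from b(t) = 21 + 15t − 2t²:

```python
import numpy as np

# Points generated from b(t) = 21 + 15t - 2t^2 at t = 2, 3, 5
t = np.array([2.0, 3.0, 5.0])
b = 21 + 15 * t - 2 * t**2        # [43., 48., 46.]

# Least-squares fit of a degree-2 polynomial; exact here (3 points, 3 coefficients)
coef = np.polyfit(t, b, deg=2)    # highest power first: [x3, x2, x1]
print(coef)                       # ~ [-2. 15. 21.]
```

With more than three data points the same call returns the least-squares coefficients rather than an exact interpolant.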
Optimality & Optimisation
In general, constraints take the form

    g_i(x) = 0, i ∈ E   (equality constraints)
    g_i(x) ≥ 0, i ∈ I   (inequality constraints)

The point(s) that satisfy these constraints are feasible point(s).

Example: Is the point x = 2 feasible for the following constraints?

    s.t.  2x − 3 = 0
          x ≥ 0

Answer: The point x = 2 is not a feasible point because the first constraint, 2·2 − 3 = 1 ≠ 0, is not satisfied. The point x = 1.5, however, is the only feasible point because 2·1.5 − 3 = 0 and 1.5 ≥ 0.

The set S of all feasible points is the feasible region or feasible set. For a constraint g_i(x) ≥ 0, a point x̄ for which g_i(x̄) = 0 is binding or active, and one for which g_i(x̄) > 0 is non-binding or non-active.
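The feasibility and binding/active tests can be sketched in a few lines of Python. The constraints are the ones from the example above; the helper names are illustrative, not from the lecture:

```python
def is_feasible(x, eq=(), ineq=(), tol=1e-9):
    """x is feasible if every equality g(x) = 0 and every inequality g(x) >= 0 holds."""
    return all(abs(g(x)) <= tol for g in eq) and all(g(x) >= -tol for g in ineq)

def active_inequalities(x, ineq, tol=1e-9):
    """Indices of inequality constraints that are binding (g(x) = 0) at x."""
    return [i for i, g in enumerate(ineq) if abs(g(x)) <= tol]

# Constraints from the example: 2x - 3 = 0 and x >= 0
eq = [lambda x: 2 * x - 3]
ineq = [lambda x: x]

print(is_feasible(2.0, eq, ineq))      # False: 2*2 - 3 = 1 != 0
print(is_feasible(1.5, eq, ineq))      # True
print(active_inequalities(1.5, ineq))  # []: x = 1.5 > 0, so x >= 0 is non-binding
```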
Example: the feasible region in (x1, x2, x3) space defined by

    s.t.  g1(x) = x1 + 2x2 + 3x3 ≤ 6
          g2(x) = x1 ≥ 0
          g3(x) = x2 ≥ 0
          g4(x) = x3 ≥ 0

(Figure: the feasible region is the tetrahedron bounded by the plane x1 + 2x2 + 3x3 = 6 and the coordinate planes.)
Example

Plot the feasible region for a simple one-dimensional optimisation case and, from the plot, find the optimal solution.
Optimisation refers to the process of finding the optimal point x* from among the feasible points; f(x) is called the objective function:

Minimisation:

    minimise  f(x)
    s.t.      g_i(x) = 0, i ∈ E
              g_i(x) ≥ 0, i ∈ I

Maximisation: maximising f(x) is equivalent to minimising −f(x), so every maximisation problem can be written as a minimisation problem.

Definition 1: x* minimises f(x) as a global minimiser if f(x*) ≤ f(x), ∀x ∈ S.
Definition 2: x* minimises f(x) as a strict global minimiser if f(x*) < f(x), ∀x ∈ S, x ≠ x*.

(Figure: a minimisation problem for f(x) and the corresponding maximisation problem for −f(x); both share the same optimal point x*.)
Besides the global minimiser, a function may have local minimisers:

Definition 3: x* minimises f(x) as a local minimiser if f(x*) ≤ f(x), ∀x ∈ S such that ‖x − x*‖ ≤ ε, with ε a small positive number defining the locality, which may depend on x*.
Definition 4: x* minimises f(x) as a strict local minimiser if f(x*) < f(x), ∀x ∈ S, x ≠ x*, such that ‖x − x*‖ ≤ ε.

Local minimisers can be identified using the 1st and 2nd derivatives.

Definition 5: The condition f′(x) = 0 only identifies a stationary point, that is, a point at which the first derivative takes the value 0.

Example: The function f(x) = 0.2x⁴ − 2.2x³ + 0.1x² + 2x + 1 has the first derivative f′(x) = 0.8x³ − 6.6x² + 0.2x + 2, with stationary points x ≈ −0.52, 0.59 and 8.18.
(Plot: f(x) over roughly −2 ≤ x ≤ 10, showing the three stationary points.)
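The stationary points of the example function can be checked numerically, for instance with NumPy's polynomial root finder applied to f′:

```python
import numpy as np

# Stationary points: roots of f'(x) = 0.8x^3 - 6.6x^2 + 0.2x + 2
stationary = np.sort(np.roots([0.8, -6.6, 0.2, 2.0]).real)
print(stationary)   # ~ [-0.52  0.59  8.18]
```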
Summary

Feasible point:           a point which satisfies all constraints g_i(x) = 0 / g_i(x) ≥ 0
Feasible region:          the set S of all feasible points
Global minimiser:         a point x* with f(x*) ≤ f(x), ∀x ∈ S
Strict global minimiser:  a point x* with f(x*) < f(x), ∀x ∈ S, x ≠ x*
Local minimiser:          a point x* with f(x*) ≤ f(x), ∀x ∈ S such that ‖x − x*‖ ≤ ε
Strict local minimiser:   a point x* with f(x*) < f(x), ∀x ∈ S, x ≠ x*, such that ‖x − x*‖ ≤ ε
Convexity (1)
A convex function f(x) is a continuous function whose value at the midpoint of every interval in its domain does not exceed the arithmetic mean of its values at the ends of the interval:

    f((x1 + x2)/2) ≤ (f(x1) + f(x2))/2

(Figure: a convex function lies on or below the chord between x1 and x2; a concave function lies on or above it.)
Convexity (2)

A set S is convex if, for all x, y ∈ S and all λ with 0 ≤ λ ≤ 1, the point λx + (1 − λ)y is also in S, which means that if x and y are in S, the line segment connecting them must lie in S.

(Figure: a convex set contains the segment λx + (1 − λ)y between any x and y; a non-convex set does not.)
Convexity (2)

A function f is convex if

    f(λx + (1 − λ)y) ≤ λf(x) + (1 − λ)f(y),  0 ≤ λ ≤ 1, ∀x, y

which means that the line segment that connects the points (x, f(x)) and (y, f(y)) lies on or above the graph of the function.
Convexity (3)

Example: Test a one-variable function for convexity, for example at x = −2 and y = 4 with λ = 0.5: check that f(0.5·(−2) + 0.5·4) = f(1) ≤ 0.5·f(−2) + 0.5·f(4).
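The convexity test can be sketched numerically. The slide's own function is not legible in the source, so f(x) = x² is assumed here purely as an illustration:

```python
def is_convex_on_sample(f, xs, lambdas):
    """Check the convexity inequality f(lx + (1-l)y) <= l*f(x) + (1-l)*f(y)
    for every pair of sample points and every lambda value."""
    return all(
        f(l * x + (1 - l) * y) <= l * f(x) + (1 - l) * f(y) + 1e-12
        for x in xs for y in xs for l in lambdas
    )

f = lambda x: x * x                          # assumed example function
xs = [i / 2 for i in range(-8, 9)]           # sample points in [-4, 4]
lambdas = [i / 10 for i in range(11)]        # lambda = 0.0, 0.1, ..., 1.0

print(is_convex_on_sample(f, xs, lambdas))   # True
# The slide's specific check, x = -2, y = 4, lambda = 0.5:
print(f(0.5 * -2 + 0.5 * 4) <= 0.5 * f(-2) + 0.5 * f(4))   # f(1) = 1 <= 10: True
```

Passing this sampled test does not prove convexity, but failing it disproves it; the concave function g(x) = −x² fails immediately.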
Convexity (4)

The problem

    minimise  f(x),  x ∈ S

where S is a convex set and f(x) is a convex function on S, is a convex programming problem. Also

    minimise  f(x)
    s.t.      g_i(x) ≥ 0,  i = 1, …, m

is a convex programming problem if f(x) is convex and the g_i(x) are concave. If x* is a local minimiser of a convex programming problem, then x* is also a global minimiser. If the objective function is strictly convex, then x* is the unique local minimiser.
Convexity (5)

Convexity & derivatives: if a one-dimensional function f(x) has two continuous derivatives, then it is convex iff f″(x) ≥ 0, ∀x.
Example: Is the function f(x) = x⁴ convex? It is convex on the entire real line because f″(x) = 12x² ≥ 0, ∀x.
Convexity (6)

Strictly convex: a function is strictly convex iff f″(x) > 0, ∀x.
Example: Is the function f(x) = 0.2x⁴ − 2.2x³ + 0.1x² + 2x + 1 convex?
Solution: The two derivatives are

    f′(x) = 0.8x³ − 6.6x² + 0.2x + 2
    f″(x) = 2.4x² − 13.2x + 0.2

and f″ is negative for some x (e.g. f″(1) = −10.6), so the function is not convex!
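The sign of f″ on a grid gives a quick numerical confirmation of the conclusion above (a sketch; the grid range is an arbitrary choice):

```python
# f''(x) = 2.4x^2 - 13.2x + 0.2 for f(x) = 0.2x^4 - 2.2x^3 + 0.1x^2 + 2x + 1
f2 = lambda x: 2.4 * x**2 - 13.2 * x + 0.2

xs = [i / 10 for i in range(-50, 101)]   # grid over [-5, 10]
print(all(f2(x) >= 0 for x in xs))       # False: f'' is negative on part of the grid
print(round(f2(1), 6))                   # -10.6
```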
Convexity (7)

Convexity & derivatives: for a multi-dimensional function to be convex, the Hessian matrix of second derivatives must be positive semidefinite, that is

    yᵀ∇²f(x)y ≥ 0,  ∀x, ∀y

Example: The quadratic function f(x1, x2) = 4x1² + 12x1x2 + 9x2² is convex on the entire real space because its Hessian is

    ∇²f = |  8  12 |
          | 12  18 |

and yᵀ∇²f y = 8y1² + 24y1y2 + 18y2² = 2(2y1 + 3y2)² ≥ 0, ∀y.

Strict convexity: for a multi-dimensional function to be strictly convex, the Hessian matrix of second derivatives must be positive definite.
Positive-definite Matrices

A practical definition: a matrix A is positive definite if all its eigenvalues are positive.
Eigenvalues: a vector x which is mapped into λx (λ a scalar) by A is called an eigenvector: Ax = λx, i.e. (A − λI)x = 0, which has a nontrivial solution iff det(A − λI) = 0.

Example: For

    A = | 4  1 |
        | 1  2 |

we have det(A − λI) = (4 − λ)(2 − λ) − 1 = λ² − 6λ + 7 = 0, so λ = 3 ± √2 ≈ 4.41 and 1.59. Both eigenvalues are positive, so A is positive definite.
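The eigenvalue test for positive definiteness is a one-liner with NumPy (a sketch using the symmetric-matrix routine):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 2.0]])

eig = np.linalg.eigvalsh(A)     # eigenvalues of the symmetric matrix A, ascending
print(eig)                      # ~ [1.5858 4.4142], i.e. 3 -/+ sqrt(2)
print(bool((eig > 0).all()))    # True: A is positive definite
```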
Summary

Convex set:                 a set S which satisfies λx + (1 − λ)y ∈ S, ∀x, y ∈ S, 0 ≤ λ ≤ 1
Convex function:            a function which satisfies f(λx + (1 − λ)y) ≤ λf(x) + (1 − λ)f(y), 0 ≤ λ ≤ 1, ∀x, y
Concave function:           a function whose negative is convex (the inequality is reversed)
Convexity & derivatives:    f is convex iff f″(x) ≥ 0, ∀x, or yᵀ∇²f(x)y ≥ 0, ∀x, y
Strict convexity & derivatives: f is strictly convex iff f″(x) > 0, ∀x, or yᵀ∇²f(x)y > 0, ∀x, ∀y ≠ 0
Optimisation Algorithm I

Optimisation Algorithm I searches among the feasible points until the optimal one has been found.

Example: To minimise f(x), optimality is usually based on the condition f′(x) = 0. If f′(x_k) ≠ 0 then x_k is not optimal, but f′(x_k) does not indicate the direction of decrease.

(Figure: iterates x_k, x_k+1, x_k+2 on the contours of f(x1, x2).)
Optimisation Algorithm II

Optimisation Algorithm II is an improved version of Optimisation Algorithm I: the next point is an improved estimate of the solution,

    x_{k+1} = x_k + α_k·p_k

Characteristics: p_k is a search direction (hopefully pointing towards the solution) and α_k is a step length that determines the point x_{k+1}.
(Figure: iterates x_k, x_k+1, x_k+2 moving along search directions on the contours of f.)
For an unconstrained optimisation, the search direction p_k is typically a descent direction for the function f at x_k; hence a small step along p_k is guaranteed to decrease f:

    minimise over α ≥ 0:  f(x_k + α·p_k)

Calculating α_k is called line search: a search along the line x_k + α_k·p_k.
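Algorithm II with a steepest-descent direction and a simple backtracking line search can be sketched as follows. This is a minimal illustration on an assumed quadratic test function, not the lecture's own example:

```python
import numpy as np

def grad_descent(f, grad, x0, iters=500, tol=1e-10, c=0.5):
    """Optimisation Algorithm II: x_{k+1} = x_k + alpha_k * p_k, with the
    steepest-descent direction p_k = -grad f(x_k) and alpha_k found by a
    backtracking line search (sufficient-decrease condition)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stationary point: f'(x) = 0
            break
        p = -g                           # descent direction
        alpha = 1.0
        while alpha > 1e-12 and f(x + alpha * p) > f(x) - c * alpha * (g @ g):
            alpha *= 0.5                 # shrink the step until f decreases enough
        x = x + alpha * p
    return x

# Assumed test function (not from the lecture): f(x1, x2) = (x1-1)^2 + 4(x2+2)^2
f = lambda x: (x[0] - 1) ** 2 + 4 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 8 * (x[1] + 2)])

print(grad_descent(f, grad, [0.0, 0.0]))   # ~ [ 1. -2.]
```

Halving alpha until the sufficient-decrease test passes is the simplest line search; practical solvers use more careful step-length rules, but the structure x_{k+1} = x_k + alpha_k * p_k is the same.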
Conclusion
optimization
Motivating example: optimisation of manufacturing computers in a company Decision making: process of using optimisation to determine the best out of all choices Curve fitting: important part of optimisation defining the function between known values Optimality & Optimisation: optimal solution is always in the feasible region the best solution in that region. Convexity: convex sets and convex function. Knowing convexity helps in determining optimal solution. Optimisation Algorithms: two basic algorithms on which all known processes are built.
26