
Numerical Optimization

Lecture 1 - Introduction
Sangkyun Lee
TU Dortmund

Syllabus
Study numerical optimization methods for solving
mathematical programs. We'll focus on:

Motivation
Theory
Algorithm
Convergence / Complexity

Our goal: provide knowledge to recognize efficient ways
to find solutions for a given problem.
Use or modify optimization tools
Design new algorithms or improve existing ones


Syllabus

Language: English!
Targeting master-level students not in mathematics:
necessary mathematical concepts will be covered.


Syllabus

Übung (exercise session): half of the sessions will be used for
lectures, the other half for exercises, so attendance is
strongly recommended.
Homeworks: every two weeks (6 in total)
Qualification: homeworks (40%) + two quizzes (60%);
at least 60% is required to take the final exam.
Final exam: Aug 20th.
Office hours: Thursday 13:00 - 14:00, LS8, JvF 23, Room 147.


Textbooks
Recommended: Numerical Optimization, J. Nocedal and
S. Wright, 2nd Ed, Springer, 2006
Suggested:
Nonlinear Programming, D. P. Bertsekas, 2nd Ed., Athena
Scientific, 1999
Introductory Lectures on Convex Optimization, Y.
Nesterov, Springer, 2004
Convex Optimization, S. Boyd and L. Vandenberghe,
Cambridge, 2004 (PDF available)


Websites
My website:
www-ai.cs.uni-dortmund.de/PERSONAL/lee.html
Lecture: www-ai.cs.uni-dortmund.de/LEHRE/
VORLESUNGEN/NOPT/SS14/index.html
Lecture schedule
Lecture notes
Homeworks


Introduction


Fields of Optimization
Mathematical Programming: algorithms to solve mathematical
programs
Continuous: linear programming (LP), convex
optimization, non-linear programming (NLP)
Discrete: integer programming (IP)
Operations Research: formulations of real-world problems
(typically discrete) into mathematical programs, strong
relaxations
Our focus: NLP, convex optimization


Mathematical Formulations of Optimization Problems

    min f(x)
    subject to c_i(x) ≤ 0, i ∈ I,
               c_i(x) = 0, i ∈ E

Optimization variables: x = (x_1, x_2, ..., x_n)^T
Objective function: f : R^n → R
Constraints: c_i : R^n → R
I = E = ∅: unconstrained optimization
Otherwise: constrained optimization
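To make the notation concrete, here is a minimal sketch of posing such a constrained problem numerically. The particular objective, constraint, and use of SciPy are illustrative assumptions, not part of the lecture:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem (not from the lecture):
#   min (x1 - 1)^2 + (x2 - 2)^2   subject to   c(x) = x1 + x2 - 1 <= 0
f = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.0)**2
c = lambda x: x[0] + x[1] - 1.0          # our convention: c(x) <= 0

# SciPy's 'ineq' convention is c(x) >= 0, so the constraint is negated.
res = minimize(f, x0=np.zeros(2),
               constraints=[{"type": "ineq", "fun": lambda x: -c(x)}])
print(res.x)  # approximately [0, 1]: the feasible point closest to (1, 2)
```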

Constraint Set / Optimal Solution

    min f(x)
    subject to c_i(x) ≤ 0, i ∈ I,
               c_i(x) = 0, i ∈ E

Constraint set:
X := {x ∈ R^n : c_i(x) ≤ 0 ∀ i ∈ I, c_i(x) = 0 ∀ i ∈ E}
Optimal value: the smallest value of f, denoted by f*:
    f* = min_{x ∈ X} f(x)

Optimal solution (minimizer): a point x* that achieves the
optimal value:
    x* ∈ argmin_{x ∈ X} f(x),   f* = f(x*)

Differences from Evolutionary Optimization


Numerical Optimization:
Access to problem information: function values,
gradients, and Hessian of objective/constraint functions
Optimality: KKT conditions characterize optimal
solutions
Convergence/Precision: explicit statements of
convergence to a solution
Evolutionary Optimization:
Access to problem information: function values only
Optimality: a Pareto-optimal front is constructed from
feasible solutions
Convergence/Precision: explicit statements are not
available except for a few simple cases.

Examples
Data Analysis / Machine Learning
Variables: model parameters/coefficients
Constraints: min/max coefficient values, prior knowledge
Objective: measure of incorrect predictions
Portfolio Optimization
Variables: amounts of investment in assets
Constraints: budget, min/max investment per asset
Objective: overall risk
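As a sketch of how the portfolio case maps onto the general formulation above (the covariance matrix and all numbers are toy values made up for illustration): minimize the risk x^T S x over investment fractions x that sum to one and are non-negative.

```python
import numpy as np
from scipy.optimize import minimize

# Toy Markowitz-style risk minimization (illustrative values only):
#   min x^T S x   s.t.   sum(x) = 1,  x >= 0
S = np.array([[0.10, 0.02, 0.01],
              [0.02, 0.08, 0.03],
              [0.01, 0.03, 0.12]])      # toy covariance ("risk") matrix

risk = lambda x: x @ S @ x
res = minimize(risk, x0=np.ones(3) / 3,
               constraints=[{"type": "eq", "fun": lambda x: np.sum(x) - 1.0}],
               bounds=[(0.0, None)] * 3)
print(res.x)  # non-negative investment fractions summing to 1
```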


Example: Linear Regression (1)


Observations: data points (x^i, y^i), x^i ∈ R^n, y^i ∈ R,
i = 1, 2, ..., N.
Goal: find f(x) = x^T β that minimizes the squared error of
prediction, (y − f(x))^2:

    min_{β ∈ R^n} (1/2) Σ_{i=1}^N (y^i − (x^i)^T β)^2

Using A = [(x^1)^T; (x^2)^T; ...; (x^N)^T] ∈ R^{N×n} and
b = (y^1, y^2, ..., y^N)^T ∈ R^N, this can be written as

    argmin_{β ∈ R^n} (1/2) ‖b − Aβ‖_2^2

Example: Linear Regression (2)

    argmin_{β ∈ R^n} (1/2) ‖b − Aβ‖_2^2

This problem is often called the least-squares problem.
A closed-form (analytic) solution exists (when A^T A is
invertible):
    β* = (A^T A)^{-1} A^T b
Typically, analytic solutions do not exist for general
optimization problems.
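As a quick numerical illustration (a sketch with synthetic data; variable names are my own), the closed-form solution and NumPy's least-squares solver agree:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 100, 3
A = rng.normal(size=(N, n))                    # rows are the (x^i)^T
beta_true = np.array([1.0, -2.0, 0.5])
b = A @ beta_true + 0.1 * rng.normal(size=N)   # noisy observations y^i

# Closed form via the normal equations: beta = (A^T A)^{-1} A^T b.
beta_closed = np.linalg.solve(A.T @ A, A.T @ b)

# In practice np.linalg.lstsq is preferred (numerically more stable).
beta_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(beta_closed)  # both are close to beta_true
print(beta_lstsq)
```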


Feasibility

    min f(x) s.t. x ∈ X,

where
X := {x ∈ R^n : c_i(x) ≤ 0 ∀ i ∈ I, c_i(x) = 0 ∀ i ∈ E}
A vector x ∈ R^n is called
Feasible: if x ∈ X
Infeasible: if x ∉ X
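A minimal sketch of a feasibility check, assuming constraints are given as Python callables (the helper name and tolerance are my own, purely illustrative):

```python
import numpy as np

def is_feasible(x, ineq=(), eq=(), tol=1e-8):
    """True if c(x) <= tol for every inequality constraint and
    |c(x)| <= tol for every equality constraint."""
    return (all(c(x) <= tol for c in ineq) and
            all(abs(c(x)) <= tol for c in eq))

# Example: X = {x in R^2 : ||x||_2^2 - 1 <= 0, x_1 - x_2 = 0}
ineq = [lambda x: np.dot(x, x) - 1.0]
eq = [lambda x: x[0] - x[1]]
print(is_feasible(np.array([0.5, 0.5]), ineq, eq))  # True: feasible
print(is_feasible(np.array([1.0, 0.0]), ineq, eq))  # False: violates equality
```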


Min/Max Equivalence

    min f(x) s.t. x ∈ X

is equivalent to solving

    max −f(x) s.t. x ∈ X.

That is, both have the same x* as a solution.
However, if f* = min_{x ∈ X} f(x), then −f* = max_{x ∈ X} (−f(x)).
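A tiny numeric illustration of this equivalence on a discretized feasible set (the grid and function are illustrative):

```python
import numpy as np

f = lambda x: (x - 2.0)**2
xs = np.linspace(-5, 5, 1001)            # a discretized "feasible set" X

i_min = np.argmin(f(xs))                 # minimizer of f
i_max = np.argmax(-f(xs))                # maximizer of -f
print(xs[i_min] == xs[i_max])            # True: same solution x*
print(f(xs).min() == -((-f(xs)).max()))  # True: f* = -max(-f)
```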


Convex Optimization
    min f(x) s.t. x ∈ X ⊆ R^n

This becomes a convex optimization problem when f is
convex AND X is convex.
A set X is convex if, for any two points x ∈ X and y ∈ X,
λx + (1 − λ)y ∈ X for all λ ∈ [0, 1].
A function f : X → R is convex if, for any two points
x ∈ X and y ∈ X,
    f(λx + (1 − λ)y) ≤ λf(x) + (1 − λ)f(y), ∀ λ ∈ [0, 1]
A function f : X → R is concave if −f is convex.
Efficient optimization algorithms exist for convex optimization problems.
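A quick numerical sanity check (not a proof) of the convexity inequality for f(x) = ‖x‖_2^2, which is convex; the random sampling setup is my own:

```python
import numpy as np

f = lambda x: np.dot(x, x)   # f(x) = ||x||_2^2, a convex function

rng = np.random.default_rng(1)
ok = True
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    lam = rng.uniform()
    ok &= f(lam * x + (1 - lam) * y) <= lam * f(x) + (1 - lam) * f(y) + 1e-12
print(bool(ok))  # True: no counterexample found (sampling, not a proof)
```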

Examples
Convex Sets:
Unit ball: {x ∈ R^n : ‖x‖_2 ≤ 1}
Polyhedron: an intersection of halfspaces, defined by
linear equalities and inequalities:
{x ∈ R^n : Ax = b, Cx ≤ d}
Convex Functions:
Affine functions f(x) = c^T x + a for c ∈ R^n and a ∈ R
Convex quadratic functions f(x) = x^T H x, where H ∈ R^{n×n} is
symmetric positive semi-definite (psd for short)
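For the quadratic case, checking positive semi-definiteness reduces to checking that all eigenvalues of H are non-negative; a small sketch with a made-up matrix:

```python
import numpy as np

H = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # symmetric; eigenvalues 1 and 3 -> psd
eigvals = np.linalg.eigvalsh(H)  # eigvalsh: eigenvalues of a symmetric matrix
print(eigvals, bool(np.all(eigvals >= 0)))  # [1. 3.] True -> convex quadratic
```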
