TABLE 2.2
Gradient-Based Optimization Methods
Steepest Descent
  S^0 = −∇f(x^0)
  Solve (S^0)^T ∇f(x^0 + α^0 S^0) = 0 for α^0
  S^{k+1} = −∇f(x^{k+1})
  Solve (S^{k+1})^T ∇f(x^{k+1} + α^{k+1} S^{k+1}) = 0 for α^{k+1}

Conjugate Gradient
  S^0 = −∇f(x^0)
  α^0 = −∇^T f(x^0) S^0 / [(S^0)^T H(x^0) S^0]
  S^{k+1} = −∇f(x^{k+1}) + S^k [∇^T f(x^{k+1}) ∇f(x^{k+1})] / [∇^T f(x^k) ∇f(x^k)]
  α^{k+1} = −∇^T f(x^{k+1}) S^{k+1} / [(S^{k+1})^T H(x^{k+1}) S^{k+1}]

Newton's Method
  S^0 = −[H(x^0)]^{-1} ∇f(x^0)
  α^0 = 1
  S^{k+1} = −[H(x^{k+1})]^{-1} ∇f(x^{k+1})
  α^{k+1} = 1
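To make the update rules in Table 2.2 concrete, the following minimal sketch (not taken from the text) applies each of the three methods to an assumed quadratic test function f(x) = 0.5 x^T Q x + b^T x, whose gradient is Qx + b and whose Hessian H(x) = Q is constant. The matrix Q, the vector b, and the function names are illustrative choices only.

import numpy as np

# Assumed quadratic test problem: f(x) = 0.5 x^T Q x + b^T x
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # plays the role of the Hessian H(x)
b = np.array([-1.0, -2.0])

def grad(x):
    return Q @ x + b                # ∇f(x)

def hess(x):
    return Q                        # H(x), constant for a quadratic

def exact_alpha(x, S):
    # Step length from Table 2.2: α = −∇^T f(x) S / (S^T H(x) S),
    # which also solves S^T ∇f(x + α S) = 0 when f is quadratic.
    return -(grad(x) @ S) / (S @ hess(x) @ S)

def steepest_descent(x0, iters=20):
    x = x0
    for _ in range(iters):
        S = -grad(x)                              # S^{k+1} = −∇f(x^{k+1})
        x = x + exact_alpha(x, S) * S
    return x

def conjugate_gradient(x0, iters=2):
    x = x0
    g = grad(x)
    S = -g                                        # first direction: steepest descent
    for _ in range(iters):
        x = x + exact_alpha(x, S) * S
        g_new = grad(x)
        # S^{k+1} = −∇f(x^{k+1}) + S^k (∇^T f(x^{k+1}) ∇f(x^{k+1})) / (∇^T f(x^k) ∇f(x^k))
        S = -g_new + S * (g_new @ g_new) / (g @ g)
        g = g_new
    return x

def newton(x0, iters=1):
    x = x0
    for _ in range(iters):
        S = -np.linalg.solve(hess(x), grad(x))    # S = −[H(x)]^{-1} ∇f(x)
        x = x + 1.0 * S                           # α = 1
    return x

x_star = np.linalg.solve(Q, -b)                   # exact minimizer, for comparison
print(steepest_descent(np.zeros(2)), conjugate_gradient(np.zeros(2)),
      newton(np.zeros(2)), x_star)

On a quadratic with a constant Hessian, Newton's method reaches the minimizer in a single step and conjugate gradient in at most n steps, while steepest descent approaches it more gradually, which is the behavior the table's step-length rules are designed to produce.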
optimize  Z = z(x)
s.t.      h(x) = 0                                        (2.19)
Here, the main concern is to obtain an optimal solution for z(x) that also complies with the set of equality constraints. Two strategies to ensure this are presented here: the method of Lagrange multipliers and the generalized reduced gradient method.
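As a quick preview of the first of these strategies, the sketch below works a hypothetical example (not one from the text): minimize z = x1^2 + x2^2 subject to h = x1 + x2 − 1 = 0. It forms the Lagrangian L = z(x) + λ h(x) for problem (2.19) and solves the stationarity conditions ∇L = 0 symbolically; the specific objective, constraint, and use of SymPy are all illustrative assumptions.

import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam", real=True)

z = x1**2 + x2**2            # assumed objective z(x)
h = x1 + x2 - 1              # assumed equality constraint h(x) = 0

L = z + lam * h              # Lagrangian of problem (2.19)

# Stationarity: dL/dx1 = dL/dx2 = dL/dlam = 0 (the last equation recovers h(x) = 0)
conditions = [sp.diff(L, v) for v in (x1, x2, lam)]
solution = sp.solve(conditions, (x1, x2, lam), dict=True)
print(solution)              # [{lam: -1, x1: 1/2, x2: 1/2}]

The multiplier λ couples the constraint to the objective, so a single unconstrained stationarity condition on L replaces the constrained problem; this is the idea developed in the discussion of Lagrange multipliers that follows.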