Conjugate Gradient Methods for Large-scale Unconstrained Optimization

Dr. Neculai Andrei
Research Institute for Informatics, Bucharest
and
Academy of Romanian Scientists

Ovidius University, Constantza - Romania, March 27, 2008
Proposition. Assume that $d_k$ is a descent direction and $\nabla f$ satisfies the Lipschitz condition
$$\|\nabla f(x) - \nabla f(x_k)\| \le L\,\|x - x_k\|$$
for all $x$ on the line segment connecting $x_k$ and $x_{k+1}$, where $L$ is a positive constant.
If the line search satisfies the Wolfe conditions, then
$$\alpha_k \ge \frac{(1-\sigma)\,|g_k^T d_k|}{L\,\|d_k\|^2}.$$
If the line search satisfies the Goldstein conditions, then
$$\alpha_k \ge \frac{2(1-\delta)\,|g_k^T d_k|}{L\,\|d_k\|^2}.$$
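Why the Wolfe-case bound holds (a one-line sketch of the standard argument, my reconstruction, with $\sigma$ the curvature-condition constant): the second Wolfe condition $\nabla f(x_{k+1})^T d_k \ge \sigma\, g_k^T d_k$ combined with the Lipschitz condition gives
$$(\sigma - 1)\, g_k^T d_k \;\le\; (\nabla f(x_{k+1}) - g_k)^T d_k \;\le\; L\,\alpha_k \|d_k\|^2,$$
and since $g_k^T d_k < 0$ for a descent direction, dividing by $L\|d_k\|^2$ yields $\alpha_k \ge (1-\sigma)\,|g_k^T d_k| / (L\|d_k\|^2)$. The Goldstein case is analogous, using the lower Goldstein inequality and the quadratic upper bound from the Lipschitz condition.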
Remarks:
1) The Newton method, the quasi-Newton and the limited memory quasi-Newton methods have the ability to accept unit step lengths along the iterations.
2) In conjugate gradient methods the step lengths may differ from 1 in a very unpredictable manner; they can be larger or smaller than 1, depending on how the problem is scaled*.
* N. Andrei (2007). Acceleration of conjugate gradient algorithms for unconstrained optimization. (submitted to JOTA)
Methods for Unconstrained Optimization

1) Steepest descent (Cauchy, 1847):
$$d_k = -\nabla f(x_k)$$
2) Newton:
$$d_k = -\nabla^2 f(x_k)^{-1}\,\nabla f(x_k)$$
3) Quasi-Newton (Broyden, 1965; and many others):
$$d_k = -H_k\,\nabla f(x_k), \qquad H_k \approx \nabla^2 f(x_k)^{-1}$$
4) Conjugate gradient methods (1952):
$$d_{k+1} = -g_{k+1} + \beta_k s_k, \qquad s_k = x_{k+1} - x_k, \qquad g_k = \nabla f(x_k)$$
where $\beta_k$ is known as the conjugate gradient parameter.
5) Truncated Newton method (Dembo et al., 1982): solve the Newton system inexactly, driving the residual
$$r_k = \nabla^2 f(x_k)\, d_k + g_k$$
below a tolerance.
6) Trust region methods: minimize the quadratic model
$$q_k(d) = f(x_k) + g_k^T d + \frac{1}{2}\, d^T B_k d$$
over a region $\|d\| \le \Delta_k$.
7) Conic model method (Davidon, 1980):
$$c_k(d) = f(x_k) + \frac{g_k^T d}{1 + b^T d} + \frac{d^T A_k d}{2\,(1 + b^T d)^2}$$
8) Tensor methods (Schnabel & Frank, 1984):
$$m(x_c + d) = f(x_c) + \nabla f(x_c)^T d + \frac{1}{2}\, d^T \nabla^2 f(x_c)\, d + \frac{1}{6}\, T_c d^3 + \frac{1}{24}\, V_c d^4$$
9) Methods based on systems of differential equations, gradient flow method (Courant, 1942):
$$\frac{dx(t)}{dt} = -\nabla^2 f(x)^{-1}\,\nabla f(x), \qquad x(0) = x_0$$
10) Direct searching methods: Hooke-Jeeves (pattern search) (1961), Powell (conjugate directions) (1964), Rosenbrock (coordinate system rotation) (1960), Nelder-Mead (simplex) (1965), Powell's UOBYQA (quadratic approximation) (1994-2000).

N. Andrei, Critica Ratiunii Algoritmilor de Optimizare fara Restrictii [A Critique of the Reason of Unconstrained Optimization Algorithms], Editura Academiei Romane, 2008.
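As a concrete illustration of the first two search directions above (my example, not from the slides), the following NumPy sketch compares the steepest descent and Newton directions on a small ill-conditioned quadratic, where the Newton direction points straight at the minimizer:

import numpy as np

# Illustrative sketch: directions for f(x) = 1/2 x^T A x - b^T x,
# whose gradient is A x - b and whose Hessian is A.
A = np.array([[10.0, 0.0], [0.0, 1.0]])   # ill-conditioned quadratic
b = np.array([1.0, 1.0])

x = np.zeros(2)
g = A @ x - b                       # gradient at x
d_sd = -g                           # 1) steepest descent direction
d_newton = -np.linalg.solve(A, g)   # 2) Newton direction

print("steepest descent:", d_sd)
print("Newton:", d_newton)          # equals the minimizer A^{-1} b here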
Conjugate Gradient Methods

$$x_{k+1} = x_k + \alpha_k d_k$$
$$d_{k+1} = -g_{k+1} + \beta_k s_k$$
$$s_k = x_{k+1} - x_k, \qquad y_k = g_{k+1} - g_k$$

Magnus Hestenes (1906-1991), Eduard Stiefel (1909-1978)
The prototype of the Conjugate Gradient Algorithm

Step 1. Select the initial starting point $x_0 \in \mathrm{dom}\, f$.
Step 2. Test a criterion for stopping the iterations, e.g. $\|g_k\| \le \varepsilon$.
Step 3. Determine the steplength $\alpha_k$ by the Wolfe conditions.
Step 4. Update the variables: $x_{k+1} = x_k + \alpha_k d_k$.
Step 5. Compute $\beta_k$.
Step 6. Compute the search direction: $d_{k+1} = -g_{k+1} + \beta_k s_k$.
Step 7. Restart: if $|g_{k+1}^T g_k| > 0.2\, \|g_{k+1}\|^2$, then set $d_{k+1} = -g_{k+1}$.
Step 8. Compute the initial guess $\alpha_k = \alpha_{k-1}\, \|d_{k-1}\| / \|d_k\|$, set $k = k+1$ and continue from Step 2.
* N. Andrei, A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization. Applied Mathematics Letters, vol.21, 2008, pp.165-171.
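A minimal NumPy sketch of this prototype (my illustration, not Andrei's code): it follows the slide's convention $d_{k+1} = -g_{k+1} + \beta_k s_k$, uses the PRP parameter $\beta_k = y_k^T g_{k+1}/(g_k^T g_k)$ as one concrete choice for Step 5, and replaces the full Wolfe search of Step 3 with simple sufficient-decrease backtracking to stay self-contained.

import numpy as np

def cg_prototype(f, grad, x0, eps=1e-6, max_iter=1000):
    """Prototype CG (illustrative sketch, PRP beta, backtracking search)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    alpha_prev, d_prev_norm = 1.0, np.linalg.norm(d)
    for k in range(max_iter):
        if np.linalg.norm(g) <= eps:               # Step 2: stopping test
            break
        # Step 8 (from previous iteration): scaled initial steplength guess
        alpha = alpha_prev * d_prev_norm / max(np.linalg.norm(d), 1e-16)
        # Step 3: backtrack until the sufficient-decrease condition holds
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-20:
                break
        x_new = x + alpha * d                      # Step 4: update variables
        g_new = grad(x_new)
        y = g_new - g
        beta = (y @ g_new) / max(g @ g, 1e-16)     # Step 5: PRP parameter
        s = x_new - x
        d_new = -g_new + beta * s                  # Step 6: search direction
        if abs(g_new @ g) > 0.2 * (g_new @ g_new): # Step 7: Powell restart
            d_new = -g_new
        if g_new @ d_new >= 0:                     # safeguard: keep descent
            d_new = -g_new
        alpha_prev, d_prev_norm = alpha, np.linalg.norm(d)
        x, g, d = x_new, g_new, d_new
    return x

# Usage: minimize the Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(cg_prototype(f, grad, np.array([-1.2, 1.0])))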
16. Convex combination of PRP and DY from the conjugacy condition (CCOMB - Andrei)

$$\beta_k^{CCOMB} = (1 - \theta_k)\,\beta_k^{PRP} + \theta_k\,\beta_k^{DY},$$
where $\theta_k$ is obtained by imposing the conjugacy condition $y_k^T d_{k+1} = 0$:
$$\theta_k^{CCOMB} = \frac{(y_k^T g_{k+1})(y_k^T s_k) - (y_k^T g_{k+1})(g_k^T g_k)}{(y_k^T g_{k+1})(y_k^T s_k) - \|g_{k+1}\|^2\,(g_k^T g_k)}.$$

If $\theta_k^{CCOMB} = 0$, then $\beta_k^{CCOMB} = \beta_k^{PRP}$.
If $\theta_k^{CCOMB} = 1$, then $\beta_k^{CCOMB} = \beta_k^{DY}$.

N. Andrei, New hybrid conjugate gradient algorithms for unconstrained optimization. Encyclopedia of Optimization, 2nd Edition, Springer, August 2008, Entry 761.
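A small NumPy helper computing $\beta_k^{CCOMB}$ (an illustrative sketch, not the author's code); following the slide's limiting cases, a $\theta_k$ outside $[0,1]$ is clamped so the parameter falls back to pure PRP or pure DY:

import numpy as np

def beta_ccomb(g_k, g_next, s_k):
    """beta^CCOMB: convex combination of PRP and DY, theta_k from the
    conjugacy condition y_k^T d_{k+1} = 0 (illustrative sketch)."""
    y_k = g_next - g_k
    yg, ys, gg = y_k @ g_next, y_k @ s_k, g_k @ g_k
    beta_prp = yg / gg                         # PRP parameter
    beta_dy = (g_next @ g_next) / ys           # DY parameter
    theta = (yg * ys - yg * gg) / (yg * ys - (g_next @ g_next) * gg)
    theta = min(max(theta, 0.0), 1.0)          # clamp: stay a convex combination
    return (1.0 - theta) * beta_prp + theta * beta_dy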
17. Convex combination of PRP and DY from the Newton direction (NDOMB - Andrei)

$$\beta_k^{NDOMB} = (1 - \theta_k)\,\beta_k^{PRP} + \theta_k\,\beta_k^{DY},$$
$$\theta_k^{NDOMB} = \frac{\big((y_k - s_k)^T g_{k+1}\big)(g_k^T g_k) - (g_{k+1}^T y_k)(y_k^T s_k)}{\|g_{k+1}\|^2\,(g_k^T g_k) - (g_{k+1}^T y_k)(y_k^T s_k)}.$$

If $\theta_k^{NDOMB} = 0$, then $\beta_k^{NDOMB} = \beta_k^{PRP}$.
If $\theta_k^{NDOMB} = 1$, then $\beta_k^{NDOMB} = \beta_k^{DY}$.

N. Andrei, New hybrid conjugate gradient algorithms as a convex combination of PRP and DY for unconstrained optimization. ICI Technical Report, October 1, 2007. (submitted to AML)
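Where this $\theta_k$ comes from (a sketch of the standard argument, my reconstruction): requiring the CG direction to match the Newton direction, $-\nabla^2 f_{k+1}^{-1} g_{k+1} = -g_{k+1} + \beta_k s_k$, then multiplying through by $s_k^T \nabla^2 f_{k+1}$ and applying the secant equation $\nabla^2 f_{k+1} s_k = y_k$ gives
$$\beta_k = \frac{(y_k - s_k)^T g_{k+1}}{y_k^T s_k}.$$
Equating this with $(1-\theta_k)\,\beta_k^{PRP} + \theta_k\,\beta_k^{DY}$ and solving for $\theta_k$ yields the expression above.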
18. Convex combination of HS and DY from the Newton direction (HYBRID - Andrei)

$$\beta_k^{HYBRID} = (1 - \theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY},$$
with $\theta_k$ obtained from the Newton direction together with the secant condition:
$$\theta_k = -\frac{s_k^T g_{k+1}}{g_{k+1}^T g_k}.$$

If $\theta_k = 0$, then $\beta_k^{HYBRID} = \beta_k^{HS}$.
If $\theta_k = 1$, then $\beta_k^{HYBRID} = \beta_k^{DY}$.

N. Andrei, A hybrid conjugate gradient algorithm for unconstrained optimization as a convex combination of Hestenes-Stiefel and Dai-Yuan. Studies in Informatics and Control, vol.17, No.1, March 2008, pp.55-70.
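The same Newton-direction argument gives this $\theta_k$ in one step (my reconstruction): with $\beta_k^{HS} = y_k^T g_{k+1}/(y_k^T s_k)$ and $\beta_k^{DY} = \|g_{k+1}\|^2/(y_k^T s_k)$,
$$(1-\theta_k)\, y_k^T g_{k+1} + \theta_k\, \|g_{k+1}\|^2 = (y_k - s_k)^T g_{k+1},$$
so $\theta_k \big(\|g_{k+1}\|^2 - y_k^T g_{k+1}\big) = -s_k^T g_{k+1}$; since $\|g_{k+1}\|^2 - y_k^T g_{k+1} = g_{k+1}^T g_k$, the formula follows.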
19. Convex combination of HS and DY from the Newton direction with modified secant condition (HYBRIDM - Andrei)

N. Andrei, A hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization. ICI Technical Report, February 6, 2008. (submitted to Numerical Algorithms)
The direction $d_{k+1}$ is computed using a double update scheme, for $k+1 \ge r$.

N. Andrei, A scaled nonlinear conjugate gradient algorithm for unconstrained optimization. Optimization: A Journal of Mathematical Programming and Operations Research, accepted.
General Theory of Acceleration

In conjugate gradient methods the step lengths may differ from 1 in a very unpredictable manner; they can be larger or smaller than 1, depending on how the problem is scaled. The accelerated scheme therefore modifies the step through a factor $\gamma_k$:
$$x_{k+1} = x_k + \gamma_k \alpha_k d_k.$$
Comparing the Taylor expansions of the ordinary and the modified step,
$$f(x_k + \alpha_k d_k) = f(x_k) + \alpha_k g_k^T d_k + \frac{1}{2}\,\alpha_k^2\, d_k^T \nabla^2 f(x_k)\, d_k + o(\|\alpha_k d_k\|^2),$$
$$f(x_k + \gamma \alpha_k d_k) = f(x_k) + \gamma \alpha_k g_k^T d_k + \frac{1}{2}\,\gamma^2 \alpha_k^2\, d_k^T \nabla^2 f(x_k)\, d_k + o(\|\gamma \alpha_k d_k\|^2),$$
the factor $\gamma_k$ is chosen so that
$$f(x_k + \gamma_k \alpha_k d_k) \le f(x_k + \alpha_k d_k).$$
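A minimal sketch of one realization of this idea (my reconstruction, not verbatim from the cited papers): write the quadratic model in $\gamma$ as $\gamma a_k + \frac{1}{2}\gamma^2 b_k$ with $a_k = \alpha_k g_k^T d_k$ and the curvature term estimated by a finite difference of gradients, $b_k = \alpha_k (g_z - g_k)^T d_k \approx \alpha_k^2\, d_k^T \nabla^2 f(x_k)\, d_k$, where $g_z$ is the gradient at the trial point $z = x_k + \alpha_k d_k$; the minimizer is $\gamma_k = -a_k/b_k$.

import numpy as np

def accelerated_step(f, grad, x, d, alpha):
    """One accelerated CG step (illustrative reconstruction): choose the
    factor gamma minimizing the local quadratic model of f along alpha*d."""
    g = grad(x)
    z = x + alpha * d
    gz = grad(z)
    a = alpha * (g @ d)            # gamma-linear term of the model
    b = alpha * ((gz - g) @ d)     # curvature term, b ~ alpha^2 d^T Hess d
    if b > 0:
        gamma = -a / b             # minimizer of gamma*a + 0.5*gamma^2*b
    else:
        gamma = 1.0                # fall back to the ordinary step
    return x + gamma * alpha * d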
N. Andrei, Acceleration of conjugate gradient algorithms for unconstrained optimization. ICI Technical Report, October 24, 2007. (submitted to JOTA, 2007)
N. Andrei, Accelerated conjugate gradient algorithm with modified secant condition for unconstrained optimization. ICI Technical Report, March 3, 2008. (submitted to Applied Mathematics and Optimization, 2008)