Research Article
A New Conjugate Gradient Algorithm with Sufficient Descent Property for Unconstrained Optimization
XiaoPing Wu¹, LiYing Liu², FengJie Xie¹, and YongFei Li¹

¹School of Economic Management, Xi'an University of Posts and Telecommunications, Xi'an, Shaanxi 710061, China
²School of Mathematics Science, Liaocheng University, Liaocheng, Shandong 252000, China
Correspondence should be addressed to XiaoPing Wu; wuxiaoping1978@126.com
Received 16 May 2015; Revised 24 September 2015; Accepted 29 September 2015

Academic Editor: Masoud Hajarian

Copyright © 2015 XiaoPing Wu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
A new nonlinear conjugate gradient formula, which satisfies the sufficient descent condition, is proposed for solving unconstrained optimization problems. The global convergence of the algorithm is established under the weak Wolfe line search. Numerical experiments show that the new WWPNPRP+ algorithm is competitive with the SWPPRP+ algorithm, the SWPHS+ algorithm, and the WWPDYHS+ algorithm.
1. Introduction
In this paper, we consider the following unconstrained optimization problem:

min {f(x) | x ∈ ℝⁿ}, (1)

where f(x): ℝⁿ → ℝ is a twice continuously differentiable function whose gradient is denoted by g(x): ℝⁿ → ℝⁿ. The iterative formula is given by

x_{k+1} = x_k + t_k d_k, (2)

where t_k > 0 is the step length and the search direction d_k is defined by

d_1 = −g_1, d_k = −g_k + β_k d_{k−1} for k ≥ 2, (3)

with β_k a scalar parameter.
Hindawi Publishing Corporation, Mathematical Problems in Engineering, Volume 2015, Article ID 352524, 8 pages. http://dx.doi.org/10.1155/2015/352524
where δ ∈ (0, 1/2) and σ ∈ (δ, 1); hereafter, the Wolfe-Powell line search is referred to as the Wolfe line search.
Considerable attention has been paid to the global convergence behavior of these methods. Zoutendijk [1] proved that the FR method with exact line search is globally convergent. Al-Baali [2] extended this result to the strong Wolfe line search. In [3], Dai and Yuan proposed the DY method, which produces a descent search direction at every iteration and converges globally provided that the line search satisfies the weak Wolfe conditions. In [4], Wei et al. discussed the global convergence of the PRP conjugate gradient method (CGM) with inexact line search for nonconvex unconstrained optimization. Recently, based on [5–7], Jiang et al. [8] proposed a hybrid CGM with Wolfe line search.
The sufficient descent condition (9) is often used to analyze the global convergence of nonlinear conjugate gradient methods with inexact line search techniques. For instance, Touati-Ahmed and Storey [9], Al-Baali [2], Gilbert and Nocedal [10], and Hu and Storey [11] hinted that the sufficient descent condition may be crucial for conjugate gradient methods. Unfortunately, this condition is hard to guarantee: it has been shown that the PRP method with the strong Wolfe-Powell line search does not ensure it at each iteration. Grippo and Lucidi [12] therefore devised line searches that ensure the sufficient descent condition, and the convergence of the PRP method with their line search has been established. Yu et al. [13] analyzed the global convergence of a modified PRP CGM with the sufficient descent property. Gilbert and Nocedal [10] gave another way to discuss the global convergence of the PRP method with the weak Wolfe line search: by using a complicated line search, they established the global convergence of the PRP and HS methods by restricting the parameter β_k in (3) to be nonnegative, that is,

β_k^+ = max{0, β_k^PRP}, (10)
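For illustration, the truncation in (10) amounts to one line of code. A minimal sketch with NumPy vectors (the helper name is ours, not from the paper):

```python
import numpy as np

def beta_prp_plus(g_new, g_old):
    """PRP+ parameter of formula (10): beta^PRP truncated at zero.

    beta^PRP = g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2.
    """
    beta_prp = g_new @ (g_new - g_old) / (g_old @ g_old)
    return max(0.0, beta_prp)
```

The truncation discards negative values of β^PRP, which is exactly what restarts the iteration along the steepest descent direction when it occurs.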
which yields a globally convergent CG method that is also computationally efficient [14]. Despite its numerical efficiency, the PRP method has an important defect: it lacks the descent property

g_k^T d_k ≤ 0, ∀k ≥ 0, (11)

even for uniformly convex objective functions [15]. This motivated researchers to seek extensions of the PRP method with the descent property. In this context, Yu et al. [16] proposed a modified form of β^PRP with a constant C ≥ 1/4, leading to a CG method with the sufficient descent property. Dai and Kou [17] proposed a family of conjugate gradient methods together with an improved Wolfe line search; to accelerate the algorithm, an adaptive restart along the negative gradient is introduced. Jiang and Jian [18] proposed two modified CGMs with disturbance factors based on a variant of the PRP method; the two methods not only generate a sufficient descent direction at each iteration but also converge globally for nonconvex minimization when the strong Wolfe line search is used. A new hybrid conjugate gradient method was presented for unconstrained optimization in [19]; it generates descent directions at every iteration, this property is independent of the step-length line search, and it possesses global convergence under the Wolfe line search.
The main purpose of this paper is to design an efficient algorithm possessing global convergence, the sufficient descent property, and good numerical performance. In the next section, we present a new CG formula and give its properties. In Section 3, the new algorithm and its global convergence result are established. To test and compare the numerical performance of the proposed method, a large number of medium-scale numerical experiments are reported in the last part of this work by tables and performance profiles.
2. The Formula and Its Property
Because the sufficient descent condition (9) is an important property for analyzing the global convergence of CG methods, we hope to find β_k such that d_k satisfies (9). In the following, we propose a sequence {β_k} and prove that it has this property. First, we define a descent sequence (or a sufficient descent sequence): a sequence {β_k} is called a descent sequence (or a sufficient descent sequence) for the CG methods if there exists a constant τ ∈ (0, 1) (or τ ∈ [0, 1)) such that, for all k ≥ 2,
is any given positive constant. It is easy to prove that β_k^VFR is a descent sequence (with τ = μ₁/μ₂²) for CG methods if g_k^T d_k ≤ 0. Formula (16) possesses the sufficient descent property [21], and it has been proved that there exist some nonlinear conjugate gradient formulae possessing the sufficient descent property without any line search, where β_k takes its nonzero value if ‖g_k‖² ≥ |g_k^T d_{k−1}| and β_k = 0 otherwise, (18)
in which μ ≥ 1. Motivated by the ideas in [20, 22] (sufficient descent without any line search), and taking into account the good convergence properties of [10] and the good numerical performance in [14], we propose a new class of formulas for β_k as follows:
the sufficient descent condition (9) holds for all k ≥ 1.
By the proof of Proposition 2, we know that the condition μ₁/μ₂² < 1 is necessary; otherwise, the sufficient descent condition cannot hold.
3. Global Convergence
In this section, we propose an algorithm related to β_k^NPRP+ and then study the global convergence of this algorithm. First, we make the following two assumptions, which have been widely used in the literature to analyze the global convergence of CG methods with inexact line searches.
Step 2. Let x_{k+1} = x_k + t_k d_k and g_{k+1} = g(x_{k+1}). If g_{k+1} = 0, then stop; otherwise, go to Step 3.

Step 3. Compute β_{k+1}^NPRP+ by formulas (19) and (20); then generate d_{k+1} by (3).

Step 4. Set k = k + 1 and go to Step 1.
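Since formulas (19) and (20) are not reproduced in this excerpt, the overall iteration of the steps above can only be sketched with the β-rule left as a pluggable callable. The following is a hypothetical Python sketch, assuming a bisection-style weak Wolfe line search with the paper's parameters δ = 0.01 and σ = 0.1, plus the steepest-descent restart safeguard mentioned in the conclusion; names and structure are ours:

```python
import numpy as np

def weak_wolfe(f, grad, x, d, delta=0.01, sigma=0.1, t0=1.0, max_iter=50):
    """Bisection-style weak Wolfe line search (assumes d is a descent direction)."""
    lo, hi, t = 0.0, np.inf, t0
    fx, gxd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + t * d) > fx + delta * t * gxd:   # Armijo condition fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < sigma * gxd:   # curvature condition fails: grow
            lo = t
            t = 2.0 * t if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t
    return t

def cg_method(f, grad, x0, beta_fn, eps=1e-6, max_iter=1000):
    """Generic nonlinear CG loop: d_{k+1} = -g_{k+1} + beta * d_k, as in (3)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:              # stopping criterion ||g_k|| <= eps
            break
        t = weak_wolfe(f, grad, x, d)
        x_new = x + t * d
        g_new = grad(x_new)
        beta = beta_fn(g_new, g, d)
        d = -g_new + beta * d
        if g_new @ d >= 0:                        # safeguard: restart along -g
            d = -g_new
        x, g = x_new, g_new
    return x
```

Any β-rule with the signature `beta_fn(g_new, g_old, d_old)`, such as PRP+, can be plugged in.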
Since {f(x_k)} is a decreasing sequence, it is clear that the sequence {x_k} is contained in Ω and there exists a constant f* such that

lim_{k→∞} f(x_k) = f*. (27)

By Assumptions A and B, we can deduce that there exists M > 0 such that

‖g_k‖ ≤ M, ∀x_k ∈ Ω. (28)
The following important result was obtained by Zoutendijk [1] and Wolfe [23, 24].

Lemma 4. Suppose f(x) is bounded below and g(x) satisfies the Lipschitz condition. Consider any iteration method of the form (2), where d_k is a descent direction and the step length t_k satisfies the Wolfe conditions; then the Zoutendijk condition

∑_{k≥1} (g_k^T d_k)² / ‖d_k‖² < +∞ (29)

holds.
which contradicts the Zoutendijk condition (29). This shows that (32) holds. The proof of the theorem is complete.
From the proof of the above theorem, we can conclude that any conjugate gradient method with the formula β_k^NPRP+ and any step size technique ensuring the Zoutendijk condition (29) is globally convergent. In particular, the formula β_k^NPRP+ with the weak Wolfe conditions generates a globally convergent method.
4. Numerical Results
All methods above are tested on 56 test problems, where the first 48 problems (from arwhead to woods) in Table 1 are taken from the CUTE library of Bongartz et al. [26] and the others are taken from Moré et al. [27].
DYHS: β_k = max{0, min{β_k^HS, β_k^DY}}, (43)
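A hedged sketch of the hybrid rule (43), with y_k = g_{k+1} − g_k and the usual HS and DY quotients (the function name is ours):

```python
import numpy as np

def beta_dyhs(g_new, g_old, d_old):
    """Hybrid DYHS parameter of formula (43): max{0, min{beta^HS, beta^DY}}.

    beta^HS = g_{k+1}^T y_k / (d_k^T y_k),  beta^DY = ||g_{k+1}||^2 / (d_k^T y_k),
    where y_k = g_{k+1} - g_k. Assumes d_k^T y_k != 0, which the Wolfe
    conditions guarantee for a descent direction d_k.
    """
    y = g_new - g_old
    denom = d_old @ y
    beta_hs = g_new @ y / denom
    beta_dy = g_new @ g_new / denom
    return max(0.0, min(beta_hs, beta_dy))
```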
which was given by Grippo and Lucidi [12]. All codes were written in Matlab 7.5 and run on an HP computer with 1.87 GB RAM under the Windows XP operating system. The parameters are σ = 0.1, δ = 0.01, μ₁ = 1, μ₂ = 2, and λ = 0.3. The iteration stops once the criterion ‖g_k‖ ≤ ε = 10⁻⁶ is satisfied.
In Table 1, "Name" denotes the abbreviation of the test problem, "n" denotes the dimension of the test problem, "Itr/NF/NG" denote the numbers of iterations, function evaluations, and gradient evaluations, respectively, and "Tcpu" denotes the CPU time (in seconds) for solving a test problem.
On the other hand, to show the performance differences clearly among the SWPHS+, SWPPRP+, WWPNPRP+, and WWPDYHS+ methods, we adopt the performance profiles given by Dolan and Moré [28] to compare performance according to Itr, NF, NG, and Tcpu, respectively. Benchmark results are generated by running a solver on a set P of problems and recording information of interest, such as NF and Tcpu. Let S be the set of solvers in comparison. Assume that S consists of n_s
[Figure 1: Performance profiles on Tcpu (ρ(τ) versus τ) for SWPHS+, SWPPRP+, WWPNPRP+, and WWPDYHS+.]
[Figure 2: Performance profiles on NF (ρ(τ) versus τ) for SWPHS+, SWPPRP+, WWPNPRP+, and WWPDYHS+.]
solvers and P consists of n_p problems. For each problem p ∈ P and solver s ∈ S, denote by t_{p,s} the computing time (or the number of function evaluations, etc.) required to solve problem p by solver s. The comparison between different solvers is based on the performance ratio defined by

r_{p,s} = t_{p,s} / min{t_{p,s} : s ∈ S}. (44)

Assume that a large enough parameter r_M ≥ r_{p,s} for all p, s is chosen, and r_{p,s} = r_M if and only if solver s does not solve problem p. Define
ρ_s(τ) as the fraction of problems for which the performance ratio r_{p,s} is within a factor τ ∈ ℝ. Then ρ_s is the (cumulative) distribution function of the performance ratio, and the value ρ_s(1) is the probability that the solver will win over the rest of the solvers.

Based on the performance profile methodology above, four performance figures (Figures 1–4) are generated from Table 1. From these figures, we can see that the WWPNPRP+ method is superior to the other three CGMs on the test problems.
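The ratio (44) and the distribution ρ_s(τ) can be computed directly from a table of recorded costs. A minimal sketch, assuming an n_p × n_s NumPy array with np.inf marking failures and every problem solved by at least one solver:

```python
import numpy as np

def performance_profile(T, tau):
    """Dolan-More performance profile value rho_s(tau) for each solver.

    T is an n_p x n_s array of costs t_{p,s} (np.inf marks a failure;
    each problem is assumed solved by at least one solver). Returns, per
    solver, the fraction of problems whose ratio r_{p,s} (formula (44))
    is at most tau.
    """
    best = T.min(axis=1, keepdims=True)   # min over solvers, per problem
    R = T / best                          # performance ratios r_{p,s}
    return (R <= tau).mean(axis=0)        # fraction within factor tau
```

Plotting `performance_profile(T, tau)` over a grid of τ values reproduces curves like those in Figures 1 and 2.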
5. Conclusion
In this paper, we carefully studied a combination of variations of the formulas β_k^FR and β_k^PRP. We have found that the new formula possesses the following features: (1) β_k^NPRP+ is a descent sequence without any line search; (2) the new method possesses the sufficient descent property and converges globally; (3) the strategy restarts the iteration automatically along the steepest descent direction if a negative value of β_k^NPRP occurs; (4) the initial numerical results are promising.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The authors are very grateful to the anonymous referees for their useful suggestions and comments, which improved the quality of this paper. This work is supported by the Natural Science Foundation of Shaanxi Province (2012JQ9004), the New Star Team of Xi'an University of Posts and Telecommunications (XY201506), the Science Foundation of Liaocheng University (318011303), and the General Project of the National Social Science Foundation (15BGL014).
References
[1] G. Zoutendijk, "Nonlinear programming, computational methods," in Integer and Non-Linear Programming, J. Abadie, Ed., pp. 37–86, North-Holland Publishing, Amsterdam, The Netherlands, 1970.
[2] M. Al-Baali, "Descent property and global convergence of the Fletcher-Reeves method with inexact line search," IMA Journal of Numerical Analysis, vol. 5, no. 1, pp. 121–124, 1985.
[3] Y. H. Dai and Y. Yuan, "An efficient hybrid conjugate gradient method for unconstrained optimization," Annals of Operations Research, vol. 103, no. 1–4, pp. 33–47, 2001.
[4] Z. X. Wei, G. Y. Li, and L. Q. Qi, "Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems," Mathematics of Computation, vol. 77, no. 264, pp. 2173–2193, 2008.
[5] Y. H. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, no. 1, pp. 177–182, 2000.
[6] S. W. Yao, Z. X. Wei, and H. Huang, "A note about WYL's conjugate gradient method and its applications," Applied Mathematics and Computation, vol. 191, no. 2, pp. 381–388, 2007.
[7] X. Z. Jiang, G. D. Ma, and J. B. Jian, "A new global convergent conjugate gradient method with Wolfe line search," Chinese Journal of Engineering Mathematics, vol. 28, no. 6, pp. 779–786, 2011.
[8] X. Z. Jiang, L. Han, and J. B. Jian, "A globally convergent mixed conjugate gradient method with Wolfe line search," Mathematica Numerica Sinica, vol. 34, no. 1, pp. 103–112, 2012.
[9] D. Touati-Ahmed and C. Storey, "Efficient hybrid conjugate gradient techniques," Journal of Optimization Theory and Applications, vol. 64, no. 2, pp. 379–397, 1990.
[10] J. C. Gilbert and J. Nocedal, "Global convergence properties of conjugate gradient methods for optimization," SIAM Journal on Optimization, vol. 2, no. 1, pp. 21–42, 1992.
[11] Y. F. Hu and C. Storey, "Global convergence result for conjugate gradient methods," Journal of Optimization Theory and Applications, vol. 71, no. 2, pp. 399–405, 1991.
[12] L. Grippo and S. Lucidi, "A globally convergent version of the Polak-Ribière conjugate gradient method," Mathematical Programming, Series B, vol. 78, no. 3, pp. 375–391, 1997.
[13] G. H. Yu, L. T. Guan, and G. Y. Li, "Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property," Journal of Industrial and Management Optimization, vol. 4, no. 3, pp. 565–579, 2008.
[14] N. Andrei, "Numerical comparison of conjugate gradient algorithms for unconstrained optimization," Studies in Informatics & Control, vol. 16, no. 4, pp. 333–352, 2007.
[15] Y. H. Dai, Analyses of conjugate gradient methods [Ph.D. thesis], Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, 1997.
[16] G. Yu, L. Guan, and G. Li, "Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property," Journal of Industrial and Management Optimization, vol. 4, no. 3, pp. 565–579, 2008.
[17] Y.-H. Dai and C.-X. Kou, "A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search," SIAM Journal on Optimization, vol. 23, no. 1, pp. 296–320, 2013.
[18] X.-Z. Jiang and J.-B. Jian, "Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization," Nonlinear Dynamics, vol. 77, no. 1-2, pp. 387–397, 2014.
[19] J. B. Jian, L. Han, and X. Z. Jiang, "A hybrid conjugate gradient method with descent property for unconstrained optimization," Applied Mathematical Modelling, vol. 39, pp. 1281–1290, 2015.
[20] Z. Wei, G. Li, and L. Qi, "New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems," Applied Mathematics and Computation, vol. 179, no. 2, pp. 407–430, 2006.
[21] H. Huang, Z. Wei, and Y. Shengwei, "The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1241–1245, 2007.
[22] G. Yu, Y. Zhao, and Z. Wei, "A descent nonlinear conjugate gradient method for large-scale unconstrained optimization," Applied Mathematics and Computation, vol. 187, no. 2, pp. 636–643, 2007.
[23] P. Wolfe, "Convergence conditions for ascent methods," SIAM Review, vol. 11, no. 2, pp. 226–235, 1969.
[24] P. Wolfe, "Convergence conditions for ascent methods. II: Some corrections," SIAM Review, vol. 13, no. 2, pp. 185–188, 1971.
[25] Y. Dai and Y. Yuan, Nonlinear Conjugate Gradient Methods, Science Press of Shanghai, Shanghai, China, 2000.
[26] I. Bongartz, A. R. Conn, N. Gould, and P. L. Toint, "CUTE: constrained and unconstrained testing environment," ACM Transactions on Mathematical Software, vol. 21, no. 1, pp. 123–160, 1995.
[27] J. J. Moré, B. S. Garbow, and K. E. Hillstrom, "Testing unconstrained optimization software," ACM Transactions on Mathematical Software, vol. 7, no. 1, pp. 17–41, 1981.
[28] E. D. Dolan and J. J. Moré, "Benchmarking optimization software with performance profiles," Mathematical Programming, Series B, vol. 91, no. 2, pp. 201–213, 2002.
[27] J J More B S Garbow and K E Hillstrom ldquoTestingunconstrained optimization softwarerdquo ACM Transactions onMathematical Software vol 7 no 1 pp 17ndash41 1981
[28] E D Dolan and J J More ldquoBenchmarking optimization soft-ware with performance profilesrdquo Mathematical ProgrammingSeries B vol 91 no 2 pp 201ndash213 2002
is any given positive constant. It is easy to prove that $\beta_k^{\mathrm{VFR}}$ is a descent sequence (with $\tau = \mu_1/\mu_2$) for CG methods if $g_k^T d_k \le 0$. Formula (16) possesses the sufficient descent property, and it was proved that there exist some nonlinear conjugate gradient formulae possessing the sufficient descent property without any line search, where

$$\beta_k = \begin{cases} \cdots, & \text{if } \|g_k\|^2 \ge |g_k^T d_{k-1}|, \\ 0, & \text{otherwise}, \end{cases} \tag{18}$$

in which $\mu \ge 1$.

Motivated by the ideas in [20, 22] (sufficient descent without any line search), and taking into account the good convergence properties of [10] and the good numerical performance in [14], we propose a new class of formulas for $\beta_k$ as follows.
… such that the sufficient descent condition (9) holds for all $k \ge 1$.

From the proof of Proposition 2, we know that the condition $\mu_1/\mu_2 < 1$ is necessary; otherwise, the sufficient descent condition cannot hold.
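As a concrete illustration (not taken from the paper), the standard sufficient descent condition $g_k^T d_k \le -c\,\|g_k\|^2$ can be checked numerically; the function name and the constant $c$ below are our own choices:

```python
import numpy as np

def satisfies_sufficient_descent(g, d, c=0.25):
    """Check the sufficient descent condition g^T d <= -c * ||g||^2."""
    return float(g @ d) <= -c * float(g @ g)

# The steepest descent direction d = -g always satisfies it with c = 1.
g = np.array([3.0, -4.0])
print(satisfies_sufficient_descent(g, -g, c=1.0))  # True
print(satisfies_sufficient_descent(g, g))          # False: ascent direction
```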
3. Global Convergence

In this section we propose an algorithm related to $\beta_k^{\mathrm{NPRP+}}$ and then study its global convergence. First, we make the following two assumptions, which have been widely used in the literature to analyze the global convergence of CG methods with inexact line searches.
Step 2. Let $x_{k+1} = x_k + t_k d_k$ and $g_{k+1} = g(x_{k+1})$. If $g_{k+1} = 0$, then stop; otherwise, go to Step 3.

Step 3. Compute $\beta_{k+1}^{\mathrm{NPRP+}}$ by formulas (19) and (20); then generate $d_{k+1}$ by (3).

Step 4. Set $k = k + 1$ and go to Step 1.
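Formulas (19), (20), and (3) are not reproduced in this excerpt, so the following sketch substitutes the classical PRP+ rule $\beta_k^{\mathrm{PRP+}} = \max\{\beta_k^{\mathrm{PRP}}, 0\}$, and replaces the Wolfe search with an exact line search on a quadratic, purely to illustrate the Step 0–4 loop structure; all names are illustrative:

```python
import numpy as np

def cg_prp_plus(A, b, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG loop with the PRP+ restart rule beta = max(beta_PRP, 0),
    applied to f(x) = 0.5 x^T A x - b^T x, so that g(x) = A x - b.
    An exact line search stands in for the weak Wolfe search."""
    x = x0.copy()
    g = A @ x - b
    d = -g                                    # Step 0: start along steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:          # stopping test of Step 2
            break
        t = -(g @ d) / (d @ (A @ d))          # exact step size for a quadratic
        x = x + t * d                         # Step 2: x_{k+1} = x_k + t_k d_k
        g_new = A @ x - b
        beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)  # PRP+ (restart if negative)
        d = -g_new + beta * d                 # Step 3: new search direction
        g = g_new                             # Step 4: advance k
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg_prp_plus(A, b, np.zeros(2))
print(np.allclose(A @ x, b, atol=1e-5))       # True: minimizer solves Ax = b
```

The `max(..., 0.0)` truncation is the same automatic-restart mechanism the paper attributes to $\beta_k^{\mathrm{NPRP+}}$: a negative $\beta$ resets the direction to steepest descent.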
Since $\{f(x_k)\}$ is a decreasing sequence, it is clear that the sequence $\{x_k\}$ is contained in $\Omega$ and there exists a constant $f^*$ such that

$$\lim_{k \to \infty} f(x_k) = f^*. \tag{27}$$

By Assumptions A and B, we can deduce that there exists $M > 0$ such that

$$\|g_k\| \le M, \quad \forall x \in \Omega. \tag{28}$$
The following important result was obtained by Zoutendijk [1] and Wolfe [23, 24].

Lemma 4. Suppose $f(x)$ is bounded below and $g(x)$ satisfies the Lipschitz condition. Consider any iteration method of formula (2), where $d_k$ is a descent direction and $t_k$ satisfies the weak Wolfe conditions. Then

$$\sum_{k \ge 1} \frac{(g_k^T d_k)^2}{\|d_k\|^2} < +\infty. \tag{29}$$
which contradicts the Zoutendijk condition (29). This shows that (32) holds. The proof of the theorem is complete.

From the proof of the above theorem, we can conclude that any conjugate gradient method with the formula $\beta_k^{\mathrm{NPRP+}}$, combined with any step-size technique that ensures the Zoutendijk condition (29), is globally convergent. In particular, the formula $\beta_k^{\mathrm{NPRP+}}$ with the weak Wolfe conditions generates a globally convergent method.
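The weak Wolfe conditions invoked here are the standard sufficient-decrease and curvature inequalities. A minimal numerical check (our own sketch, using the paper's parameter values $\delta = 0.01$ and $\sigma = 0.1$ as defaults) looks like:

```python
import numpy as np

def weak_wolfe(f, grad, x, d, t, delta=0.01, sigma=0.1):
    """Check the weak Wolfe conditions for step size t along direction d:
       (i)  f(x + t d) <= f(x) + delta * t * grad(x)^T d   (sufficient decrease)
       (ii) grad(x + t d)^T d >= sigma * grad(x)^T d        (curvature)"""
    gd = grad(x) @ d
    armijo = f(x + t * d) <= f(x) + delta * t * gd
    curvature = grad(x + t * d) @ d >= sigma * gd
    return bool(armijo and curvature)

f = lambda z: 0.5 * float(z @ z)            # simple quadratic test function
grad = lambda z: z
x = np.array([2.0, 0.0])
d = -grad(x)                                # descent direction
print(weak_wolfe(f, grad, x, d, t=1.0))     # True: the exact minimizer step
print(weak_wolfe(f, grad, x, d, t=2.5))     # False: overshoots, decrease fails
```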
4. Numerical Results

All the methods above are tested on 56 test problems; problems 1–48 (from arwhead to woods) in Table 1 are taken from the CUTE library of Bongartz et al. [26], and the others are taken from Moré et al. [27].
The DYHS formula

$$\beta_k = \max\{0, \min\{\beta_k^{\mathrm{HS}}, \beta_k^{\mathrm{DY}}\}\} \tag{43}$$

was given by Grippo and Lucidi [12]. All codes were written in Matlab 7.5 and run on an HP PC with 1.87 GB RAM under the Windows XP operating system. The parameters are $\sigma = 0.1$, $\delta = 0.01$, $\mu_1 = 1$, $\mu_2 = 2$, and $\lambda = 0.3$. The iteration stops when the criterion $\|g_k\| \le \epsilon = 10^{-6}$ is satisfied.
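For reference, the hybrid rule (43) can be computed from the standard HS and DY formulas $\beta_k^{\mathrm{HS}} = g_{k+1}^T y_k / (d_k^T y_k)$ and $\beta_k^{\mathrm{DY}} = \|g_{k+1}\|^2 / (d_k^T y_k)$ with $y_k = g_{k+1} - g_k$; this sketch is ours, not the authors' test code:

```python
import numpy as np

def beta_hs(g_new, g_old, d):
    """Hestenes-Stiefel beta."""
    y = g_new - g_old
    return (g_new @ y) / (d @ y)

def beta_dy(g_new, g_old, d):
    """Dai-Yuan beta."""
    y = g_new - g_old
    return (g_new @ g_new) / (d @ y)

def beta_dyhs(g_new, g_old, d):
    """Hybrid rule (43): beta = max(0, min(beta_HS, beta_DY))."""
    return max(0.0, min(beta_hs(g_new, g_old, d), beta_dy(g_new, g_old, d)))

g_old = np.array([1.0, -2.0])
g_new = np.array([0.5, 0.3])
d = np.array([-1.0, 2.0])
print(beta_dyhs(g_new, g_old, d) >= 0.0)  # True: the truncation keeps beta >= 0
```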
In Table 1, "Name" denotes the abbreviation of the test problem, "n" its dimension, "Itr/NF/NG" the numbers of iterations, function evaluations, and gradient evaluations, respectively, and "Tcpu" the CPU time (in seconds) for solving a test problem.
On the other hand, to show the performance differences clearly between the WWPNPRP+, SWPPRP+, SWPHS+, and WWPDYHS methods, we adopt the performance profiles of Dolan and Moré [28] to compare performance according to Itr, NF, NG, and Tcpu, respectively. Benchmark results are generated by running a solver on a set $\mathcal{P}$ of problems and recording information of interest, such as NF and Tcpu. Let $\mathcal{S}$ be the set of solvers in comparison. Assume that $\mathcal{S}$ consists of $n_s$
Figure 1: Performance profiles on Tcpu ($\rho(\tau)$ versus $\tau$) for SWPHS+, SWPPRP+, WWPNPRP+, and WWPDYHS.
Figure 2: Performance profiles on NF ($\rho(\tau)$ versus $\tau$) for SWPHS+, SWPPRP+, WWPNPRP+, and WWPDYHS.
solvers and $\mathcal{P}$ consists of $n_p$ problems. For each problem $p \in \mathcal{P}$ and solver $s \in \mathcal{S}$, denote by $t_{p,s}$ the computing time (or the number of function evaluations, etc.) required to solve problem $p$ by solver $s$. The comparison between different solvers is based on the performance ratio defined by

$$r_{p,s} = \frac{t_{p,s}}{\min\{t_{p,s} : s \in \mathcal{S}\}}. \tag{44}$$

Assume that a large enough parameter $r_M \ge r_{p,s}$ for all $p, s$ is chosen, and $r_{p,s} = r_M$ if and only if solver $s$ does not solve problem $p$. Define

$$\rho_s(\tau) = \frac{1}{n_p}\,\bigl|\{p \in \mathcal{P} : r_{p,s} \le \tau\}\bigr|, \tag{45}$$

the probability that the performance ratio $r_{p,s}$ of solver $s$ is within a factor $\tau \in \mathbb{R}$. Then $\rho_s$ is the (cumulative) distribution function for the performance ratio, and $\rho_s(1)$ is the probability that the solver will win over the rest of the solvers.

Based on the performance profile theory above, four performance figures, Figures 1–4, can be generated from Table 1. From the four figures we can see that the WWPNPRP+ method is superior to the other three CG methods on the testing problems.
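The ratio (44) and distribution $\rho_s(\tau)$ above can be computed as follows (a sketch of our own; the function and array names are illustrative, and failures are marked with `np.inf` rather than an explicit $r_M$):

```python
import numpy as np

def performance_profile(T, taus):
    """T[p, s]: measure (e.g. Tcpu or NF) of solver s on problem p;
    np.inf marks a failure. Returns rho[s, i] = fraction of problems
    whose ratio r_{p,s} is within factor taus[i], per (44)-(45)."""
    best = T.min(axis=1, keepdims=True)        # per-problem best over solvers
    R = T / best                               # performance ratios r_{p,s}
    n_p, n_s = T.shape
    rho = np.array([[np.mean(R[:, s] <= tau) for tau in taus]
                    for s in range(n_s)])
    return rho

# Two solvers on three problems; solver 1 fails on problem 2.
T = np.array([[1.0, 2.0],
              [3.0, np.inf],
              [2.0, 2.0]])
rho = performance_profile(T, taus=[1.0, 2.0])
print(rho[0])   # solver 0 is within factor 1 on every problem
```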
5. Conclusion

In this paper, we carefully studied the combination of variations of the formulas $\beta_k^{\mathrm{FR}}$ and $\beta_k^{\mathrm{PRP}}$. We have found that the new formula possesses the following features: (1) $\beta_k^{\mathrm{NPRP+}}$ yields a descent sequence without any line search; (2) the new method possesses the sufficient descent property and converges globally; (3) the strategy restarts the iteration automatically along the steepest descent direction if a negative value of $\beta_k^{\mathrm{NPRP}}$ occurs; (4) the initial numerical results are promising.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments

The authors are very grateful to the anonymous referees for their useful suggestions and comments, which improved the quality of this paper. This work is supported by the Natural Science Foundation of Shaanxi Province (2012JQ9004), the New Star Team of Xi'an University of Posts and Telecommunications (XY201506), the Science Foundation of Liaocheng University (318011303), and a General Project of the National Social Science Foundation (15BGL014).
References

[1] G. Zoutendijk, "Nonlinear programming, computational methods," in Integer and Nonlinear Programming, J. Abadie, Ed., pp. 37–86, North-Holland Publishing, Amsterdam, The Netherlands, 1970.
[2] M. Al-Baali, "Descent property and global convergence of the Fletcher-Reeves method with inexact line search," IMA Journal of Numerical Analysis, vol. 5, no. 1, pp. 121–124, 1985.
[3] Y. H. Dai and Y. Yuan, "An efficient hybrid conjugate gradient method for unconstrained optimization," Annals of Operations Research, vol. 103, no. 1–4, pp. 33–47, 2001.
[4] Z. X. Wei, G. Y. Li, and L. Q. Qi, "Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems," Mathematics of Computation, vol. 77, no. 264, pp. 2173–2193, 2008.
[5] Y. H. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, no. 1, pp. 177–182, 2000.
[6] S. W. Yao, Z. X. Wei, and H. Huang, "A note about WYL's conjugate gradient method and its applications," Applied Mathematics and Computation, vol. 191, no. 2, pp. 381–388, 2007.
[7] X. Z. Jiang, G. D. Ma, and J. B. Jian, "A new global convergent conjugate gradient method with Wolfe line search," Chinese Journal of Engineering Mathematics, vol. 28, no. 6, pp. 779–786, 2011.
[8] X. Z. Jiang, L. Han, and J. B. Jian, "A globally convergent mixed conjugate gradient method with Wolfe line search," Mathematica Numerica Sinica, vol. 34, no. 1, pp. 103–112, 2012.
[9] D. Touati-Ahmed and C. Storey, "Efficient hybrid conjugate gradient techniques," Journal of Optimization Theory and Applications, vol. 64, no. 2, pp. 379–397, 1990.
[10] J. C. Gilbert and J. Nocedal, "Global convergence properties of conjugate gradient methods for optimization," SIAM Journal on Optimization, vol. 2, no. 1, pp. 21–42, 1992.
[11] Y. F. Hu and C. Storey, "Global convergence result for conjugate gradient methods," Journal of Optimization Theory and Applications, vol. 71, no. 2, pp. 399–405, 1991.
[12] L. Grippo and S. Lucidi, "A globally convergent version of the Polak-Ribière conjugate gradient method," Mathematical Programming, Series B, vol. 78, no. 3, pp. 375–391, 1997.
[13] G. H. Yu, L. T. Guan, and G. Y. Li, "Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property," Journal of Industrial and Management Optimization, vol. 4, no. 3, pp. 565–579, 2008.
[14] N. Andrei, "Numerical comparison of conjugate gradient algorithms for unconstrained optimization," Studies in Informatics & Control, vol. 16, no. 4, pp. 333–352, 2007.
[15] Y. H. Dai, Analyses of conjugate gradient methods [Ph.D. thesis], Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, 1997.
[16] G. Yu, L. Guan, and G. Li, "Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property," Journal of Industrial and Management Optimization, vol. 4, no. 3, pp. 565–579, 2008.
[17] Y.-H. Dai and C.-X. Kou, "A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search," SIAM Journal on Optimization, vol. 23, no. 1, pp. 296–320, 2013.
[18] X.-Z. Jiang and J.-B. Jian, "Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization," Nonlinear Dynamics, vol. 77, no. 1-2, pp. 387–397, 2014.
[19] J. B. Jian, L. Han, and X. Z. Jiang, "A hybrid conjugate gradient method with descent property for unconstrained optimization," Applied Mathematical Modelling, vol. 39, pp. 1281–1290, 2015.
[20] Z. Wei, G. Li, and L. Qi, "New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems," Applied Mathematics and Computation, vol. 179, no. 2, pp. 407–430, 2006.
[21] H. Huang, Z. Wei, and Y. Shengwei, "The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1241–1245, 2007.
[22] G. Yu, Y. Zhao, and Z. Wei, "A descent nonlinear conjugate gradient method for large-scale unconstrained optimization," Applied Mathematics and Computation, vol. 187, no. 2, pp. 636–643, 2007.
[23] P. Wolfe, "Convergence conditions for ascent methods," SIAM Review, vol. 11, no. 2, pp. 226–235, 1969.
[24] P. Wolfe, "Convergence conditions for ascent methods. II: Some corrections," SIAM Review, vol. 13, no. 2, pp. 185–188, 1971.
[25] Y. Dai and Y. Yuan, Nonlinear Conjugate Gradient Methods, Science Press of Shanghai, Shanghai, China, 2000.
[26] I. Bongartz, A. R. Conn, N. Gould, and P. L. Toint, "CUTE: constrained and unconstrained testing environment," ACM Transactions on Mathematical Software, vol. 21, no. 1, pp. 123–160, 1995.
[27] J. J. Moré, B. S. Garbow, and K. E. Hillstrom, "Testing unconstrained optimization software," ACM Transactions on Mathematical Software, vol. 7, no. 1, pp. 17–41, 1981.
[28] E. D. Dolan and J. J. Moré, "Benchmarking optimization software with performance profiles," Mathematical Programming, Series B, vol. 91, no. 2, pp. 201–213, 2002.
[28] E D Dolan and J J More ldquoBenchmarking optimization soft-ware with performance profilesrdquo Mathematical ProgrammingSeries B vol 91 no 2 pp 201ndash213 2002
is within a factor τ ∈ ℝ of the best solver. Here ρ_s is the (cumulative) distribution function of the performance ratio, and the value of ρ_s(1) is the probability that solver s will win over the rest of the solvers.

Based on the performance profile theory above, four performance figures, namely Figures 1-4, can be generated according to Table 1. From the four figures we can see that the NPRP method is superior to the other three CGMs on the testing problems.
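The performance-profile construction described above (due to Dolan and Moré [28]) can be sketched in a few lines of Python. The function names and the toy cost matrix below are illustrative, not from the paper; a solver's cost may be CPU time or iteration count, as in the experiments.

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profile (a minimal sketch).

    T: (n_problems, n_solvers) array of costs; np.inf marks a failure.
    Returns rho with shape (len(taus), n_solvers), where rho[i, s] is
    the fraction of problems that solver s solves within a factor
    taus[i] of the best solver on that problem.
    """
    T = np.asarray(T, dtype=float)
    best = T.min(axis=1, keepdims=True)   # best cost on each problem
    ratios = T / best                     # performance ratios r_{p,s}
    n_problems = T.shape[0]
    # rho_s(tau): cumulative distribution of the performance ratio
    rho = np.array([(ratios <= tau).sum(axis=0) / n_problems
                    for tau in taus])
    return rho
```

Evaluating the profile at τ = 1 recovers exactly the "win probability" interpretation of ρ_s(1) stated above.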
5. Conclusion

In this paper we carefully studied a combination of variations of the formulas β_k^FR and β_k^PRP. We have found that the new formula possesses the following features: (1) β_k^{NPRP+} yields a descent direction without any line search; (2) the new method possesses the sufficient descent property and converges globally; (3) the strategy restarts the iteration automatically along the steepest descent direction whenever β_k^{NPRP} takes a negative value; (4) the initial numerical results are promising.
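The automatic restart in feature (3) can be sketched generically. The snippet below is an illustration, not the paper's method: `beta_raw` stands in for the β_k^{NPRP} value (whose formula is not reproduced here), and the clamp max{β, 0} is the standard PRP+-style device that forces a steepest-descent restart when β turns negative.

```python
import numpy as np

def cg_direction(g, d_prev, beta_raw):
    """One conjugate gradient direction update with automatic restart.

    g: current gradient; d_prev: previous search direction;
    beta_raw: raw conjugacy parameter (stand-in for beta^NPRP).
    When beta_raw < 0, beta_plus = 0 and the update reduces to the
    steepest descent direction -g, i.e. an automatic restart.
    """
    beta_plus = max(beta_raw, 0.0)
    return -g + beta_plus * d_prev
```

With a negative parameter the returned direction is exactly -g, which is what "restarting along the steepest descent direction" means in practice.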
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The authors are very grateful to the anonymous referees for their useful suggestions and comments, which improved the quality of this paper. This work is supported by the Natural Science Foundation of Shaanxi Province (2012JQ9004), the New Star Team of Xi'an University of Posts and Telecommunications (XY201506), the Science Foundation of Liaocheng University (318011303), and the General Project of the National Social Science Foundation (15BGL014).
References
[1] G. Zoutendijk, "Nonlinear programming, computational methods," in Integer and Nonlinear Programming, J. Abadie, Ed., pp. 37-86, North-Holland Publishing, Amsterdam, The Netherlands, 1970.
[2] M. Al-Baali, "Descent property and global convergence of the Fletcher-Reeves method with inexact line search," IMA Journal of Numerical Analysis, vol. 5, no. 1, pp. 121-124, 1985.
[3] Y. H. Dai and Y. Yuan, "An efficient hybrid conjugate gradient method for unconstrained optimization," Annals of Operations Research, vol. 103, no. 1-4, pp. 33-47, 2001.
[4] Z. X. Wei, G. Y. Li, and L. Q. Qi, "Global convergence of the Polak-Ribiere-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems," Mathematics of Computation, vol. 77, no. 264, pp. 2173-2193, 2008.
[5] Y. H. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, no. 1, pp. 177-182, 2000.
[6] S. W. Yao, Z. X. Wei, and H. Huang, "A note about WYL's conjugate gradient method and its applications," Applied Mathematics and Computation, vol. 191, no. 2, pp. 381-388, 2007.
[7] X. Z. Jiang, G. D. Ma, and J. B. Jian, "A new global convergent conjugate gradient method with Wolfe line search," Chinese Journal of Engineering Mathematics, vol. 28, no. 6, pp. 779-786, 2011.
[8] X. Z. Jiang, L. Han, and J. B. Jian, "A globally convergent mixed conjugate gradient method with Wolfe line search," Mathematica Numerica Sinica, vol. 34, no. 1, pp. 103-112, 2012.
[9] D. Touati-Ahmed and C. Storey, "Efficient hybrid conjugate gradient techniques," Journal of Optimization Theory and Applications, vol. 64, no. 2, pp. 379-397, 1990.
[10] J. C. Gilbert and J. Nocedal, "Global convergence properties of conjugate gradient methods for optimization," SIAM Journal on Optimization, vol. 2, no. 1, pp. 21-42, 1992.
[11] Y. F. Hu and C. Storey, "Global convergence result for conjugate gradient methods," Journal of Optimization Theory and Applications, vol. 71, no. 2, pp. 399-405, 1991.
[12] L. Grippo and S. Lucidi, "A globally convergent version of the Polak-Ribiere conjugate gradient method," Mathematical Programming, Series B, vol. 78, no. 3, pp. 375-391, 1997.
[13] G. H. Yu, L. T. Guan, and G. Y. Li, "Global convergence of modified Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property," Journal of Industrial and Management Optimization, vol. 4, no. 3, pp. 565-579, 2008.
[14] N. Andrei, "Numerical comparison of conjugate gradient algorithms for unconstrained optimization," Studies in Informatics and Control, vol. 16, no. 4, pp. 333-352, 2007.
[15] Y. H. Dai, Analyses of conjugate gradient methods [Ph.D. thesis], Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, 1997.
[16] G. Yu, L. Guan, and G. Li, "Global convergence of modified Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property," Journal of Industrial and Management Optimization, vol. 4, no. 3, pp. 565-579, 2008.
[17] Y.-H. Dai and C.-X. Kou, "A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search," SIAM Journal on Optimization, vol. 23, no. 1, pp. 296-320, 2013.
[18] X.-Z. Jiang and J.-B. Jian, "Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization," Nonlinear Dynamics, vol. 77, no. 1-2, pp. 387-397, 2014.
[19] J. B. Jian, L. Han, and X. Z. Jiang, "A hybrid conjugate gradient method with descent property for unconstrained optimization," Applied Mathematical Modelling, vol. 39, pp. 1281-1290, 2015.
[20] Z. Wei, G. Li, and L. Qi, "New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems," Applied Mathematics and Computation, vol. 179, no. 2, pp. 407-430, 2006.
[21] H. Huang, Z. Wei, and Y. Shengwei, "The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search," Applied Mathematics and Computation, vol. 189, no. 2, pp. 1241-1245, 2007.
[22] G. Yu, Y. Zhao, and Z. Wei, "A descent nonlinear conjugate gradient method for large-scale unconstrained optimization," Applied Mathematics and Computation, vol. 187, no. 2, pp. 636-643, 2007.
[23] P. Wolfe, "Convergence conditions for ascent methods," SIAM Review, vol. 11, no. 2, pp. 226-235, 1969.
[24] P. Wolfe, "Convergence conditions for ascent methods. II: Some corrections," SIAM Review, vol. 13, no. 2, pp. 185-188, 1971.
[25] Y. Dai and Y. Yuan, Nonlinear Conjugate Gradient Methods, Science Press of Shanghai, Shanghai, China, 2000.
[26] I. Bongartz, A. R. Conn, N. Gould, and P. L. Toint, "CUTE: constrained and unconstrained testing environment," ACM Transactions on Mathematical Software, vol. 21, no. 1, pp. 123-160, 1995.
[27] J. J. More, B. S. Garbow, and K. E. Hillstrom, "Testing unconstrained optimization software," ACM Transactions on Mathematical Software, vol. 7, no. 1, pp. 17-41, 1981.
[28] E. D. Dolan and J. J. More, "Benchmarking optimization software with performance profiles," Mathematical Programming, Series B, vol. 91, no. 2, pp. 201-213, 2002.