Nonlinear Conjugate Gradient Methods for Unconstrained Optimization
Keywords: Unconstrained optimization · Conjugate gradient method · Sufficient descent property · Inexact line search · Modified BFGS formula
In this paper, we study conjugate gradient methods for unconstrained optimization and discuss their numerical performance. We first review the conjugate gradient literature: building on recent research, we survey the six classical conjugate gradient methods together with the global convergence, sufficient descent, conjugacy, scaling, and hybrid techniques.

1. Based on a modified BFGS formula and the memoryless technique, we introduce a new conjugate gradient-like method. Using the sufficient descent technique of Cheng, we scale the direction produced by the method so that the search direction satisfies the sufficient descent property independently of the line search. Using the global convergence technique of Gilbert and Nocedal, we prove that the method is globally convergent.

2. We modify the CG-DESCENT method by truncating the modified BFGS formula, and we scale the search direction produced by the modified CG-DESCENT method. Here the purpose of the scaling is not to produce the sufficient descent property but to improve the numerical results. In our numerical experiments, all codes are written in Fortran and the test functions are taken from the CUTEr collection. The numerical results show that both proposed methods are efficient; in particular, on the chosen test functions, the results of the second method are similar to those of CG-DESCENT.
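To illustrate the scaling idea described in point 1, the following is a minimal sketch of a nonlinear conjugate gradient iteration in which the first term of the direction is rescaled so that g_k^T d_k = -||g_k||^2 holds regardless of the step size. This is not the paper's actual algorithm: the PRP+ choice of beta, the Armijo backtracking line search, and the Rosenbrock test function are generic illustrative assumptions.

```python
import numpy as np

def descent_scaled_cg(f, grad, x0, tol=1e-6, max_iter=50000):
    """Nonlinear CG with a Cheng-style descent scaling (illustrative sketch).

    The direction update
        d_new = -(1 + beta * (g_new . d) / ||g_new||^2) * g_new + beta * d
    guarantees g_new . d_new = -||g_new||^2, i.e. sufficient descent
    independent of the line search. beta here is PRP+ (an assumption,
    not necessarily the paper's choice).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g.copy()
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Inexact (Armijo backtracking) line search.
        alpha, c1, fx, slope = 1.0, 1e-4, f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        gn2 = g_new @ g_new
        if gn2 == 0.0:          # exact stationary point
            g = g_new
            break
        # PRP+ beta, truncated at zero.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        # Scale the -g_new term so that g_new . d = -||g_new||^2.
        d = -(1.0 + beta * (g_new @ d) / gn2) * g_new + beta * d
        g = g_new
    return x

# Example: minimize the Rosenbrock function, minimizer (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = descent_scaled_cg(f, grad, np.array([-1.2, 1.0]))
```

The scaling factor is chosen so that the extra term beta * (g_new . d) in g_new . d_new cancels exactly, which is why the descent property holds for any step size produced by the line search.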