Summary.
This paper establishes several new properties of the nonlinear conjugate gradient method of [5]. First, the method is shown to possess a certain self-adjusting property that holds independently of the line search and of the convexity of the objective function. Second, under mild assumptions on the objective function, the method is globally convergent with a variety of line searches. Third, we find that the search direction generated by the method of [5], rather than the negative gradient direction, can be used to restart any optimization method while still guaranteeing global convergence. Some numerical results are also presented.
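The abstract does not reproduce the update formula of [5]; the sketch below assumes [5] refers to the Dai–Yuan choice β_k = ||g_k||² / (d_{k-1}ᵀ(g_k − g_{k-1})). The backtracking Armijo line search and the steepest-descent safeguard are illustrative choices only (the paper proves convergence under a variety of line searches), so this is a minimal sketch, not the paper's prescribed algorithm.

```python
import numpy as np

def dai_yuan_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Nonlinear CG sketch. The beta formula is the Dai-Yuan choice,
    assumed here to be the method of [5]; the line search and the
    descent safeguard are illustrative, not prescribed by the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # start along the negative gradient
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:        # safeguard: restart if d is not a descent direction
            d = -g
        # Simple Armijo backtracking line search (one of many admissible searches)
        alpha, fx, slope = 1.0, f(x), g.dot(d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        denom = d.dot(g_new - g)             # d_{k-1}^T (g_k - g_{k-1})
        beta = g_new.dot(g_new) / denom if denom != 0 else 0.0
        d = -g_new + beta * d                # new conjugate gradient direction
        x, g = x_new, g_new
    return x

# Example: minimize the strictly convex quadratic 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A.dot(x)) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = dai_yuan_cg(f, grad, np.zeros(2))
```

On this strictly convex quadratic, d^T(g_new − g) = α·dᵀAd > 0 at every step, so the β denominator never vanishes; for general objectives the safeguard above simply falls back to the steepest descent direction.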
Received March 12, 1999 / Revised version received April 25, 2000 / Published online February 5, 2001
Dai, YH. New properties of a nonlinear conjugate gradient method. Numer. Math. 89, 83–98 (2001). https://doi.org/10.1007/PL00005464