Abstract: Quasi-Newton methods accelerate gradient methods for minimizing a function by approximating the inverse Hessian matrix of the function. Several papers in the recent literature have dealt with the generation of classes of approximating matrices as a function of a scalar parameter. This paper derives necessary and sufficient conditions on the range of one such parameter to guarantee stability of the method. It further shows that the parameter affects only the length, not the direction, of the search vector at each step, and uses this result to derive several computational algorithms. The algorithms are evaluated on a series of test problems.
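The length-not-direction result can be illustrated numerically. The sketch below is not the paper's algorithm; it uses the standard one-parameter Broyden family of inverse-Hessian updates (DFP at phi = 0, BFGS at phi = 1) with exact line searches on a small quadratic, where, as the abstract states, different parameter values yield search vectors that differ only in length:

```python
import numpy as np

def broyden_update(H, s, y, phi):
    """One-parameter Broyden-family update of the inverse-Hessian
    approximation H, given step s and gradient change y."""
    Hy = H @ y
    sy = s @ y
    yHy = y @ Hy
    # DFP update plus the rank-one correction scaled by phi.
    H_dfp = H + np.outer(s, s) / sy - np.outer(Hy, Hy) / yHy
    v = np.sqrt(yHy) * (s / sy - Hy / yHy)
    return H_dfp + phi * np.outer(v, v)

def search_directions(phi, A, x0, steps=2):
    """Quasi-Newton iteration with exact line search on the quadratic
    f(x) = 0.5 x^T A x; returns the unit search direction at each step."""
    H = np.eye(len(x0))
    x = x0.copy()
    dirs = []
    for _ in range(steps):
        g = A @ x                        # gradient of the quadratic
        d = -H @ g                       # quasi-Newton search direction
        dirs.append(d / np.linalg.norm(d))
        alpha = -(g @ d) / (d @ A @ d)   # exact minimizing step length
        s = alpha * d
        x_new = x + s
        y = A @ x_new - g                # change in gradient
        H = broyden_update(H, s, y, phi)
        x = x_new
    return dirs

A = np.array([[4.0, 1.0], [1.0, 3.0]])
x0 = np.array([1.0, 2.0])
d_dfp = search_directions(0.0, A, x0)    # phi = 0 (DFP)
d_bfgs = search_directions(1.0, A, x0)   # phi = 1 (BFGS)
# At every step the two unit directions are parallel (|cosine| = 1),
# so phi changes only the length of the search vector.
cosines = [abs(a @ b) for a, b in zip(d_dfp, d_bfgs)]
```

Under exact line searches this parallelism holds for every member of the family; with inexact searches the directions generally diverge, which is why the choice of parameter matters computationally.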
- J. G. P. Barnes, An algorithm for solving non-linear equations based on the secant method, Comput. J. 8 (1965), 66–72. MR 181101, DOI 10.1093/comjnl/8.2.113
- W. C. Davidon, Variable Metric Method for Minimization, Argonne National Laboratory Report ANL-5990, November 1959.
- R. Fletcher and M. J. D. Powell, A rapidly convergent descent method for minimization, Comput. J. 6 (1963/64), 163–168. MR 152116, DOI 10.1093/comjnl/6.2.163
- Anthony Ralston, A first course in numerical analysis, McGraw-Hill Book Co., New York-Toronto-London, 1965. MR 0191070
- E. M. Rosen, "A review of quasi-Newton methods in nonlinear equation solving and unconstrained optimization," Nat. Conference of the ACM, Proceedings of the Twenty-First Conference, Thompson Book Co., Washington, D.C., 1966, pp. 37–41.
- D. F. Shanno, Conditioning of quasi-Newton methods for function minimization, Math. Comp. 24 (1970), 647–656. MR 274029, DOI 10.1090/S0025-5718-1970-0274029-X
- © Copyright 1970 American Mathematical Society
- Journal: Math. Comp. 24 (1970), 657–664
- MSC: Primary 90.58
- DOI: https://doi.org/10.1090/S0025-5718-1970-0274030-6
- MathSciNet review: 0274030