Mathematics of Computation
ISSN 1088-6842 (online); ISSN 0025-5718 (print)

 

Optimal conditioning of quasi-Newton methods


Authors: D. F. Shanno and P. C. Kettler
Journal: Math. Comp. 24 (1970), 657-664
MSC: Primary 90.58
MathSciNet review: 0274030


Abstract: Quasi-Newton methods accelerate gradient methods for minimizing a function by approximating the inverse Hessian matrix of the function. Several papers in the recent literature have dealt with the generation of classes of approximating matrices as a function of a scalar parameter. This paper derives necessary and sufficient conditions on the range of one such parameter to guarantee stability of the method. It further shows that the parameter affects only the length, not the direction, of the search vector at each step, and uses this result to derive several computational algorithms. The algorithms are evaluated on a series of test problems.
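The one-parameter family of inverse-Hessian approximations referred to in the abstract is, in later terminology, the Broyden class of rank-two updates. As a rough illustration only, not a reproduction of the authors' algorithms, the following NumPy sketch shows how a scalar parameter phi selects an update within such a family, with phi = 0 giving the DFP update and phi = 1 the BFGS update; the function names, the Armijo backtracking line search, and the Rosenbrock test problem are assumptions added for the example.

import numpy as np

def broyden_family_update(H, s, y, phi):
    """Rank-two update of the inverse-Hessian approximation H.

    s = x_{k+1} - x_k, y = g_{k+1} - g_k.  phi = 0 gives the DFP update,
    phi = 1 the BFGS update; intermediate values interpolate within the
    one-parameter (Broyden-class) family.
    """
    Hy = H @ y
    sy = s @ y            # s^T y
    yHy = y @ Hy          # y^T H y
    v = s / sy - Hy / yHy
    return (H - np.outer(Hy, Hy) / yHy
              + np.outer(s, s) / sy
              + phi * yHy * np.outer(v, v))

def quasi_newton(f, grad, x0, phi=1.0, tol=1e-8, max_iter=500):
    """Gradient search using a Broyden-class inverse-Hessian update and a
    simple Armijo backtracking line search (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)                      # initial inverse-Hessian guess
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                          # search direction
        t = 1.0
        while t > 1e-12 and f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5                        # backtrack until sufficient decrease
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if s @ y > 1e-12:                   # keep H positive definite
            H = broyden_family_update(H, s, y, phi)
        x, g = x_new, g_new
    return x

# Example: Rosenbrock's test function from the usual starting point (-1.2, 1).
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
rosen_grad = lambda z: np.array([
    -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
    200 * (z[1] - z[0]**2),
])
print(quasi_newton(rosen, rosen_grad, [-1.2, 1.0], phi=1.0))   # approx. [1, 1]

The backtracking line search here is a stand-in; the paper's own algorithms, its choice of parameterization, and its stability analysis of the admissible parameter range are not reproduced by this sketch.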




Additional Information

DOI: http://dx.doi.org/10.1090/S0025-5718-1970-0274030-6
PII: S 0025-5718(1970)0274030-6
Keywords: Function minimization, quasi-Newton methods, variable metric methods, gradient search, steepest descent methods, stability of search methods, conditioning of search method, Hessian matrix inverse approximations, quadratic convergence
Article copyright: © Copyright 1970 American Mathematical Society