
Mathematics of Computation

ISSN 1088-6842(online) ISSN 0025-5718(print)



Self-scaling variable metric algorithms without line search for unconstrained minimization

Author: Shmuel S. Oren
Journal: Math. Comp. 27 (1973), 873-885
MSC: Primary 65K05
Corrigendum: Math. Comp. 28 (1974), 887.
MathSciNet review: 0329259


Abstract: This paper introduces a new class of quasi-Newton algorithms for unconstrained minimization in which no line search is necessary and the inverse Hessian approximations are positive definite. These algorithms are based on a two-parameter family of rank-two updating formulae used earlier, with line search, in self-scaling variable metric algorithms. It is proved that, in the quadratic case, the new algorithms converge at least weakly superlinearly. A special case of the above algorithms was implemented and tested numerically on several test functions. In this implementation, however, cubic interpolation was performed whenever the objective function was not satisfactorily decreased on the first "shot" (with unit step size); this occurred rarely, except on very difficult functions. The numerical results indicate that the new algorithm is competitive with, and often superior to, previous methods.
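The kind of iteration the abstract describes can be sketched as follows. This is an illustrative reimplementation, not the paper's code: the quadratic test objective, the parameter choice theta = 0, the iteration count, and the simple step-halving fallback (standing in for the cubic interpolation mentioned in the abstract) are all assumptions made for the sketch. Each step tries the unit step first, then applies the self-scaling rank-two update H+ = gamma * [H - (H y yᵀ H)/(yᵀ H y) + theta * v vᵀ] + (s sᵀ)/(sᵀ y), with gamma = (sᵀ y)/(yᵀ H y).

```python
# Hedged sketch of a self-scaling variable metric step without line search.
# Pure-Python linear algebra helpers keep the example self-contained.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matvec(M, v):
    return [dot(row, v) for row in M]

def outer(a, b):
    return [[x * y for y in b] for x in a]

def mat_add(A, B, alpha=1.0, beta=1.0):
    return [[alpha * p + beta * q for p, q in zip(ra, rb)]
            for ra, rb in zip(A, B)]

def f(x):  # illustrative convex quadratic test objective
    return 0.5 * (2.0 * x[0] ** 2 + 10.0 * x[1] ** 2)

def grad(x):
    return [2.0 * x[0], 10.0 * x[1]]

def ssvm_minimize(x, theta=0.0, iters=30):
    n = len(x)
    H = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # H0 = I
    for _ in range(iters):
        g = grad(x)
        d = [-v for v in matvec(H, g)]        # quasi-Newton direction
        t = 1.0                               # unit step tried first
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) and t > 1e-8:
            t *= 0.5                          # crude fallback (paper: cubic interp.)
        x_new = [xi + t * di for xi, di in zip(x, d)]
        s = [a - b for a, b in zip(x_new, x)]
        y = [a - b for a, b in zip(grad(x_new), g)]
        sy, Hy = dot(s, y), matvec(H, y)
        yHy = dot(y, Hy)
        if sy > 1e-12 and yHy > 1e-12:        # keep H positive definite
            gamma = sy / yHy                  # self-scaling factor
            v = [(si / sy - hi / yHy) * yHy ** 0.5 for si, hi in zip(s, Hy)]
            core = mat_add(H, outer(Hy, Hy), 1.0, -1.0 / yHy)
            core = mat_add(core, outer(v, v), 1.0, theta)
            H = mat_add(core, outer(s, s), gamma, 1.0 / sy)
        x = x_new
    return x

x_star = ssvm_minimize([5.0, 3.0])
print(x_star)  # approaches the minimizer [0, 0]
```

On this quadratic the scaling factor gamma rescales H toward the correct eigenvalue range, so the unit step is eventually accepted without any fallback, which is the behavior the abstract reports for all but very difficult functions.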



Additional Information

Keywords: Function minimization, unconstrained minimization, quasi-Newton methods, variable metric methods, self-scaling variable metric algorithms, scaling, quasi-Newton algorithms with line search, gradient methods, Hessian matrix inverse approximation, conditioning of search methods, convergence rates
Article copyright: © Copyright 1973 American Mathematical Society
