A quasi-Newton method with no derivatives
- by John Greenstadt
- Math. Comp. 26 (1972), 145-166
Abstract:
The Davidon formula and others of the “quasi-Newton” class, which are used in the unconstrained minimization of a function $f$, provide a (generally) convergent sequence of approximations to the Hessian of $f$. These formulas, however, require the independent calculation of the gradient of $f$. In this paper, a set of new formulas is derived—using a previously described variational approach—which successively approximates the gradient as well as the Hessian, and uses only function values. These formulas are incorporated into an algorithm which, although still crude, works quite well for various standard test functions. Extensive numerical results are presented.
References
- William C. Davidon, Variable metric method for minimization, SIAM J. Optim. 1 (1991), no. 1, 1–17. MR 1094786, DOI 10.1137/0801001
- R. Fletcher and M. J. D. Powell, A rapidly convergent descent method for minimization, Comput. J. 6 (1963/64), 163–168. MR 152116, DOI 10.1093/comjnl/6.2.163
- C. G. Broyden, Quasi-Newton methods and their application to function minimisation, Math. Comp. 21 (1967), 368–381. MR 224273, DOI 10.1090/S0025-5718-1967-0224273-2
- G. W. Stewart III, A modification of Davidon’s minimization method to accept difference approximations of derivatives, J. Assoc. Comput. Mach. 14 (1967), 72–83. MR 216737, DOI 10.1145/321371.321377
- D. Goldfarb, Sufficient conditions for the convergence of a variable metric algorithm, Optimization (Sympos., Univ. Keele, Keele, 1968) Academic Press, London, 1969, pp. 273–281. MR 0287905
- G. P. McCormick and J. D. Pearson, Variable metric methods and unconstrained optimization, Optimization (Sympos., Univ. Keele, Keele, 1968) Academic Press, London, 1969, pp. 307–325. MR 0267905
- J. Greenstadt, Variations on variable-metric methods. (With discussion), Math. Comp. 24 (1970), 1–22. MR 258248, DOI 10.1090/S0025-5718-1970-0258248-4
- H. H. Rosenbrock, An automatic method for finding the greatest or least value of a function, Comput. J. 3 (1960/61), 175–184. MR 136042, DOI 10.1093/comjnl/3.3.175
- E. M. L. Beale, On an iterative method for finding a local minimum of a function of more than one variable, Stat. Tech. Res. Grp., Technical Report #25, Princeton University, Princeton, N. J., 1958.
- M. J. D. Powell, An iterative method for finding stationary values of a function of several variables, Comput. J. 5 (1962), 147–151.
- M. J. D. Powell, An efficient method for finding the minimum of a function of several variables without calculating derivatives, Comput. J. 7 (1964), 155–162. MR 187376, DOI 10.1093/comjnl/7.2.155
- Thomas P. Vogl, Recent advances in optimization techniques, John Wiley & Sons, Inc., New York-London-Sydney, 1966. Edited by Abraham Lavi and Thomas P. Vogl. MR 0198722
- Donald Goldfarb, A family of variable-metric methods derived by variational means, Math. Comp. 24 (1970), 23–26. MR 258249, DOI 10.1090/S0025-5718-1970-0258249-6
- J. Kowalik and M. R. Osborne, Methods for unconstrained optimization problems, Modern Analytic and Computational Methods in Science and Mathematics, No. 13, American Elsevier Publishing Co., Inc., New York, 1968. MR 0323104
- A. R. Colville, A Comparative Study of Non-Linear Programming Codes, IBM Tech. Rep. #320-2949, 1968.
- Anthony V. Fiacco and Garth P. McCormick, Nonlinear programming: Sequential unconstrained minimization techniques, John Wiley & Sons, Inc., New York-London-Sydney, 1968. MR 0243831
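The abstract describes quasi-Newton minimization using function values only, with the gradient itself approximated rather than supplied. As a rough illustration of that general idea — not Greenstadt's variational formulas — the following sketch combines a standard DFP inverse-Hessian update with forward-difference gradient estimates (in the spirit of Stewart's modification cited above), applied to Rosenbrock's test function from the reference list. All function and parameter names here are illustrative, not from the paper.

```python
import math

def rosenbrock(x):
    # Rosenbrock's "banana" valley, a standard test function (see reference list)
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def fd_gradient(f, x, h=1e-7):
    # Forward-difference gradient estimate: uses function values only.
    fx = f(x)
    g = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        g.append((f(xp) - fx) / h)
    return g

def dfp_update(H, s, y):
    # DFP inverse-Hessian update:
    #   H <- H + s s^T / (s.y) - (H y)(H y)^T / (y.H y)
    n = len(s)
    sy = sum(si * yi for si, yi in zip(s, y))
    Hy = [sum(H[i][j] * y[j] for j in range(n)) for i in range(n)]
    yHy = sum(y[i] * Hy[i] for i in range(n))
    if sy <= 1e-12 or yHy <= 1e-12:
        return H  # skip the update when the curvature condition fails
    return [[H[i][j] + s[i] * s[j] / sy - Hy[i] * Hy[j] / yHy
             for j in range(n)] for i in range(n)]

def minimize(f, x0, iters=500, tol=1e-6):
    x = list(x0)
    n = len(x)
    H = [[float(i == j) for j in range(n)] for i in range(n)]  # start from identity
    g = fd_gradient(f, x)
    for _ in range(iters):
        d = [-sum(H[i][j] * g[j] for j in range(n)) for i in range(n)]
        fx, t = f(x), 1.0
        # Backtrack on function values until the step does not increase f.
        while f([x[i] + t * d[i] for i in range(n)]) > fx:
            t *= 0.5
            if t < 1e-12:
                return x  # no decrease along d; give up
        s = [t * d[i] for i in range(n)]
        x_new = [x[i] + s[i] for i in range(n)]
        g_new = fd_gradient(f, x_new)
        y = [g_new[i] - g[i] for i in range(n)]
        H = dfp_update(H, s, y)
        x, g = x_new, g_new
        if math.sqrt(sum(gi * gi for gi in g)) < tol:
            break
    return x

x_min = minimize(rosenbrock, [-1.2, 1.0])
```

Greenstadt's contribution goes further than this sketch: his formulas update the gradient approximation itself as part of the iteration, rather than re-estimating it by finite differences at every point.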
Additional Information
- © Copyright 1972 American Mathematical Society
- Journal: Math. Comp. 26 (1972), 145-166
- MSC: Primary 65K05
- DOI: https://doi.org/10.1090/S0025-5718-1972-0305592-X
- MathSciNet review: 0305592