Abstract
In this paper we present numerical experiments with an algorithm that exploits the partial separability of an optimization problem. This research is motivated by the very large number of minimization problems in many variables that have this particular property. The results discussed in the paper cover both the unconstrained and bound-constrained cases, as well as the numerical estimation of gradient vectors. It is shown that exploiting this underlying structure can lead to efficient algorithms, especially when the problem dimension is large.
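As background for readers unfamiliar with the term, a partially separable function (in the standard notation of the partitioned quasi-Newton literature; the symbols below are assumptions, not drawn from the abstract itself) is one that decomposes into a sum of element functions, each depending on only a small subset of the variables:

```latex
% A partially separable function of x in R^n:
% f is a sum of m "element functions" f_i, each depending on
% only a few of the n variables (equivalently, each f_i has a
% large invariant subspace), so its Hessian decomposes too.
f(x) \;=\; \sum_{i=1}^{m} f_i(x),
\qquad
\nabla^2 f(x) \;=\; \sum_{i=1}^{m} \nabla^2 f_i(x).
```

Because each elemental Hessian \(\nabla^2 f_i(x)\) is small relative to the full \(n \times n\) Hessian, curvature information can be accumulated element by element; this is the kind of structure the experiments reported in the paper exploit.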
© 1984 Springer-Verlag
Griewank, A., Toint, P.L. (1984). Numerical experiments with partially separable optimization problems. In: Griffiths, D.F. (ed.) Numerical Analysis. Lecture Notes in Mathematics, vol. 1066. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0099526
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-13344-5
Online ISBN: 978-3-540-38881-4