
Approximate solutions of the Bellman equation of deterministic control theory

Published in: Applied Mathematics and Optimization

Abstract

We consider an infinite horizon discounted optimal control problem and its time discretized approximation, and study the rate of convergence of the approximate solutions to the value function of the original problem. In particular we prove the rate is of order 1 as the discretization step tends to zero, provided a semiconcavity assumption is satisfied. We also characterize the limit of the optimal controls for the approximate problems within the framework of the theory of relaxed controls.
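The time discretization mentioned in the abstract can be illustrated with a small numerical sketch. The following is a minimal, hypothetical 1-D example (not taken from the paper): dynamics x' = a with controls a in {-1, 0, 1}, running cost f(x, a) = x², and discount rate λ = 1. The discrete Bellman equation v_h(x) = min_a [ h·f(x, a) + (1 - λh)·v_h(x + h·a) ] is a contraction with factor (1 - λh), so value iteration on a grid converges; all function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def solve_discrete_bellman(h, lam=1.0, xmax=2.0, n=401, tol=1e-10):
    """Value iteration for the discrete Bellman equation
        v_h(x) = min_a [ h*f(x,a) + (1 - lam*h)*v_h(x + h*a) ]
    on a uniform grid, with linear interpolation for v_h(x + h*a).
    Hypothetical example problem: f(x,a) = x**2, a in {-1, 0, 1}."""
    xs = np.linspace(-xmax, xmax, n)
    v = np.zeros(n)
    controls = [-1.0, 0.0, 1.0]
    for _ in range(100000):
        candidates = []
        for a in controls:
            # Clip the successor state so it stays inside the grid.
            x_next = np.clip(xs + h * a, -xmax, xmax)
            candidates.append(h * xs**2 + (1 - lam * h) * np.interp(x_next, xs, v))
        v_new = np.min(candidates, axis=0)
        if np.max(np.abs(v_new - v)) < tol:
            return xs, v_new
        v = v_new
    return xs, v

xs, v = solve_discrete_bellman(h=0.05)
# v_h is nonnegative, symmetric in x, and vanishes at x = 0,
# where the control a = 0 (stay put, zero cost) is optimal.
```

Running the solver for a decreasing sequence of steps h and comparing the resulting v_h would exhibit the order-1 convergence rate established in the paper, under the stated semiconcavity assumption.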




Additional information

Communicated by W. Fleming

This work was done while the authors were visiting members of The Department of Mathematics of The University of Maryland at College Park.


Cite this article

Dolcetta, I.C., Ishii, H. Approximate solutions of the Bellman equation of deterministic control theory. Appl Math Optim 11, 161–181 (1984). https://doi.org/10.1007/BF01442176
