Title:
Deterministic and stochastic optimal control
Series:
Applications of mathematics 01
Publication Information:
New York : Springer-Verlag, 1975
ISBN:
9780387901558
Available:

Item Barcode      Call Number         Material Type       Item Category 1
30000001749781    QA402.3.F52 1975    Open Access Book    Book
30000001319684    QA402.3.F52 1975    Open Access Book    Book

Summary

This book may be regarded as consisting of two parts. In Chapters I-IV we present what we regard as essential topics in an introduction to deterministic optimal control theory. This material has been used by the authors for one-semester graduate-level courses at Brown University and the University of Kentucky. The simplest problem in calculus of variations is taken as the point of departure, in Chapter I. Chapters II, III, and IV deal with necessary conditions for an optimum, existence and regularity theorems for optimal controls, and the method of dynamic programming. The beginning reader may find it useful first to learn the main results, corollaries, and examples. These tend to be found in the earlier parts of each chapter. We have deliberately postponed some difficult technical proofs to later parts of these chapters.

In the second part of the book we give an introduction to stochastic optimal control for Markov diffusion processes. Our treatment follows the dynamic programming method, and depends on the intimate relationship between second-order partial differential equations of parabolic type and stochastic differential equations. This relationship is reviewed in Chapter V, which may be read independently of Chapters I-IV. Chapter VI is based to a considerable extent on the authors' work in stochastic control since 1961. It also includes two other topics important for applications, namely, the solution to the stochastic linear regulator and the separation principle.
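
For orientation only (a standard illustration, not quoted from the book): the PDE-SDE relationship referred to above is the one in which, for a scalar diffusion $dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t$, the function $u(t,x) = \mathbb{E}[\varphi(X_T) \mid X_t = x]$ satisfies the second-order parabolic (Kolmogorov backward) equation

\[
  \frac{\partial u}{\partial t} + b(x)\,\frac{\partial u}{\partial x}
  + \tfrac{1}{2}\,\sigma^{2}(x)\,\frac{\partial^{2} u}{\partial x^{2}} = 0,
  \qquad u(T,x) = \varphi(x),
\]

where $b$, $\sigma$, and $\varphi$ are generic placeholder notation rather than the book's own.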


Table of Contents

1 The Simplest Problem in Calculus of Variations
2 The Optimal Control Problem
3 Existence and Continuity Properties of Optimal Controls
4 Dynamic Programming
5 Stochastic Differential Equations and Markov Diffusion Processes
6 Optimal Control of Markov Diffusion Processes
Appendices