Series: Stochastic Modelling and Applied Probability, Vol. 25
Fleming, Wendell H., Soner, Halil Mete
2nd ed. 2006, XVII, 429 p.
Springer eBooks may be purchased by end-customers only and are sold without copy protection (DRM free). Instead, all eBooks include personalized watermarks. This means you can read the Springer eBooks across numerous devices such as Laptops, eReaders, and tablets.
You can pay for Springer eBooks with Visa, Mastercard, American Express or Paypal.
After the purchase you can directly download the eBook file or read it online in our Springer eBook Reader. Furthermore, your eBook will be stored in your MySpringer account, so you can always re-download it.
Price for USA (net)
ISBN 978-0-387-31071-8
digitally watermarked, no DRM
Included Format: PDF
download immediately after purchase
Hardcover version
You can pay for Springer Books with Visa, Mastercard, American Express or Paypal.
Standard shipping is free of charge for individual customers.
Price for USA (net)
ISBN 978-0-387-26045-7
free shipping for individuals worldwide
usually dispatched within 3 to 5 business days
Softcover (also known as softback) version.
Price for USA (net)
ISBN 978-1-4419-2078-2
This book is intended as an introduction to optimal stochastic control for continuous time Markov processes and to the theory of viscosity solutions. The authors approach stochastic control problems by the method of dynamic programming. The fundamental equation of dynamic programming is a nonlinear evolution equation for the value function. For controlled Markov diffusion processes, this becomes a nonlinear partial differential equation of second order, called a Hamilton-Jacobi-Bellman (HJB) equation. Typically, the value function is not smooth enough to satisfy the HJB equation in a classical sense. Viscosity solutions provide a framework in which to study HJB equations and to prove continuous dependence of solutions on problem data. The theory is illustrated by applications from engineering, management science, and financial economics.
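To fix ideas, the HJB equation described above can be written in a standard form for a finite-horizon controlled diffusion (the notation below is generic, chosen for illustration, and is not taken from the book's own text):

```latex
% Controlled diffusion:  dX_s = b(X_s, u_s)\,ds + \sigma(X_s, u_s)\,dW_s
% Value function:  V(t,x) = \inf_{u} \mathbb{E}\Big[ \int_t^T L(X_s, u_s)\,ds + \psi(X_T) \Big]
% Dynamic programming leads to the second-order nonlinear PDE:
\[
  \frac{\partial V}{\partial t}
  + \min_{u \in U} \Big[
      b(x,u) \cdot D_x V
      + \tfrac{1}{2}\,\operatorname{tr}\!\big( \sigma(x,u)\sigma(x,u)^{\top} D_x^2 V \big)
      + L(x,u)
    \Big] = 0,
  \qquad V(T,x) = \psi(x).
\]
```

When the value function fails to be twice differentiable, this equation is interpreted in the viscosity sense rather than classically.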
In this second edition, new material on applications to mathematical finance has been added. Concise introductions to risk-sensitive control theory, nonlinear H-infinity control and differential games are also included.
Review of the earlier edition:
"This book is highly recommended to anyone who wishes to learn the dynamic programming principle applied to optimal stochastic control for diffusion processes. Without any doubt, this is a fine book and most likely it is going to become a classic in the area... ."
SIAM Review, 1994
Content Level » Research
Keywords » Markov Chain - Markov process - Optimal control - control - control theory - optimization - programming
Related subjects » Applications - Operations Research & Decision Theory - Probability Theory and Stochastic Processes - Quantitative Finance - Robotics
Get alerted on new Springer publications in the subject area of Probability Theory and Stochastic Processes.