  • Book
  • © 2002

Markov Processes and Controlled Markov Chains


Table of contents (32 chapters)

  1. Front Matter

    Pages i-x
  2. Markov Processes

    1. Front Matter

      Pages 1-1
    2. Convergence Property of Standard Transition Functions

      • Hanjun Zhang, Qixiang Mei, Xiang Lin, Zhenting Hou
      Pages 57-67
    3. Markov Skeleton Processes

      • Zhenting Hou, Zaiming Liu, Jiezhong Zou, Xuerong Chen
      Pages 69-92
  3. Controlled Markov Chains and Decision Processes

    1. Front Matter

      Pages 109-109
    2. Controlled Markov Chains with Utility Functions

      • Seiichi Iwamoto, Takayuki Ueno, Toshiharu Fujita
      Pages 135-149
    3. Classification Problems in MDPs

      • L. C. M. Kallenberg
      Pages 151-165
    4. Optimality Conditions for CTMDP with Average Cost Criterion

      • Xianping Guo, Weiping Zhu
      Pages 167-188
    5. Interval Methods for Uncertain Markov Decision Processes

      • Masami Kurano, Masami Yasuda, Jun-ichi Nakagami
      Pages 223-232
    6. Linear Program for Communicating MDPs with Multiple Constraints

      • Jerzy A. Filar, Xianping Guo
      Pages 245-254
    7. Optimal Switching Problem for Markov Chains

      • A. A. Yushkevich
      Pages 255-286
    8. Approximations of a Controlled Diffusion Model for Renewable Resource Exploitation

      • Sara Pasquali, Wolfgang J. Runggaldier
      Pages 287-302
  4. Stochastic Processes and Martingales

    1. Front Matter

      Pages 303-303

About this book

The general theory of stochastic processes and the more specialized theory of Markov processes evolved enormously in the second half of the twentieth century. In parallel, the theory of controlled Markov chains (or Markov decision processes) was pioneered by control engineers and operations researchers. Researchers in Markov processes and controlled Markov chains have long been aware of the synergies between these two subject areas. However, this may be the first volume dedicated to highlighting those synergies and, almost certainly, it is the first volume to emphasize the contributions of the vibrant and growing Chinese school of probability. The chapters in this book reflect both the maturity and the vitality of modern-day Markov processes and controlled Markov chains. They also provide an opportunity to trace the connections that have emerged between the work of members of the Chinese school of probability and that of European, US, Central and South American, and Asian scholars.
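To make the notion of a controlled Markov chain concrete: in a Markov decision process, each state admits a set of actions, each action determines a transition distribution and an immediate reward, and a standard way to compute an optimal stationary policy under a discounted criterion is value iteration. The following sketch uses a made-up two-state, two-action model (the states, rewards, and probabilities are illustrative assumptions, not taken from any chapter of the book):

```python
# Value iteration for a toy discounted MDP (controlled Markov chain).
# transitions[s][a] = list of (next_state, probability); rewards[s][a] = immediate reward.
# All numbers below are illustrative assumptions.
transitions = {
    0: {0: [(0, 0.9), (1, 0.1)], 1: [(0, 0.2), (1, 0.8)]},
    1: {0: [(0, 0.5), (1, 0.5)], 1: [(0, 0.1), (1, 0.9)]},
}
rewards = {0: {0: 1.0, 1: 0.0}, 1: {0: 0.0, 1: 2.0}}
gamma = 0.9  # discount factor


def q_value(V, s, a):
    """One-step lookahead: immediate reward plus discounted expected value."""
    return rewards[s][a] + gamma * sum(p * V[t] for t, p in transitions[s][a])


def value_iteration(tol=1e-10):
    """Iterate the Bellman optimality operator until the sup-norm change is below tol."""
    V = {s: 0.0 for s in transitions}
    while True:
        V_new = {s: max(q_value(V, s, a) for a in transitions[s]) for s in transitions}
        if max(abs(V_new[s] - V[s]) for s in V) < tol:
            return V_new
        V = V_new


V = value_iteration()
# Greedy (optimal stationary) policy with respect to the converged value function.
policy = {s: max(transitions[s], key=lambda a: q_value(V, s, a)) for s in transitions}
```

In this particular model, the higher reward in state 1 draws the optimal policy toward action 1 in both states; the point of the sketch is only the structure (states, actions, transition law, Bellman iteration), which is the common object of study across the volume's chapters on controlled Markov chains.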

Editors and Affiliations

  • Research Department, Changsha Railway University, Changsha, China

    Zhenting Hou

  • School of Mathematics, University of South Australia, Mawson Lakes, Australia

    Jerzy A. Filar

  • School of Computing and Mathematical Sciences, University of Greenwich, London, UK

    Anyue Chen

