Markov Decision Processes with Their Applications


Our Price:   $261.36


Overview

Markov decision processes (MDPs), also called stochastic dynamic programming, were first studied in the 1960s. MDPs can be used to model and solve dynamic decision-making problems that are multi-period and occur in stochastic circumstances. There are three basic branches of MDPs: discrete-time MDPs, continuous-time MDPs and semi-Markov decision processes. Starting from these three branches, many generalized MDP models have been applied to practical problems; these include partially observable MDPs, adaptive MDPs, MDPs in stochastic environments, and MDPs with multiple objectives, constraints or imprecise parameters.

Markov Decision Processes with Their Applications examines MDPs and their applications in the optimal control of discrete event systems (DESs), optimal replacement, and optimal allocation in sequential online auctions. The book presents four main topics used to study optimal control problems:
1. a new methodology for MDPs with the discounted total reward criterion;
2. the transformation of continuous-time MDPs and semi-Markov decision processes into a discrete-time MDP model, which simplifies the application of MDPs;
3. MDPs in stochastic environments, which greatly extends the area where MDPs can be applied;
4. applications of MDPs to the optimal control of discrete event systems, optimal replacement, and optimal allocation in sequential online auctions.

This book is intended for researchers, mathematicians, advanced graduate students, and engineers interested in optimal control, operations research, communications, manufacturing, economics, and electronic commerce.
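
As a rough illustration of the discounted total reward criterion and the replacement setting mentioned above (this sketch is not taken from the book), the following Python snippet solves a toy two-state "keep or replace" discrete-time MDP by value iteration; the states, actions, rewards, transition probabilities and discount factor are all assumed for illustration.

# Illustrative only: a minimal discounted discrete-time MDP solved by value
# iteration. The two-state "keep or replace" machine model is a toy example
# in the spirit of the optimal replacement applications discussed above; all
# numbers below are assumptions, not data from the book.
import numpy as np

states = [0, 1]            # 0 = machine good, 1 = machine worn
actions = [0, 1]           # 0 = keep, 1 = replace
gamma = 0.9                # discount factor for the total reward criterion

# P[a][s, s'] = transition probability, R[a][s] = expected one-step reward
P = {
    0: np.array([[0.8, 0.2],    # keep: a good machine may wear out
                 [0.0, 1.0]]),  # keep: a worn machine stays worn
    1: np.array([[1.0, 0.0],    # replace: machine becomes good again
                 [1.0, 0.0]]),
}
R = {
    0: np.array([5.0, 1.0]),    # keep: good earns 5, worn earns 1
    1: np.array([2.0, 2.0]),    # replace: net reward after purchase cost
}

V = np.zeros(len(states))
for _ in range(500):            # value iteration until approximate convergence
    Q = np.array([R[a] + gamma * P[a] @ V for a in actions])
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0)       # greedy stationary policy
print("optimal value function:", V)
print("optimal policy (0=keep, 1=replace):", policy)

Running the snippet prints the optimal value function and a stationary policy; the same value-iteration pattern extends to larger state and action spaces.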

Full Product Details

Author:   Qiying Hu ,  Wuyi Yue
Publisher:   Springer-Verlag New York Inc.
Imprint:   Springer-Verlag New York Inc.
Edition:   Softcover reprint of hardcover 1st ed. 2008
Volume:   14
Dimensions:   Width: 15.50cm , Height: 1.60cm , Length: 23.50cm
Weight:   0.548kg
ISBN:   9781441942388
ISBN 10:   1441942386
Pages:   297
Publication Date:   19 November 2010
Audience:   Professional and scholarly ,  Professional & Vocational ,  Postgraduate, Research & Scholarly
Format:   Paperback
Publisher's Status:   Active
Availability:   Out of print, replaced by POD
We will order this item for you from a manufactured-on-demand supplier.

Table of Contents

Reviews

From the reviews: "Markov decision processes (MDPs) are one of the most comprehensively investigated branches in mathematics. … Very beneficial also are the notes and references at the end of each chapter. … we can recommend the book … for readers who are familiar with Markov decision theory and who are interested in a new approach to modelling, investigating and solving complex stochastic dynamic decision problems." (Peter Köchel, Mathematical Reviews, Issue 2009 c)



Countries Available

All regions