Foundations of Non-Stationary Dynamic Programming with Discrete Time Parameter

Author: Karl Hinderer
Publisher:
ISBN: 9780387049564
Category : Dynamic programming
Languages : en
Pages : 160

Foundations of Non-stationary Dynamic Programming with Discrete Time Parameter

Author: K. Hinderer
Publisher: Springer Science & Business Media
ISBN: 3642462294
Category : Business & Economics
Languages : en
Pages : 171

Book Description
The present work is an extended version of a manuscript of a course which the author taught at the University of Hamburg during summer 1969. The main purpose has been to give a rigorous foundation of stochastic dynamic programming in a manner which makes the theory easily applicable to many different practical problems. We mention the following features which should serve our purpose. a) The theory is built up for non-stationary models, thus making it possible to treat e.g. dynamic programming under risk, dynamic programming under uncertainty, Markovian models, stationary models, and models with finite horizon from a unified point of view. b) We use that notion of optimality (p-optimality) which seems to be most appropriate for practical purposes. c) Since we restrict ourselves to the foundations, we did not include practical problems and ways to their numerical solution, but we give (cf. Section 8) a number of problems which show the diversity of structures accessible to non-stationary dynamic programming. The main sources were the papers of Blackwell (65), Strauch (66) and Maitra (68) on stationary models with general state and action spaces and the papers of Dynkin (65), Hinderer (67) and Sirjaev (67) on non-stationary models. A number of results should be new, whereas most theorems constitute extensions (usually from stationary models to non-stationary models) or analogues to known results.
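
To make the non-stationary terminology concrete, a finite-horizon backward recursion with stage-dependent data can be written as follows (our notation for illustration, not necessarily the book's):

\[
V_N(s) = 0, \qquad V_n(s) = \sup_{a \in D_n(s)} \Big[\, r_n(s,a) + \int V_{n+1}(s')\, p_n(ds' \mid s,a) \Big], \qquad n = N-1, \dots, 0,
\]

where the admissible action sets D_n, rewards r_n and transition laws p_n may all change with the stage n; stationary models are the special case in which they do not.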

Decision & Control in Management Science

Author: Georges Zaccour
Publisher: Springer Science & Business Media
ISBN: 1475735618
Category : Business & Economics
Languages : en
Pages : 419

Book Description
Decision & Control in Management Science analyzes emerging decision problems in the management and engineering sciences. It is divided into five parts. The first part explores methodological issues involved in the optimization of deterministic and stochastic dynamical systems. The second part describes approaches to modeling energy and environmental systems and draws policy implications related to the mitigation of pollutants. The third part applies quantitative techniques to problems in finance and economics, such as hedging of options, inflation targeting, and equilibrium asset pricing. The fourth part considers a series of problems in production systems. Optimization methods are put forward to provide optimal policies for inventory management, transfer-line, flow-shop, and other industrial problems. The last part covers game theory. Chapters range from theoretical issues to applications in politics and interactions in franchising systems. Decision & Control in Management Science is an excellent reference covering methodological issues and applications in operations research, optimal control, and dynamic games.

New Trends in Dynamic Games and Applications

Author: Jan G. Olsder
Publisher: Springer Science & Business Media
ISBN: 1461242746
Category : Mathematics
Languages : en
Pages : 478

Book Description
The theory of dynamic games is very rich in nature and very much alive! If the reader does not already agree with this statement, I hope he/she will surely do so after having consulted the contents of the current volume. The activities which fall under the heading of 'dynamic games' cannot easily be put into one scientific discipline. On the theoretical side one deals with differential games, difference games (the underlying models are described by differential and difference equations, respectively) and games based on Markov chains, with deterministic and stochastic games, zero-sum and nonzero-sum games, two-player and many-player games - all under various forms of equilibria. On the practical side, one sees applications to economics (stimulated by the recent Nobel prize for economics which went to three prominent scientists in game theory), biology, management science, and engineering. The contents of this volume are primarily based on selected presentations made at the Sixth International Symposium on Dynamic Games and Applications, held in St Jovite, Quebec, Canada, 13-15 July 1994. Every paper that appears in this volume has passed through a stringent reviewing process, as is the case with publications for archival technical journals. This conference, as well as its predecessor which was held in Grimentz in 1992, took place under the auspices of the International Society of Dynamic Games (ISDG), established in 1990. One of the activities of the ISDG is the publication of these Annals. The contributions in this volume have been grouped around five themes.

Handbook of Markov Decision Processes

Author: Eugene A. Feinberg
Publisher: Springer Science & Business Media
ISBN: 1461508053
Category : Business & Economics
Languages : en
Pages : 560

Book Description
Eugene A. Feinberg, Adam Shwartz. This volume deals with the theory of Markov Decision Processes (MDPs) and their applications. Each chapter was written by a leading expert in the respective area. The papers cover major research areas and methodologies, and discuss open questions and future research directions. The papers can be read independently, with the basic notation and concepts of Section 1.2. Most chapters should be accessible by graduate or advanced undergraduate students in fields of operations research, electrical engineering, and computer science. 1.1 AN OVERVIEW OF MARKOV DECISION PROCESSES The theory of Markov Decision Processes, also known under several other names including sequential stochastic optimization, discrete-time stochastic control, and stochastic dynamic programming, studies sequential optimization of discrete-time stochastic systems. The basic object is a discrete-time stochastic system whose transition mechanism can be controlled over time. Each control policy defines the stochastic process and values of objective functions associated with this process. The goal is to select a "good" control policy. In real life, decisions that humans and computers make on all levels usually have two types of impacts: (i) they cost or save time, money, or other resources, or they bring revenues, as well as (ii) they have an impact on the future, by influencing the dynamics. In many situations, decisions with the largest immediate profit may not be good in view of future events. MDPs model this paradigm and provide results on the structure and existence of good policies and on methods for their calculation.
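
To illustrate the paradigm sketched above, the following minimal Python example computes a "good" policy for a small discounted MDP by value iteration; the transition probabilities, rewards, and discount factor are made-up toy data, not anything taken from the handbook.

# Minimal value-iteration sketch for a small discounted MDP (toy data, not from the handbook).
import numpy as np

n_states, n_actions, gamma = 3, 2, 0.95

# P[a, s, s'] = probability of moving from state s to s' under action a (made-up numbers).
P = np.array([
    [[0.8, 0.2, 0.0], [0.1, 0.8, 0.1], [0.0, 0.2, 0.8]],   # action 0
    [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5], [0.5, 0.0, 0.5]],   # action 1
])
# r[s, a] = immediate reward for choosing action a in state s (made-up numbers).
r = np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])

V = np.zeros(n_states)
for _ in range(1000):
    # Q[s, a] = immediate reward plus discounted expected value of the next state.
    Q = r + gamma * np.einsum('ast,t->sa', P, V)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = Q.argmax(axis=1)   # a "good" stationary control policy in the sense described above
print(V, policy)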

Zero-Sum Discrete-Time Markov Games with Unknown Disturbance Distribution

Author: J. Adolfo Minjárez-Sosa
Publisher: Springer Nature
ISBN: 3030357201
Category : Mathematics
Languages : en
Pages : 129

Book Description
This SpringerBrief deals with a class of discrete-time zero-sum Markov games with Borel state and action spaces, and possibly unbounded payoffs, under discounted and average criteria, whose state process evolves according to a stochastic difference equation. The corresponding disturbance process is an observable sequence of independent and identically distributed random variables with unknown distribution for both players. Unlike the standard case, the game is played over an infinite horizon evolving as follows. At each stage, once the players have observed the state of the game, and before choosing the actions, players 1 and 2 implement a statistical estimation process to obtain estimates of the unknown distribution. Then, independently, the players adapt their decisions to such estimators to select their actions and construct their strategies. This book presents a systematic analysis on recent developments in this kind of games. Specifically, the theoretical foundations on the procedures combining statistical estimation and control techniques for the construction of strategies of the players are introduced, with illustrative examples. In this sense, the book is an essential reference for theoretical and applied researchers in the fields of stochastic control and game theory, and their applications.
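
As a rough illustration of the estimate-then-adapt play described above (a toy sketch of our own, not the book's construction), the Python loop below estimates the unknown disturbance distribution by the empirical distribution of the observed sample and lets each player choose a maximin/minimax action against the estimated one-stage payoff; the payoff function, dynamics, disturbance law, and action sets are all assumed for the example.

# Toy "estimate, then adapt" loop for a zero-sum Markov game with unknown disturbance
# distribution (illustrative model only; payoff, dynamics and action sets are assumed).
import numpy as np

rng = np.random.default_rng(0)
A1, A2 = [0.0, 1.0], [0.0, 1.0]          # finite action sets for players 1 and 2 (assumed)

def payoff(x, a, b, w):
    # One-stage payoff r(x, a, b, w); the functional form is assumed for the example.
    return (x + a - b) * w

def dynamics(x, a, b, w):
    # Stochastic difference equation for the state; the form is assumed for the example.
    return 0.5 * x + a - b + w

x = 1.0
history = [rng.exponential(2.0)]         # disturbances observed so far (i.i.d., law unknown to the players)
for stage in range(50):
    sample = np.array(history)           # empirical estimate of the disturbance distribution

    # Expected one-stage payoff under the empirical distribution, for every action pair.
    G = np.array([[payoff(x, a, b, sample).mean() for b in A2] for a in A1])
    a_star = A1[int(np.argmax(G.min(axis=1)))]   # player 1: maximin against the estimate
    b_star = A2[int(np.argmin(G.max(axis=0)))]   # player 2: minimax against the estimate

    w = rng.exponential(2.0)             # new disturbance, observed after the actions are chosen
    history.append(w)
    x = dynamics(x, a_star, b_star, w)   # the state evolves via the difference equation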

Statistics, Probability, and Game Theory

Author: David Blackwell
Publisher: IMS
ISBN: 9780940600423
Category : Mathematics
Languages : en
Pages : 428

Book Description
Most of the 26 papers are research reports on probability, statistics, gambling, game theory, Markov decision processes, set theory, and logic. But they also include reviews on comparing experiments, games of timing, merging opinions, associated memory models, and SPLIF's; historical views of Carnap, von Mises, and the Berkeley Statistics Department; and a brief history, appreciation, and bibliography of Berkeley professor Blackwell. A sampling of titles turns up The Hamiltonian Cycle Problem and Singularly Perturbed Markov Decision Process, A Pathwise Approach to Dynkin Games, The Redistribution of Velocity: Collision and Transformations, Casino Winnings at Blackjack, and Randomness and the Foundations of Probability. No index. Annotation copyrighted by Book News, Inc., Portland, OR

Methods and Applications of Statistics in Business, Finance, and Management Science

Author: Narayanaswamy Balakrishnan
Publisher: John Wiley & Sons
ISBN: 0470405104
Category : Mathematics
Languages : en
Pages : 735

Book Description
Inspired by the Encyclopedia of Statistical Sciences, Second Edition, this volume presents the tools and techniques that are essential for carrying out best practices in the modern business world. The collection and analysis of quantitative data drives some of the most important conclusions that are drawn in today's business world, such as the preferences of a customer base, the quality of manufactured products, the marketing of products, and the availability of financial resources. As a result, it is essential for individuals working in this environment to have the knowledge and skills to interpret and use statistical techniques in various scenarios. Addressing this need, Methods and Applications of Statistics in Business, Finance, and Management Science serves as a single, one-of-a-kind resource that guides readers through the use of common statistical practices by presenting real-world applications from the fields of business, economics, finance, operations research, and management science. Uniting established literature with the latest research, this volume features classic articles from the acclaimed Encyclopedia of Statistical Sciences, Second Edition along with brand-new contributions written by today's leading academics and practitioners. The result is a compilation that explores classic methodology and new topics, including:

Analytical methods for risk management
Statistical modeling for online auctions
Ranking and selection in mutual funds
Uses of the Black-Scholes formula in finance
Data mining in prediction markets

From auditing and marketing to stock market price indices and banking, the presented literature sheds light on the use of quantitative methods in research relating to common financial applications. In addition, the book supplies insight on common uses of statistical techniques such as Bayesian methods, optimization, simulation, forecasting, mathematical modeling, financial time series, and data mining in modern research. Providing a blend of traditional methodology and the latest research, Methods and Applications of Statistics in Business, Finance, and Management Science is an excellent reference for researchers, managers, consultants, and students in the fields of business, management science, operations research, supply chain management, mathematical finance, and economics who must understand statistical literature and carry out quantitative practices to make smart business decisions in their everyday work.

Advances in Dynamic Games and Applications

Author: Eitan Altmann
Publisher: Springer Science & Business Media
ISBN: 1461201551
Category : Mathematics
Languages : en
Pages : 343

Book Description
Game theory is a rich and active area of research of which this new volume of the Annals of the International Society of Dynamic Games is yet fresh evidence. Since the second half of the 20th century, the area of dynamic games has managed to attract outstanding mathematicians, who found exciting open questions requiring tools from a wide variety of mathematical disciplines; economists, social and political scientists, who used game theory to model and study competition and cooperative behavior; and engineers, who used games in computer sciences, telecommunications, and other areas. The contents of this volume are primarily based on selected presentations made at the 8th International Symposium on Dynamic Games and Applications, held in Chateau Vaalsbroek, Maastricht, the Netherlands, July 5-8, 1998; this conference took place under the auspices of the International Society of Dynamic Games (ISDG), established in 1990. The conference was cosponsored by the Control Systems Society of the IEEE, IFAC (International Federation of Automatic Control), INRIA (Institut National de Recherche en Informatique et Automatique), and the University of Maastricht. One of the activities of the ISDG is the publication of the Annals. Every paper that appears in this volume has passed through a stringent reviewing process, as is the case with publications for archival journals.

Markov Decision Processes with Applications to Finance

Author: Nicole Bäuerle
Publisher: Springer Science & Business Media
ISBN: 3642183247
Category : Mathematics
Languages : en
Pages : 393

Book Description
The theory of Markov decision processes focuses on controlled Markov chains in discrete time. The authors establish the theory for general state and action spaces and at the same time show its application by means of numerous examples, mostly taken from the fields of finance and operations research. By using a structural approach many technicalities (concerning measure theory) are avoided. They cover problems with finite and infinite horizons, as well as partially observable Markov decision processes, piecewise deterministic Markov decision processes and stopping problems. The book presents Markov decision processes in action and includes various state-of-the-art applications with a particular view towards finance. It is useful for upper-level undergraduates, Master's students and researchers in both applied probability and finance, and provides exercises (without solutions).
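
To give one concrete, self-contained instance of the stopping problems mentioned above (a standard textbook construction, not code from this book), the short Python sketch below values an American put by backward induction on a binomial tree; all market parameters are made-up.

# Backward induction for an optimal stopping problem: American put on a binomial tree
# (standard construction; parameters are made-up, not taken from the book).
import numpy as np

S0, K, u, d, R, N = 100.0, 100.0, 1.1, 0.9, 1.02, 50   # toy market data
q = (R - d) / (u - d)                                   # risk-neutral up-move probability

# Terminal payoff at every node of the last stage (j down-moves, j = 0..N).
S_T = S0 * u ** np.arange(N, -1, -1) * d ** np.arange(0, N + 1)
V = np.maximum(K - S_T, 0.0)

for n in range(N - 1, -1, -1):
    S_n = S0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
    continuation = (q * V[:-1] + (1 - q) * V[1:]) / R   # discounted expected value of waiting
    V = np.maximum(K - S_n, continuation)               # stop (exercise) or continue, whichever is larger

print("American put value:", V[0])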