Advances in Stochastic Dynamic Programming for Operations Management
Author: Frank Schneider
Publisher: Logos Verlag Berlin GmbH
ISBN: 3832536337
Category : Business & Economics
Languages : en
Pages : 172
Book Description
Many tasks in operations management require the solution of complex optimization problems. Problems in which decisions are taken sequentially over time can be modeled and solved by dynamic programming. Real-world dynamic programming problems, however, exhibit complexity that cannot be handled by conventional solution techniques. This complexity may stem from large state and solution spaces, huge sets of possible actions, non-convexities in the objective function, and uncertainty. In this book, three highly complex real-world problems from the domain of operations management are modeled and solved by newly developed solution techniques based on stochastic dynamic programming. First, the problem of optimally scheduling participating demand units in an energy transmission network is considered. These units are scheduled such that total cost of supplying demand for electric energy is minimized under uncertainty in demand and generation. Second, the integrated problem of investment in and optimal operations of a network of battery swap stations under uncertain demand and energy prices is modeled and solved. Third, the inventory control problem of a multi-channel retailer selling through independent sales channels is modeled and optimality conditions for replenishment policies of simple structure are proven. This book introduces efficient approximation techniques based on approximate dynamic programming (ADP) and extends existing proximal point algorithms to the stochastic case. The methods are applicable to a wide variety of dynamic programming problems of high dimension.
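For readers new to the formalism, the finite-horizon stochastic dynamic programming recursion that underlies all three applications can be written generically as follows; this is a standard textbook statement and the notation is not taken from the book itself:

V_t(s) = \min_{a \in \mathcal{A}_t(s)} \Big\{ c_t(s,a) + \mathbb{E}\big[ V_{t+1}(S_{t+1}) \mid S_t = s,\, a_t = a \big] \Big\}, \qquad V_T(s) = c_T(s).

In high-dimensional problems neither the expectation nor the minimization over large action sets can be evaluated exactly, which is why approximation techniques such as ADP replace V_{t+1} with a tractable surrogate.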
Stochastic Dynamic Programming and the Control of Queueing Systems
Author: Linn I. Sennott
Publisher: John Wiley & Sons
ISBN: 9780471161202
Category : Mathematics
Languages : en
Pages : 360
Book Description
A compilation of the fundamentals of stochastic dynamic programming (also known as Markov decision processes or Markov chains), with an emphasis on applications in queueing theory. Theoretical and computational aspects are usefully linked; a total of nine numerical programs for queueing control are discussed in detail in the text. Supplementary material is available from the accompanying ftp server. (12/98)
Approximate Dynamic Programming
Author: Warren B. Powell
Publisher: John Wiley & Sons
ISBN: 0470182954
Category : Mathematics
Languages : en
Pages : 487
Book Description
A complete and accessible introduction to the real-world applications of approximate dynamic programming. With the growing levels of sophistication in modern-day operations, it is vital for practitioners to understand how to approach, model, and solve complex industrial problems. Approximate Dynamic Programming is a result of the author's decades of experience working in large industrial settings to develop practical and high-quality solutions to problems that involve making decisions in the presence of uncertainty. This groundbreaking book uniquely integrates four distinct disciplines—Markov decision processes, mathematical programming, simulation, and statistics—to demonstrate how to successfully model and solve a wide range of real-life problems using the techniques of approximate dynamic programming (ADP). The reader is introduced to the three curses of dimensionality that impact complex problems and is also shown how the post-decision state variable allows for the use of classical algorithmic strategies from operations research to treat complex stochastic optimization problems.
Designed as an introduction and assuming no prior training in dynamic programming of any form, Approximate Dynamic Programming contains dozens of algorithms that are intended to serve as a starting point in the design of practical solutions for real problems. The book provides detailed coverage of implementation challenges, including modeling complex sequential decision processes under uncertainty, identifying robust policies, designing and estimating value function approximations, choosing effective stepsize rules, and resolving convergence issues. With a focus on modeling and algorithms in conjunction with the language of mainstream operations research, artificial intelligence, and control theory, Approximate Dynamic Programming:
* Models complex, high-dimensional problems in a natural and practical way, which draws on years of industrial projects
* Introduces and emphasizes the power of estimating a value function around the post-decision state, allowing solution algorithms to be broken down into three fundamental steps: classical simulation, classical optimization, and classical statistics
* Presents a thorough discussion of recursive estimation, including fundamental theory and a number of issues that arise in the development of practical algorithms
* Offers a variety of methods for approximating dynamic programs that have appeared in previous literature but have never been presented in the coherent format of a book
Motivated by examples from modern-day operations research, Approximate Dynamic Programming is an accessible introduction to dynamic modeling and is also a valuable guide for the development of high-quality solutions to problems that exist in operations research and engineering. The clear and precise presentation of the material makes this an appropriate text for advanced undergraduate and beginning graduate courses, while also serving as a reference for researchers and practitioners. A companion Web site is available for readers, which includes additional exercises, solutions to exercises, and data sets to reinforce the book's main concepts.
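As a rough illustration of the post-decision-state idea described above, the following minimal Python sketch manages a small energy-storage device over a day with random prices. The toy model, the lookup-table value approximation, the epsilon-greedy exploration, and the harmonic stepsize are all assumptions made for this sketch; none of it is taken from the book.

import random

T = 24                 # decision epochs (e.g., hours in a day)
R_MAX = 10             # storage capacity in discrete units
ITERS = 2000           # forward simulation passes
EPS = 0.2              # exploration probability (epsilon-greedy)

# V[t][r] approximates the value of holding r units right after the decision
# at epoch t (the post-decision state); N counts visits for the stepsize rule.
V = [[0.0] * (R_MAX + 1) for _ in range(T)]
N = [[0] * (R_MAX + 1) for _ in range(T)]

def random_price():
    return random.uniform(1.0, 5.0)     # assumed i.i.d. prices, purely illustrative

for _ in range(ITERS):
    r = 0                               # storage level before the first decision
    prev_post = None                    # post-decision state visited at the previous epoch
    for t in range(T):
        price = random_price()          # exogenous information observed before deciding
        # Classical optimization: immediate profit of buying (a > 0) or
        # selling (a < 0) plus the current post-decision value estimate.
        best_a, best_val = 0, float("-inf")
        for a in range(-r, R_MAX - r + 1):
            val = -price * a + V[t][r + a]
            if val > best_val:
                best_a, best_val = a, val
        # Classical statistics: smooth the greedy sampled value into the
        # approximation at the previous post-decision state, which it
        # estimates in expectation.
        if prev_post is not None:
            tp, rp = prev_post
            N[tp][rp] += 1
            alpha = 1.0 / N[tp][rp]     # harmonic stepsize 1/n
            V[tp][rp] = (1 - alpha) * V[tp][rp] + alpha * best_val
        # Follow an epsilon-greedy action so that other storage levels are visited.
        a = best_a if random.random() > EPS else random.randint(-r, R_MAX - r)
        r += a                          # move to the post-decision storage level
        prev_post = (t, r)

print("Learned post-decision values at the first epoch:",
      [round(v, 1) for v in V[0]])

Each forward pass mirrors the three steps named in the blurb: simulate the exogenous prices, optimize the decision against the current approximation, and statistically smooth the observed value into the lookup table. A practical implementation would use a more careful stepsize rule and a richer value function approximation than this lookup table.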
Stochastic Dynamic Programming and the Control of Queueing Systems
Author: Linn I. Sennott
Publisher: John Wiley & Sons
ISBN: 0470317876
Category : Mathematics
Languages : en
Pages : 355
Book Description
A path-breaking account of Markov decision processes: theory and computation. This book's clear presentation of theory, numerous chapter-end problems, and development of a unified method for the computation of optimal policies in both discrete and continuous time make it an excellent course text for graduate students and advanced undergraduates. Its comprehensive coverage of important recent advances in stochastic dynamic programming makes it a valuable working resource for operations research professionals, management scientists, engineers, and others.
Stochastic Dynamic Programming and the Control of Queueing Systems presents the theory of optimization under the finite horizon, infinite horizon discounted, and average cost criteria. It then shows how optimal rules of operation (policies) for each criterion may be numerically determined. A great wealth of examples from the application area of the control of queueing systems is presented. Nine numerical programs for the computation of optimal policies are fully explicated. The Pascal source code for the programs is available for viewing and downloading on the Wiley Web site at www.wiley.com/products/subject/mathematics. The site contains a link to the author's own Web site and is also a place where readers may discuss developments on the programs or other aspects of the material. The source files are also available via ftp at ftp://ftp.wiley.com/public/sci_tech_med/stochastic
Stochastic Dynamic Programming and the Control of Queueing Systems features:
* Path-breaking advances in Markov decision process techniques, brought together for the first time in book form
* A theorem/proof format (proofs may be omitted without loss of continuity)
* Development of a unified method for the computation of optimal rules of system operation
* Numerous examples drawn mainly from the control of queueing systems
* Detailed discussions of nine numerical programs
* Helpful chapter-end problems
* Appendices with complete treatment of background material
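To make the computational side concrete, here is a minimal value-iteration sketch for a toy discounted-cost admission-control queue. The book's own programs are written in Pascal; this discrete-time model, its parameters, and its cost structure are assumptions made purely for illustration and are not the book's models.

K = 30              # truncation level for the queue length
p, q = 0.4, 0.5     # per-period arrival and service-completion probabilities
h, R = 1.0, 5.0     # holding cost per customer per period, rejection penalty
beta = 0.95         # discount factor
tol = 1e-8

def transitions(x, admit):
    """Return (probability, next_state) pairs for one period."""
    pairs = []
    for arr, pa in ((1, p), (0, 1 - p)):
        for dep, pd in ((1, q), (0, 1 - q)):
            dep_eff = dep if x >= 1 else 0      # no departure from an empty system
            arr_eff = arr if admit else 0       # blocked/rejected arrivals are lost
            pairs.append((pa * pd, x - dep_eff + arr_eff))
    return pairs

def expected_cost(x, admit):
    lost = 0.0 if admit else p                  # expected number of lost arrivals
    return h * x + R * lost

V = [0.0] * (K + 1)
while True:
    V_new, policy = [], []
    for x in range(K + 1):
        best_val, best_u = None, None
        for u in (0, 1):                        # 0 = reject arrivals this period, 1 = admit
            admit = (u == 1 and x < K)          # no admission when the buffer is full
            val = expected_cost(x, admit) + beta * sum(
                pr * V[y] for pr, y in transitions(x, admit))
            if best_val is None or val < best_val:
                best_val, best_u = val, u
        V_new.append(best_val)
        policy.append(best_u)
    if max(abs(a - b) for a, b in zip(V, V_new)) < tol:
        break
    V = V_new

threshold = next((x for x in range(K + 1) if policy[x] == 0), K)
print("Optimal toy policy: admit arrivals while fewer than",
      threshold, "customers are present.")

Under the discounted criterion, iterating the optimality recursion to convergence yields a control-limit (threshold) policy for this parameter choice; the finite-horizon and average-cost criteria treated in the book call for variants of the same recursion.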
Dynamic Management Decision And Stochastic Control Processes
Author: Toshio Odanaka
Publisher: World Scientific
ISBN: 9814507121
Category : Technology & Engineering
Languages : en
Pages : 236
Book Description
This book treats stochastic control theory and its applications in management. The main numerical techniques necessary for such applications are presented, and several advanced topics leading to optimal processes are discussed. The book also considers the theory of some stochastic control processes and several applications to illustrate the ideas.
Advanced Modelling and Innovations in Water Resources Engineering
Author: Chintalacheruvu Madhusudana Rao
Publisher: Springer Nature
ISBN: 9811646295
Category : Science
Languages : en
Pages : 772
Book Description
This book presents select proceedings of the national conference on Advanced Modelling and Innovations in Water Resources Engineering (AMIWRE 2021) and examines numerous advancements in the field of water resources engineering and management toward the sustainable development of the environment. The topics covered include river basin planning and development, reservoir planning and management, integrated water management, reservoir sedimentation, soil erosion and sedimentation, agricultural technologies for climate change mitigation, uncertainty analysis in hydrology, water distribution networks, flood and drought management, water quality modelling, environmental modelling, environmental impact assessment, urban water management, open channel hydraulics, hydraulic structures, groundwater hydraulics, groundwater flow and contaminant transport modelling, computational fluid dynamics, ocean engineering, applications of the HEC-RAS, SWAT, MIKE and MODFLOW models, numerical analysis in water resources engineering, climate change impacts on hydrology, optimization techniques in water resources, soft computing techniques and applications in water resources, and remote sensing/geospatial techniques in water resources. The book will be beneficial for water-sector development, mainly agricultural production, reservoir operations, improvement of water quality, flood and drought control, design of hydraulic structures, and geospatial analysis. It will be a valuable reference for faculty, research scholars, students, design engineers, industrialists, R & D personnel, and practitioners working in water resources engineering and its related fields.
Markov Decision Processes
Author: Martin L. Puterman
Publisher: John Wiley & Sons
ISBN: 1118625870
Category : Mathematics
Languages : en
Pages : 544
Book Description
The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists.
"This text is unique in bringing together so many results hitherto found only in part in other texts and papers. . . . The text is fairly self-contained, inclusive of some basic mathematical results needed, and provides a rich diet of examples, applications, and exercises. The bibliographical material at the end of each chapter is excellent, not only from a historical perspective, but because it is valuable for researchers in acquiring a good perspective of the MDP research potential." —Zentralblatt für Mathematik
". . . it is of great value to advanced-level students, researchers, and professional practitioners of this field to have now a complete volume (with more than 600 pages) devoted to this topic. . . . Markov Decision Processes: Discrete Stochastic Dynamic Programming represents an up-to-date, unified, and rigorous treatment of theoretical and computational aspects of discrete-time Markov decision processes." —Journal of the American Statistical Association
Dynamic Optimization, Second Edition
Author: Morton I. Kamien
Publisher: Courier Corporation
ISBN: 0486310280
Category : Mathematics
Languages : en
Pages : 402
Book Description
Since its initial publication, this text has defined courses in dynamic optimization taught to economics and management science students. The two-part treatment covers the calculus of variations and optimal control. 1998 edition.
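As a brief orientation to what the two parts cover, the prototypical calculus-of-variations problem and its first-order condition can be written as follows (standard results quoted for orientation, not excerpted from the text):

\max_{x(\cdot)} \int_{t_0}^{t_1} f\big(t, x(t), \dot{x}(t)\big)\, dt \quad \text{with fixed endpoints}, \qquad \text{Euler equation: } f_x - \frac{d}{dt} f_{\dot{x}} = 0.

The optimal control part replaces this with the maximization of a Hamiltonian H(t,x,u,\lambda) = f(t,x,u) + \lambda\, g(t,x,u) along the state equation \dot{x} = g(t,x,u) and the costate equation \dot{\lambda} = -\partial H / \partial x.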
Advances in Multiple Objective and Goal Programming
Author: Rafael Caballero
Publisher: Springer Science & Business Media
ISBN: 3642468543
Category : Business & Economics
Languages : en
Pages : 396
Book Description
Within the field of multiple criteria decision making, this volume covers the latest advances in multiple objective and goal programming as presented at the 2nd International Conference on Multi-Objective Programming and Goal Programming, Torremolinos, Spain, May 16-18, 1996. The book is an indispensable source of the latest research results, presented by leading experts in the field.
Handbook of Research on Advanced Data Mining Techniques and Applications for Business Intelligence
Author: Trivedi, Shrawan Kumar
Publisher: IGI Global
ISBN: 1522520325
Category : Computers
Languages : en
Pages : 465
Book Description
The development of business intelligence has enhanced the visualization of data to inform and facilitate business management and strategizing. Implementing effective data-driven techniques allows advanced reporting tools to cater to company-specific issues and challenges. The Handbook of Research on Advanced Data Mining Techniques and Applications for Business Intelligence is a key resource on the latest advancements in business applications and the use of mining software solutions to achieve optimal decision-making and risk management results. Highlighting innovative studies on data warehousing, business activity monitoring, and text mining, this publication is an ideal reference source for research scholars, management faculty, and practitioners.