Convex Optimization with Computational Errors

Author: Alexander J. Zaslavski
Publisher: Springer Nature
ISBN: 3030378225
Category: Mathematics
Languages: en
Pages: 364

Book Description
The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are important tools for solving optimization problems. The research presented here continues and further develops the author's 2016 book Numerical Optimization with Computational Errors (Springer, 2016). Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is, for a known computational error, to determine what approximate solution can be obtained and how many iterations are needed to reach it. The main difference between this book and the 2016 book is that here the discussion takes into account the fact that, for every algorithm, each iteration consists of several steps and that the computational errors of the different steps are, in general, different. This fact, which was not considered in the previous book, is important in practice. For example, the subgradient projection algorithm consists of two steps: the first is the calculation of a subgradient of the objective function, while in the second one calculates a projection onto the feasible set. Each of these two steps carries a computational error, and the two errors are different in general. It may happen that the feasible set is simple while the objective function is complicated; as a result, the error made in calculating the projection is essentially smaller than the error made in calculating the subgradient. Clearly, the opposite case is possible too. Another feature of this book is the study of a number of important algorithms which appeared recently in the literature and which are not discussed in the previous book.

The monograph contains 12 chapters. Chapter 1 is an introduction. In Chapter 2 we study the subgradient projection algorithm for the minimization of convex and nonsmooth functions; we generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 3 we analyze the mirror descent algorithm for the minimization of convex and nonsmooth functions in the presence of computational errors. For this algorithm each iteration consists of two steps: the first is the calculation of a subgradient of the objective function, while in the second one solves an auxiliary minimization problem on the set of feasible points; each of these two steps carries a computational error. Here too we generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function in the presence of computational errors. In Chapter 5 we consider an algorithm which extends the projected gradient algorithm used for solving linear inverse problems arising in signal/image processing. In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for the minimization of convex nonsmooth functions and for computing the saddle points of convex-concave functions, in the presence of computational errors; none of the results of this chapter has a prototype in [NOCE]. In Chapters 7-12 we analyze several algorithms, not considered in [NOCE], in the presence of computational errors. Again, each step of an iteration carries a computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A zero-sum game with two players is considered in Chapter 8. A method based on predicted decrease approximation is used in Chapter 9 for constrained convex optimization. Chapter 10 is devoted to the minimization of quasiconvex functions. The minimization of sharp weakly convex functions is discussed in Chapter 11. Chapter 12 is devoted to a generalized projected subgradient method for the minimization of a convex function over a set which is not necessarily convex.

The book is of interest to researchers and engineers working in optimization, and it can also be useful in preparatory courses for graduate students. The main feature that appeals specifically to this audience is the study of the influence of computational errors on several important optimization algorithms. The book is also of interest to experts in applications of optimization to engineering and economics.
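To make the role of the two error sources concrete, here is a minimal Python sketch of a subgradient projection iteration in which the subgradient step and the projection step each carry their own bounded perturbation. It is an illustration only, not the book's formulation: the oracle names, the error model (a random perturbation of prescribed norm), and the toy l1-over-a-ball problem are assumptions made for the example.

import numpy as np

def random_unit_vector(n, rng):
    v = rng.standard_normal(n)
    return v / np.linalg.norm(v)

def inexact_subgradient_projection(subgrad, project, x0, step_sizes,
                                   subgrad_err=0.0, proj_err=0.0, rng=None):
    # subgrad(x): a subgradient of the objective at x      (step 1 of the iteration)
    # project(y): the projection of y onto the feasible set (step 2 of the iteration)
    # subgrad_err, proj_err: bounds on the perturbations added in steps 1 and 2;
    # they are allowed to be different, as discussed above.
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for t in step_sizes:
        g = subgrad(x) + subgrad_err * random_unit_vector(x.size, rng)       # inexact step 1
        x = project(x - t * g) + proj_err * random_unit_vector(x.size, rng)  # inexact step 2
    return x

# Toy usage: minimize f(x) = ||x - c||_1 over the unit ball, a "simple" feasible set,
# so the projection error is kept much smaller than the subgradient error.
c = np.array([2.0, -1.0])
subgrad = lambda x: np.sign(x - c)                    # a subgradient of the l1 objective
project = lambda y: y / max(1.0, np.linalg.norm(y))   # exact projection onto the unit ball
x_approx = inexact_subgradient_projection(subgrad, project, np.zeros(2),
                                          step_sizes=[0.1] * 200,
                                          subgrad_err=1e-2, proj_err=1e-4)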

Numerical Optimization

Author: Jorge Nocedal
Publisher: Springer Science & Business Media
ISBN: 0387227423
Category: Mathematics
Languages: en
Pages: 651

Book Description
The new edition of this book presents a comprehensive and up-to-date description of the most effective methods in continuous optimization. It responds to the growing interest in optimization in engineering, science, and business by focusing on methods best suited to practical problems. This edition has been thoroughly updated throughout. There are new chapters on nonlinear interior methods and derivative-free methods for optimization, both of which are widely used in practice and are the focus of much current research. Because of the emphasis on practical methods, as well as the extensive illustrations and exercises, the book is accessible to a wide audience.

Numerical Methods and Optimization

Author: Sergiy Butenko
Publisher: CRC Press
ISBN: 1466577789
Category: Business & Economics
Languages: en
Pages: 408

Book Description
To understand optimization at an advanced level, students in industrial and systems engineering (ISE) and operations research (OR) must first grasp the analysis of algorithms, computational complexity, and other concepts and modern developments in numerical methods. Numerical Methods and Optimization: An Introduction is written to satisfy this prerequisite.

Numerical Optimization

Author: Jorge Nocedal
Publisher: Springer Science & Business Media
ISBN: 0387400656
Category: Mathematics
Languages: en
Pages: 686

Book Description
Optimization is an important tool in decision science and in the analysis of the physical systems studied in engineering. One can trace its roots to the Calculus of Variations and the work of Euler and Lagrange. This natural and reasonable approach to mathematical programming covers numerical methods for finite-dimensional optimization problems. The book begins with very simple ideas and progresses through more complicated concepts, concentrating on methods for both unconstrained and constrained optimization.

Computational Methods for Inverse Problems

Author: Curtis R. Vogel
Publisher: SIAM
ISBN: 0898717574
Category: Mathematics
Languages: en
Pages: 195

Book Description
Provides a basic understanding of both the underlying mathematics and the computational methods used to solve inverse problems.

Numerical Methods for Unconstrained Optimization and Nonlinear Equations

Author: J. E. Dennis, Jr.
Publisher: SIAM
ISBN: 9781611971200
Category: Mathematics
Languages: en
Pages: 394

Book Description
This book has become the standard for a complete, state-of-the-art description of the methods for unconstrained optimization and systems of nonlinear equations. Originally published in 1983, it provides information needed to understand both the theory and the practice of these methods and provides pseudocode for the problems. The algorithms covered are all based on Newton's method or "quasi-Newton" methods, and the heart of the book is the material on computational methods for multidimensional unconstrained optimization and nonlinear equation problems. The republication of this book by SIAM is driven by a continuing demand for specific and sound advice on how to solve real problems. The level of presentation is consistent throughout, with a good mix of examples and theory, making it a valuable text at both the graduate and undergraduate level. It has been praised as excellent for courses with approximately the same name as the book title and would also be useful as a supplemental text for a nonlinear programming or a numerical analysis course. Many exercises are provided to illustrate and develop the ideas in the text. A large appendix provides a mechanism for class projects and a reference for readers who want the details of the algorithms. Practitioners may use this book for self-study and reference. For complete understanding, readers should have a background in calculus and linear algebra. The book does contain background material in multivariable calculus and numerical linear algebra.
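As a minimal illustration of the Newton-type iterations that form the heart of the book, the following Python sketch applies Newton's method to a square nonlinear system F(x) = 0 with an analytic Jacobian; the concrete system and starting point are assumptions chosen only for the example, and safeguards such as line searches or trust regions, which the book treats in detail, are omitted.

import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    # Newton's method for F(x) = 0: at each step solve J(x) dx = -F(x) and set x <- x + dx.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        x = x + np.linalg.solve(J(x), -r)
    return x

# Example: intersect the circle x^2 + y^2 = 4 with the curve y = exp(x) - 1.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, np.exp(v[0]) - 1.0 - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]],
                        [np.exp(v[0]), -1.0]])
root = newton_system(F, J, x0=[1.0, 1.0])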

Numerical Methods and Optimization in Finance

Author: Manfred Gilli
Publisher: Academic Press
ISBN: 0128150653
Category: Business & Economics
Languages: en
Pages: 638

Book Description
Computationally intensive tools play an increasingly important role in financial decisions. Many financial problems, ranging from asset allocation to risk management and from option pricing to model calibration, can be handled efficiently using modern computational techniques. Numerical Methods and Optimization in Finance presents such computational techniques, with an emphasis on simulation and optimization, particularly so-called heuristics. The book treats quantitative analysis as an essentially computational discipline in which applications are put into software form and tested empirically. This revised edition includes two new chapters: a self-contained tutorial on implementing and using heuristics, and an explanation of the software used for testing portfolio-selection models. Postgraduate students, researchers in programs on quantitative and computational finance, and practitioners in banks and other financial companies can all benefit from this second edition of Numerical Methods and Optimization in Finance.

The Projected Subgradient Algorithm in Convex Optimization

Author: Alexander J. Zaslavski
Publisher: Springer Nature
ISBN: 3030603008
Category: Mathematics
Languages: en
Pages: 148

Book Description
This focused monograph presents a study of subgradient algorithms for constrained minimization problems in a Hilbert space; it is of interest to experts in applications of optimization to engineering and economics. The goal is to obtain a good approximate solution of the problem in the presence of computational errors. The discussion takes into account the fact that, for every algorithm, each iteration consists of several steps and that the computational errors of the different steps are, in general, different. The book is especially useful because it contains solutions to a number of difficult and interesting problems in numerical optimization. The subgradient projection algorithm is one of the most important tools in optimization theory and its applications. An optimization problem is described by an objective function and a set of feasible points, and for this algorithm each iteration consists of two steps: the first requires the calculation of a subgradient of the objective function, the second the calculation of a projection onto the feasible set. The computational errors of these two steps are, in general, different. The book shows that the algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant; moreover, if the computational errors of the two steps are known, one can determine what approximate solution can be obtained and how many iterations are needed to reach it. In addition to their mathematical interest, the generalizations considered in this book have significant practical meaning.
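To indicate how the two error bounds enter results of this kind, here is the flavor of a standard one-step estimate for an inexact projected subgradient step; it is a sketch under simple assumptions, not the book's exact statement. Suppose $x_{k+1}$ is obtained from $x_k$ with step size $t > 0$, the vector $g_k$ satisfies $\|g_k - \xi_k\| \le \delta_1$ for some subgradient $\xi_k \in \partial f(x_k)$, and the projection onto the feasible set $C$ is computed with error at most $\delta_2$, that is, $\|x_{k+1} - P_C(x_k - t g_k)\| \le \delta_2$. Then, for every $z \in C$,

\[ \|x_{k+1} - z\| \le \|x_k - t g_k - z\| + \delta_2, \]
\[ \|x_k - t g_k - z\|^2 \le \|x_k - z\|^2 - 2t\,\bigl(f(x_k) - f(z)\bigr) + 2t\,\delta_1\,\|x_k - z\| + t^2\,\|g_k\|^2. \]

Summing estimates of this type over the iterations is what leads to conclusions of the kind described above: if $\delta_1$ and $\delta_2$ are small enough, a prescribed number of iterations produces a point whose objective value exceeds the minimal value over $C$ by an amount controlled by the step size and the two error bounds.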

Numerical Algorithms

Author: Justin Solomon
Publisher: CRC Press
ISBN: 1482251892
Category: Computers
Languages: en
Pages: 400

Book Description
Numerical Algorithms: Methods for Computer Vision, Machine Learning, and Graphics presents a new approach to numerical analysis for modern computer scientists. Using examples from a broad base of computational tasks, including data processing, computational photography, and animation, the textbook introduces numerical modeling and algorithmic design.

Mathematical Theory of Optimization

Author: Ding-Zhu Du
Publisher: Springer Science & Business Media
ISBN: 1475757956
Category: Mathematics
Languages: en
Pages: 277

Book Description
This book provides an introduction to the mathematical theory of optimization. It emphasizes the convergence theory of nonlinear optimization algorithms and applications of nonlinear optimization to combinatorial optimization. Mathematical Theory of Optimization includes recent developments in global convergence, the Powell conjecture, semidefinite programming, and relaxation techniques for the design of approximate solutions of combinatorial optimization problems.