Gradient Methods for Large-scale Nonlinear Optimization

Author: Hongchao Zhang
Publisher:
ISBN:
Category :
Languages : en
Pages :

Book Description
Finally, we propose a class of self-adaptive proximal point methods suitable for degenerate optimization problems where multiple minimizers may exist, or where the Hessian may be singular at a local minimizer. Two different acceptance criteria for an approximate solution to the proximal problem are analyzed, and the resulting convergence rates are analogous to those of the exact iterates. The second part of this dissertation discusses using gradient methods to solve large-scale box-constrained optimization problems. We first discuss gradient projection methods. Then, an active set algorithm (ASA) for box-constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. Under the strong second-order sufficient optimality condition, without assuming strict complementarity, the algorithm eventually reduces to unconstrained optimization without restarts. For strongly convex quadratic box-constrained optimization, ASA is shown to converge in finitely many iterations when a conjugate gradient method is used in the unconstrained optimization step. A specific implementation of ASA is given, which exploits the cyclic Barzilai-Borwein algorithm for the gradient projection step and CG_DESCENT for unconstrained optimization. Numerical experiments on the box-constrained problems in the CUTEr and MINPACK test problem libraries show that this new algorithm outperforms benchmark solvers such as GENCAN, L-BFGS-B, and TRON.
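As a rough sketch of the gradient projection phase described above, the code below runs a projected gradient iteration with a safeguarded Barzilai-Borwein step size on a box-constrained problem. It assumes a smooth objective with gradient grad_f and bounds lo, hi; the nonmonotone line search, the cyclic BB rule, and the branching to the unconstrained CG_DESCENT phase that define the full ASA algorithm are omitted, and the helper names (project_box, projected_bb_gradient) are illustrative, not the authors' implementation.

import numpy as np

def project_box(x, lo, hi):
    # Project x onto the box {lo <= x <= hi} componentwise.
    return np.minimum(np.maximum(x, lo), hi)

def projected_bb_gradient(grad_f, x0, lo, hi, max_iter=500, tol=1e-6):
    # Projected gradient iteration with a safeguarded Barzilai-Borwein (BB1) step size.
    x = project_box(np.asarray(x0, dtype=float), lo, hi)
    g = grad_f(x)
    alpha = 1.0
    for _ in range(max_iter):
        x_new = project_box(x - alpha * g, lo, hi)
        if np.linalg.norm(x_new - x) <= tol:  # projected step barely moves: stop
            break
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        # BB1 step size s's / s'y, kept positive and bounded away from 0 and infinity.
        alpha = min(max(float(s @ s) / sy, 1e-10), 1e10) if sy > 1e-12 else 1.0
        x, g = x_new, g_new
    return x

# Usage on a hypothetical strongly convex quadratic restricted to the box [0, 1]^n.
n = 50
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite Hessian
b = rng.standard_normal(n)
x_best = projected_bb_gradient(lambda x: A @ x - b, np.zeros(n), 0.0, 1.0, max_iter=2000)
print(0.5 * x_best @ A @ x_best - b @ x_best)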

Large-Scale Nonlinear Optimization

Author: Gianni Pillo
Publisher: Springer Science & Business Media
ISBN: 0387300651
Category : Mathematics
Languages : en
Pages : 297

Book Description
This book reviews and discusses recent advances in the development of methods and algorithms for nonlinear optimization and its applications, focusing on the large-dimensional case, which is at the forefront of much current research. Individual chapters, contributed by eminent authorities, provide an up-to-date overview of the field from different and complementary standpoints, including theoretical analysis, algorithmic development, implementation issues, and applications.

Nonlinear Conjugate Gradient Methods for Unconstrained Optimization

Author: Neculai Andrei
Publisher: Springer Nature
ISBN: 3030429504
Category : Mathematics
Languages : en
Pages : 515

Book Description
Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid methods, modifications of the standard scheme, memoryless BFGS-preconditioned methods, and three-term methods. Conjugate gradient methods that cluster the eigenvalues or minimize the condition number of the iteration matrix are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms is developed with a clear, rigorous, and friendly exposition; readers will gain an understanding of the properties and convergence of these methods and will learn to develop and prove the convergence of their own methods. Numerous numerical studies are supplied, with comparisons and comments on the behavior of conjugate gradient algorithms for solving a collection of 800 unconstrained optimization problems of different structures and complexities, with the number of variables in the range [1000, 10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving complex unconstrained optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering, industry researchers, and graduate students in mathematics, as well as Ph.D. and master's students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems by conjugate gradient methods.
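As a rough illustration of a standard nonlinear conjugate gradient scheme of the kind catalogued in the book, the sketch below implements a Polak-Ribiere+ iteration with a simple backtracking Armijo line search. It is a minimal sketch under assumed inputs (objective f, gradient grad_f, starting point x0), not code from the book; production codes use Wolfe-type line searches and further safeguards.

import numpy as np

def prplus_cg(f, grad_f, x0, max_iter=1000, tol=1e-6):
    # Polak-Ribiere+ nonlinear conjugate gradient with backtracking Armijo line search.
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        gTd = float(g @ d)
        if gTd >= 0:                      # safeguard: restart with steepest descent
            d = -g
            gTd = -float(g @ g)
        # Backtrack until the Armijo sufficient-decrease condition holds.
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * gTd and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad_f(x_new)
        # PR+ coefficient: max(g_new'(g_new - g) / g'g, 0); zero means a restart.
        beta = max(float(g_new @ (g_new - g)) / float(g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on the 2-D Rosenbrock function, a standard unconstrained test problem.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                                 200 * (x[1] - x[0]**2)])
print(prplus_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))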

Conjugate gradient methods for large scale nonlinear optimization

Author: Stanford University, Systems Optimization Laboratory
Publisher:
ISBN:
Category : Algorithms
Languages : en
Pages : 64

Book Description
In this paper we discuss several recent conjugate-gradient type methods for solving large-scale nonlinear optimization problems. We demonstrate how the performance of these methods can be significantly improved by careful implementation. A method based on iterative preconditioning is suggested which performs reasonably efficiently on a wide variety of significant test problems. Our results indicate that nonlinear conjugate-gradient methods behave in a similar way to conjugate-gradient methods for the solution of systems of linear equations. These methods work best on problems whose Hessian matrices have sets of clustered eigenvalues. On more general problems, however, even the best method may require a prohibitively large number of iterations. We present numerical evidence indicating that the use of theoretical analysis to predict the performance of algorithms on general problems is not straightforward.
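The paper's central observation, that conjugate gradient iterations do best when the Hessian eigenvalues fall into a few clusters, can be illustrated directly on a linear system. The sketch below uses hypothetical data, not the paper's test problems: it counts plain CG iterations on a 300-dimensional diagonal positive definite system whose spectrum forms three tight clusters, and the iteration count stays far below the dimension.

import numpy as np

def linear_cg(A, b, tol=1e-8):
    # Plain conjugate gradient for A x = b; returns the solution and the iteration count.
    n = b.size
    x = np.zeros(n)
    r = b.copy()              # residual b - A x for the starting point x = 0
    p = r.copy()
    rs = float(r @ r)
    for k in range(n):
        Ap = A @ p
        alpha = rs / float(p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = float(r @ r)
        if np.sqrt(rs_new) <= tol:
            return x, k + 1
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, n

# Hypothetical spectrum: 300 eigenvalues in three tight clusters near 1, 10, and 100.
rng = np.random.default_rng(0)
eigs = np.concatenate([1.0 + 1e-3 * rng.standard_normal(100),
                       10.0 + 1e-3 * rng.standard_normal(100),
                       100.0 + 1e-3 * rng.standard_normal(100)])
A = np.diag(eigs)                       # diagonal SPD matrix with a clustered spectrum
b = rng.standard_normal(eigs.size)
_, iters = linear_cg(A, b)
print(iters)   # far fewer iterations than the dimension 300, reflecting the three clusters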

Nonlinear Conjugate Gradient Methods for Unconstrained Optimization

Author: Neculai Andrei
Publisher: Springer
ISBN: 9783030429492
Category : Mathematics
Languages : en
Pages : 486

Book Description
Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid methods, modifications of the standard scheme, memoryless BFGS-preconditioned methods, and three-term methods. Conjugate gradient methods that cluster the eigenvalues or minimize the condition number of the iteration matrix are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms is developed with a clear, rigorous, and friendly exposition; readers will gain an understanding of the properties and convergence of these methods and will learn to develop and prove the convergence of their own methods. Numerous numerical studies are supplied, with comparisons and comments on the behavior of conjugate gradient algorithms for solving a collection of 800 unconstrained optimization problems of different structures and complexities, with the number of variables in the range [1000, 10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving complex unconstrained optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering, industry researchers, and graduate students in mathematics, as well as Ph.D. and master's students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems by conjugate gradient methods.
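Among the listed topics, memoryless BFGS preconditioning admits a particularly compact form: the plain conjugate gradient direction is replaced by -H g, where H is a BFGS update of the identity built from the most recent step s and gradient change y. The sketch below is not taken from the book and uses made-up test vectors; it computes that direction with dot products only and checks it against the explicit matrix formula.

import numpy as np

def memoryless_bfgs_direction(g, s, y):
    # Direction d = -H g, where H is the BFGS update of the identity matrix built
    # from the latest step s and gradient change y; only dot products are formed.
    sy = float(s @ y)
    if sy <= 1e-12:
        return -g                       # curvature safeguard: fall back to steepest descent
    rho = 1.0 / sy
    sg, yg, yy = float(s @ g), float(y @ g), float(y @ y)
    Hg = g - rho * sg * y - rho * yg * s + rho * sg * (rho * yy + 1.0) * s
    return -Hg

# Check against the explicit matrix H on made-up vectors with positive curvature s'y > 0.
s = np.array([1.0, 0.5, -0.2, 0.3, 0.1])
y = np.array([0.8, 0.6, -0.1, 0.2, 0.3])
g = np.array([0.4, -1.0, 0.7, 0.2, -0.3])
rho = 1.0 / float(s @ y)
I = np.eye(5)
H = (I - rho * np.outer(s, y)) @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
print(np.allclose(memoryless_bfgs_direction(g, s, y), -H @ g))   # expected: True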

Large-scale Numerical Optimization

Author: Thomas Frederick Coleman
Publisher: SIAM
ISBN: 9780898712681
Category : Mathematics
Languages : en
Pages : 278

Book Description
Papers from a workshop held at Cornell University, Oct. 1989, and sponsored by Cornell's Mathematical Sciences Institute.

Encyclopedia of Optimization

Author: Christodoulos A. Floudas
Publisher: Springer Science & Business Media
ISBN: 0387747583
Category : Mathematics
Languages : en
Pages : 4646

Book Description
The goal of the Encyclopedia of Optimization is to introduce the reader to a complete set of topics that show the spectrum of research, the richness of ideas, and the breadth of applications that have come from this field. The second edition builds on the success of the first edition with more than 150 completely new entries, designed to ensure that the reference addresses recent areas where optimization theories and techniques have advanced. Particular attention is given to health science and transportation, with entries such as "Algorithms for Genomics", "Optimization and Radiotherapy Treatment Design", and "Crew Scheduling".

Lancelot

Author: A.R. Conn
Publisher: Springer Science & Business Media
ISBN: 3662122111
Category : Computers
Languages : en
Pages : 347

Book Description
LANCELOT is a software package for solving large-scale nonlinear optimization problems. This book is our attempt to provide a coherent overview of the package and its use. This includes details of how one might present examples to the package, how the algorithm tries to solve these examples, and various technical issues which may be useful to implementors of the software. We hope this book will be of use to both researchers and practitioners in nonlinear programming. Although the book is primarily concerned with a specific optimization package, the issues discussed have much wider implications for the design and implementation of large-scale optimization algorithms. In particular, the book contains a proposal for a standard input format for large-scale optimization problems. This proposal is at the heart of the interface between a user's problem and the LANCELOT optimization package. Furthermore, a large collection of over five hundred test examples has already been written in this format and will shortly be available to those who wish to use them. We would like to thank the many people and organizations who supported us in our enterprise. We first acknowledge the support provided by our employers, namely the Facultés Universitaires Notre-Dame de la Paix (Namur, Belgium), Harwell Laboratory (UK), IBM Corporation (USA), Rutherford Appleton Laboratory (UK) and the University of Waterloo (Canada). We are grateful for the support we obtained from NSERC (Canada), NATO and AMOCO (UK).

Large-scale Nonlinear Programming Using the Generalized Reduced Gradient Method

Author: Gary Anthony Gabriele
Publisher:
ISBN:
Category : Nonlinear programming
Languages : en
Pages : 108

Book Description


Conjugate Gradient Algorithms in Nonconvex Optimization

Author: Radoslaw Pytlak
Publisher: Springer Science & Business Media
ISBN: 354085634X
Category : Mathematics
Languages : en
Pages : 493

Book Description
This book details algorithms for large-scale unconstrained and bound-constrained optimization. It presents optimization techniques from the perspective of conjugate gradient algorithms, as well as the methods of shortest residuals developed by the author.