Improved Scaled and Shifted Conjugate Gradient Methods for Large-scale Unconstrained Optimization

Author: Amal Ahmed Al-Saidiyah
Publisher:
ISBN:
Category :
Languages : en
Pages : 0




A New Family of Conjugate Gradient Methods for Large-scale Unconstrained Optimization

Author: Ibrahim Jusoh
Publisher:
ISBN:
Category :
Languages : en
Pages :




Nonlinear Conjugate Gradient Methods for Unconstrained Optimization

Author: Neculai Andrei
Publisher: Springer
ISBN: 9783030429492
Category : Mathematics
Languages : en
Pages : 486


Book Description
Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed.

Topics studied in detail include linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid methods, modifications of the standard scheme, memoryless BFGS preconditioned methods, and three-term methods. Conjugate gradient methods that cluster the eigenvalues or minimize the condition number of the iteration matrix are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms is developed with a clear, rigorous, and friendly exposition; readers will gain an understanding of the methods' properties and convergence, and will learn to develop and prove the convergence of their own methods. Numerous numerical studies are supplied, with comparisons and comments on the behavior of conjugate gradient algorithms, for a collection of 800 unconstrained optimization problems of different structures and complexities with the number of variables in the range [1000, 10000].

The book is addressed to all those interested in developing and using new advanced techniques for solving complex unconstrained optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering and industry, as well as graduate students in mathematics and Ph.D. and master's students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems by conjugate gradient methods.
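The standard scheme the book describes can be sketched in a few lines. The following is a minimal illustration, not a method from the book itself: a Polak-Ribiere-Polyak (PRP+) nonlinear conjugate gradient loop with a simple Armijo backtracking line search, tried on the Rosenbrock function. The function names and parameter values are illustrative choices, not prescriptions from the source.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=10000):
    # Polak-Ribiere-Polyak (PRP+) nonlinear CG with Armijo backtracking.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along the descent direction d
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, gd = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * gd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ conjugate parameter, truncated at zero (an implicit restart)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        if g_new @ d >= 0:  # safeguard: fall back to steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Rosenbrock function, minimizer at (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star = prp_cg(f, grad, np.array([-1.2, 1.0]))
```

The truncation `max(0.0, ...)` is one of the restart-style modifications of the standard scheme that books in this area analyze; without it, plain PRP can generate ascent directions on nonconvex functions.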

Preconditioned Conjugate-gradient-type Methods for Large-scale Unconstrained Optimization

Author: Mary Catherine Fenelon
Publisher:
ISBN:
Category : Large scale systems
Languages : en
Pages : 288




Gradient Type Methods for Large Scale Optimization

Author: Mahboubeh Farid
Publisher: LAP Lambert Academic Publishing
ISBN: 9783844319682
Category :
Languages : en
Pages : 104


Book Description
The focus of this book is on finding the unconstrained minimizer of a function. Specifically, it focuses on the Barzilai-Borwein (BB) method, a well-known two-point stepsize gradient method. Owing to its simplicity, low storage requirements, and numerical efficiency, the BB method has received a good deal of attention in the optimization community. Despite these advances, the BB stepsize is computed from a simple approximation of the Hessian by a scalar multiple of the identity; moreover, the method is not monotone, and it is not easy to generalize to general nonlinear functions. To address these deficiencies, the book introduces new gradient-type methods in the framework of the BB method, including a new gradient method based on the weak secant equation, an improved Hessian approximation, and a scaled diagonal updating scheme. The proposed methods approximate the Hessian by a diagonal matrix. Combined with monotone strategies, the resulting algorithms belong to the class of monotone gradient methods with global convergence. Numerical results suggest that for non-quadratic minimization problems the new methods clearly outperform the BB method.
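The classical BB method that this work builds on is compact enough to show directly. Below is a minimal sketch (not the book's improved diagonal-updating methods): the BB1 stepsize s's / s'y, which amounts to approximating the Hessian by a scalar multiple of the identity, applied to a strictly convex quadratic. The initial step and safeguard are illustrative choices.

```python
import numpy as np

def bb_gradient(grad, x0, tol=1e-8, max_iter=1000):
    # Barzilai-Borwein two-point stepsize gradient method (BB1 variant).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-4  # small initial steepest-descent step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 0:                  # curvature safeguard
            alpha = (s @ s) / (s @ y)  # BB1 stepsize: Hessian ~ (1/alpha) * I
        x, g = x_new, g_new
    return x

# Strictly convex quadratic f(x) = 0.5 x'Ax, minimizer at the origin
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
x_star = bb_gradient(grad, np.array([1.0, 1.0, 1.0]))
```

Note that the iteration is nonmonotone, exactly the deficiency the description mentions: the objective can increase between iterations even though the method converges on quadratics.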

Conjugate Gradient Algorithms in Nonconvex Optimization

Author: Radoslaw Pytlak
Publisher: Springer
ISBN: 9783540856337
Category : Mathematics
Languages : en
Pages : 478


Book Description
This book details algorithms for large-scale unconstrained and bound-constrained optimization. It presents optimization techniques from the perspective of conjugate gradient algorithms, along with the methods of shortest residuals developed by the author.

Conjugate Gradient Algorithms and Finite Element Methods

Author: Michal Krizek
Publisher: Springer Science & Business Media
ISBN: 3642185606
Category : Science
Languages : en
Pages : 405


Book Description
The position taken in this collection of pedagogically written essays is that conjugate gradient algorithms and finite element methods complement each other extremely well. By combining them, practitioners have been able to solve complicated direct and inverse multidimensional problems modeled by ordinary or partial differential equations and inequalities, not necessarily linear, with optimal control and optimal design among these problems. The aim of this book is to present both methods in the context of complicated problems modeled by linear and nonlinear partial differential equations, and to provide an in-depth discussion of their implementation. The authors show that conjugate gradient methods and finite element methods apply to the solution of real-life problems. The book addresses graduate students as well as experts in scientific computing.

Conjugate gradient methods for large scale nonlinear optimization

Author: Stanford University. Systems Optimization Laboratory
Publisher:
ISBN:
Category : Algorithms
Languages : en
Pages : 64


Book Description
In this paper we discuss several recent conjugate-gradient-type methods for solving large-scale nonlinear optimization problems. We demonstrate how the performance of these methods can be significantly improved by careful implementation. A method based upon iterative preconditioning is suggested that performs reasonably efficiently on a wide variety of significant test problems. Our results indicate that nonlinear conjugate-gradient methods behave similarly to conjugate-gradient methods for the solution of systems of linear equations: these methods work best on problems whose Hessian matrices have sets of clustered eigenvalues. On more general problems, however, even the best method may require a prohibitively large number of iterations. We present numerical evidence indicating that using theoretical analysis to predict the performance of algorithms on general problems is not straightforward.
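The clustered-eigenvalue behavior the abstract refers to is easy to demonstrate in the linear case, where theory says CG terminates in at most k iterations if the matrix has k distinct eigenvalues. The sketch below (an illustration, not the paper's method) builds a 50x50 symmetric positive definite matrix with only three distinct eigenvalues and counts the CG iterations needed.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    # Linear CG for A x = b, A symmetric positive definite.
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b.copy()  # residual b - A x (x starts at zero)
    d = r.copy()
    iters = 0
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        x += alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d
        r = r_new
        iters += 1
    return x, iters

# SPD matrix with exactly 3 distinct eigenvalues (1, 5, 9): in exact
# arithmetic CG terminates in at most 3 iterations.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
eigs = np.repeat([1.0, 5.0, 9.0], [20, 15, 15])
A = Q @ np.diag(eigs) @ Q.T
A = (A + A.T) / 2  # symmetrize against rounding error
b = rng.standard_normal(50)
x, iters = conjugate_gradient(A, b)
```

With spread-out eigenvalues the same code needs many more iterations, which is the linear-algebra analogue of the nonlinear behavior the paper reports.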

A Family of Hybrid Conjugate Gradient Method with Restart Procedure for Unconstrained Optimizations and Image Restorations

Author: Xianzhen Jiang
Publisher:
ISBN:
Category :
Languages : en
Pages : 0


Book Description
The conjugate gradient method is one of the most effective methods for solving large-scale optimization problems. Based on the CD conjugate parameter and an improved PRP conjugate parameter, a modified conjugate gradient method with a single parameter can be designed. To improve its convergence properties and computational efficiency, this conjugate parameter is further improved by using a hybrid technique in its denominator, and a restart procedure is set in its search direction. Accordingly, a family of hybrid conjugate gradient methods with a restart procedure is established in this paper, whose search direction satisfies the sufficient descent condition at each iteration without depending on any choice of line search criterion. Under the usual assumptions, and using the weak Wolfe line search criterion to generate the steplengths, the global convergence of the proposed family is proved. Finally, a specific algorithm from this family is chosen to solve large-scale unconstrained optimization problems and image restoration; all the numerical results show that the new algorithm is effective.
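The paper's specific hybrid parameter is not reproduced here, but the general shape of a restart procedure in the search direction can be sketched. The snippet below is a hypothetical illustration using the classical Powell restart test combined with a PRP-type parameter: the previous direction is discarded whenever consecutive gradients lose approximate orthogonality, and a steepest-descent fallback guarantees a descent direction.

```python
import numpy as np

def hybrid_direction(g_new, g, d, mu=0.2):
    # Search-direction update with a Powell-style restart test.
    # Restart: drop the old direction when consecutive gradients
    # are far from orthogonal.
    if abs(g_new @ g) >= mu * (g_new @ g_new):
        return -g_new  # restart with steepest descent
    beta = g_new @ (g_new - g) / (g @ g)  # PRP-type conjugate parameter
    d_new = -g_new + beta * d
    # Fall back to steepest descent if d_new is not a descent direction,
    # so descent holds regardless of the line search used.
    return d_new if g_new @ d_new < 0 else -g_new

# One direction update with illustrative gradient values
g_prev = np.array([1.0, -2.0])
g_curr = np.array([0.5, 0.1])
d_prev = np.array([-1.0, 2.0])
d = hybrid_direction(g_curr, g_prev, d_prev)
```

By construction, the returned direction always satisfies g'd < 0, mirroring the descent property the paper establishes for its family independently of the line search.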
