A Frank-Wolfe/gradient Projection Method for Large Scale Optimization

A Frank-Wolfe/gradient Projection Method for Large Scale Optimization PDF Author: Institute for Defense Analyses. Supercomputing Research Center
Publisher:
ISBN:
Category : Algorithms
Languages : en
Pages : 31


Book Description
We further show that in the case of a quadratic objective, a Fletcher-Reeves type conjugate gradient modification for manifold suboptimization results in the algorithm converging to a nondegenerate solution point in a finite number of iterations. Numerical results run on the Sun4 and a single processor of the Cray2 are provided for discrete optimal control problems with a large number (up to 10,000) of control variables and compared against existing results.
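
The report's own algorithm is not reproduced here, but the Frank-Wolfe (conditional gradient) step it builds on can be sketched for a box-constrained quadratic, where the linear subproblem is solved just by inspecting the sign of each gradient component. This is a generic illustration under assumed names and a toy problem, not the report's method:

```python
import numpy as np

def frank_wolfe_box(Q, c, lo, hi, iters=200):
    """Minimize 0.5*x@Q@x + c@x over the box lo <= x <= hi with the
    classic Frank-Wolfe (conditional gradient) method."""
    x = (lo + hi) / 2.0                      # feasible starting point
    for _ in range(iters):
        g = Q @ x + c                        # gradient of the quadratic
        s = np.where(g > 0, lo, hi)          # linear oracle: a vertex of the box
        d = s - x                            # feasible descent direction
        gd = g @ d
        if gd >= -1e-12:                     # Frank-Wolfe gap ~ 0: stationary
            break
        denom = d @ Q @ d
        # exact line search for a quadratic, clipped to [0, 1]
        gamma = min(1.0, -gd / denom) if denom > 0 else 1.0
        x = x + gamma * d
    return x
```

On a separable quadratic over the unit box, the iteration reaches the solution (interior in one coordinate, at the boundary in the other) in a handful of steps.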

A Scaled Gradient Projection Method for Large Scale Optimization

A Scaled Gradient Projection Method for Large Scale Optimization PDF Author: Gerard G. L. Meyer
Publisher:
ISBN:
Category : Control theory
Languages : en
Pages : 26


Book Description
Abstract: "We propose a new parametrized gradient projection algorithm for solving constrained large scale optimization problems and, in particular, discrete optimal control problems with linear constraints. We demonstrate that an appropriate choice of parameters controls the behavior of the proposed algorithm between that of the well-known Frank-Wolfe and Rosen methods. We investigate the identification of those algorithm parameters that result in fast convergence to the solution by allowing many constraints to be added or dropped from the active set at each iteration. We show that an acceleration step based on the Fletcher-Reeves method can be easily added, and numerical results are provided for discrete optimal control problems with a large number (up to 10000) of control variables."
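
The abstract describes a family interpolating between the Frank-Wolfe and Rosen methods. The gradient-projection endpoint of that spectrum can be sketched for the simple case of box constraints, where projection reduces to componentwise clipping. This is an illustrative sketch, not the authors' parametrized scheme:

```python
import numpy as np

def projected_gradient_box(Q, c, lo, hi, alpha=None, iters=500):
    """Minimize 0.5*x@Q@x + c@x over lo <= x <= hi by gradient
    projection: step along the negative gradient, then clip back
    onto the box (the Euclidean projection for box constraints)."""
    if alpha is None:
        alpha = 1.0 / np.linalg.norm(Q, 2)       # 1/L stepsize for a quadratic
    x = np.clip(np.zeros_like(c), lo, hi)
    for _ in range(iters):
        g = Q @ x + c
        x_new = np.clip(x - alpha * g, lo, hi)   # projection = clipping
        if np.linalg.norm(x_new - x) < 1e-10:    # fixed point: stationary
            break
        x = x_new
    return x
```

Note that, unlike a Frank-Wolfe step, a single projected-gradient step can move many constraints into or out of the active set at once, which is exactly the behavior the abstract exploits.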

A Parallel Frank-Wolfe/gradient Projection Method for Optimal Control

A Parallel Frank-Wolfe/gradient Projection Method for Optimal Control PDF Author: Gerard G. L. Meyer
Publisher:
ISBN:
Category : Parallel processing (Electronic computers)
Languages : en
Pages : 30


Book Description
Abstract: "We propose a new parametrized gradient projection algorithm for solving constrained large scale optimization problems, and in particular, discrete optimal control problems with linear constraints. We show that an adaptive choice of parameters results in considerable decrease in number of iterations required to reach a predetermined solution neighborhood, and that the algorithm can be implemented in parallel resulting in a further decrease in computational time. We demonstrate the efficiency of the approach by comparing implementations on both the Cray 2 and Connection Machine 2. Numerical results are provided in solving discrete optimal control problems with very high dimensionality (up to 2,000,000 variables)."

Large-scale Numerical Optimization

Large-scale Numerical Optimization PDF Author: Thomas Frederick Coleman
Publisher: SIAM
ISBN: 9780898712681
Category : Mathematics
Languages : en
Pages : 278


Book Description
Papers from a workshop held at Cornell University, Oct. 1989, and sponsored by Cornell's Mathematical Sciences Institute.

Handbook of Optimization in Telecommunications

Handbook of Optimization in Telecommunications PDF Author: Mauricio G.C. Resende
Publisher: Springer Science & Business Media
ISBN: 0387301658
Category : Mathematics
Languages : en
Pages : 1120


Book Description
This comprehensive handbook brings together experts who use optimization to solve problems that arise in telecommunications. It is the first book to cover in detail the field of optimization in telecommunications. Recent optimization developments that are frequently applied to telecommunications are covered. The spectrum of topics covered includes planning and design of telecommunication networks, routing, network protection, grooming, restoration, wireless communications, network location and assignment problems, Internet protocol, World Wide Web, and stochastic issues in telecommunications. The book’s objective is to provide a reference tool for the increasing number of scientists and engineers in telecommunications who depend upon optimization.

Large-Scale and Distributed Optimization

Large-Scale and Distributed Optimization PDF Author: Pontus Giselsson
Publisher: Springer
ISBN: 3319974785
Category : Mathematics
Languages : en
Pages : 416


Book Description
This book presents tools and methods for large-scale and distributed optimization. Since many methods in "Big Data" fields rely on solving large-scale optimization problems, often in a distributed fashion, the topic has emerged over the last decade as one of major importance. As well as specific coverage of this active research field, the book serves as a powerful source of information for practitioners as well as theoreticians. Large-Scale and Distributed Optimization is a unique combination of contributions from leading experts in the field, who were speakers at the LCCC Focus Period on Large-Scale and Distributed Optimization, held in Lund, 14th–16th June 2017. A source of information and innovative ideas for current and future research, this book will appeal to researchers, academics, and students who are interested in large-scale optimization.

First-Order Methods in Optimization

First-Order Methods in Optimization PDF Author: Amir Beck
Publisher: SIAM
ISBN: 1611974984
Category : Mathematics
Languages : en
Pages : 476


Book Description
The primary goal of this book is to provide a self-contained, comprehensive study of the main first-order methods that are frequently used in solving large-scale problems. First-order methods exploit information on values and gradients/subgradients (but not Hessians) of the functions composing the model under consideration. With the increase in the number of applications that can be modeled as large or even huge-scale optimization problems, there has been a revived interest in using simple methods that require low iteration cost as well as low memory storage. The author has gathered, reorganized, and synthesized (in a unified manner) many results that are currently scattered throughout the literature, many of which cannot be typically found in optimization books. First-Order Methods in Optimization offers a comprehensive study of first-order methods with the theoretical foundations; provides plentiful examples and illustrations; emphasizes rates of convergence and complexity analysis of the main first-order methods used to solve large-scale problems; and covers both variable and functional decomposition methods.
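
A representative first-order method of the kind the book studies is the proximal gradient method, which uses only gradients of the smooth part plus a cheap prox step. A minimal sketch for l1-regularized least squares, where the prox is soft-thresholding (an illustration under assumed names, not code from the book):

```python
import numpy as np

def prox_grad_lasso(A, b, lam, step, iters=500):
    """Proximal gradient (ISTA) for min 0.5*||Ax - b||^2 + lam*||x||_1:
    a gradient step on the smooth least-squares term followed by
    soft-thresholding, the proximal operator of the l1 term."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)                # gradient of the smooth part
        z = x - step * g                     # forward (gradient) step
        # backward (prox) step: componentwise soft-thresholding
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x
```

Only matrix-vector products and componentwise operations per iteration: exactly the low iteration cost and low memory footprint the description emphasizes.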

Gradient Methods for Large-scale Nonlinear Optimization

Gradient Methods for Large-scale Nonlinear Optimization PDF Author: Hongchao Zhang
Publisher:
ISBN:
Category :
Languages : en
Pages :


Book Description
Finally, we propose a class of self-adaptive proximal point methods suitable for degenerate optimization problems where multiple minimizers may exist, or where the Hessian may be singular at a local minimizer. Two different acceptance criteria for an approximate solution to the proximal problem are analyzed, and the convergence rates are shown to be analogous to those of the exact iterates. The second part of this dissertation discusses using gradient methods to solve large-scale box constrained optimization. We first discuss gradient projection methods. Then, an active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. Under the strong second-order sufficient optimality condition, without assuming strict complementarity, the algorithm eventually reduces to unconstrained optimization without restarts. For strongly convex quadratic box constrained optimization, ASA is shown to have finite convergence when a conjugate gradient method is used in the unconstrained optimization step. A specific implementation of ASA is given, which exploits the cyclic Barzilai-Borwein algorithm for the gradient projection step and CG_DESCENT for unconstrained optimization. Numerical experiments using the box constrained problems in the CUTEr and MINPACK test problem libraries show that this new algorithm outperforms benchmark software such as GENCAN, L-BFGS-B, and TRON.
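
The proximal point idea mentioned at the start can be sketched for a quadratic whose Hessian is singular, the degenerate case the dissertation targets: each subproblem adds a quadratic regularizer, so the linear system it solves stays well-posed even though the original Hessian is not invertible. A generic sketch, not the dissertation's self-adaptive variant:

```python
import numpy as np

def proximal_point_quadratic(Q, c, mu=1.0, iters=200):
    """Proximal point iteration for min 0.5*x@Q@x + c@x with Q possibly
    singular (degenerate: a continuum of minimizers may exist).  Each
    step minimizes f(x) + (mu/2)*||x - x_k||^2, i.e. solves the
    regularized system (Q + mu*I) x_next = mu*x_k - c, which is
    well-posed even when Q itself is not invertible."""
    n = len(c)
    x = np.zeros(n)
    for _ in range(iters):
        x = np.linalg.solve(Q + mu * np.eye(n), mu * x - c)
    return x
```

On a rank-deficient quadratic the iteration converges to one particular minimizer (the one nearest the start in the flat directions) rather than diverging or failing on a singular solve.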

Modern Numerical Nonlinear Optimization

Modern Numerical Nonlinear Optimization PDF Author: Neculai Andrei
Publisher: Springer Nature
ISBN: 3031087208
Category : Mathematics
Languages : en
Pages : 824


Book Description
This book includes a thorough theoretical and computational analysis of unconstrained and constrained optimization algorithms and combines and integrates the most recent techniques and advanced computational linear algebra methods. Nonlinear optimization methods and techniques have reached their maturity, and an abundance of optimization algorithms are available for which both the convergence properties and the numerical performance are known. This clear, friendly, and rigorous exposition discusses the theory behind nonlinear optimization algorithms so that the reader understands their properties and convergence, enabling them to prove the convergence of their own algorithms. It covers the computational performance of the best-known modern nonlinear optimization algorithms on collections of unconstrained and constrained test problems with different structures and complexities, as well as on large-scale real applications. The book is addressed to all those interested in developing and using new advanced techniques for solving large-scale unconstrained or constrained complex optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, engineering, and industry, as well as graduate, master's, and Ph.D. students in mathematical programming, will find plenty of recent information and practical approaches for solving real large-scale optimization problems and applications.

Gradient Type Methods for Large Scale Optimization

Gradient Type Methods for Large Scale Optimization PDF Author: Mahboubeh Farid
Publisher: LAP Lambert Academic Publishing
ISBN: 9783844319682
Category :
Languages : en
Pages : 104


Book Description
The focus of this book is on finding the unconstrained minimizer of a function. Specifically, it focuses on the Barzilai and Borwein (BB) method, a well-known two-point stepsize gradient method. Owing to its simplicity, low storage requirements, and numerical efficiency, the BB method has received a good deal of attention in the optimization community. Despite these advances, however, the BB stepsize is computed from a simple approximation of the Hessian as a scalar multiple of the identity; moreover, the BB method is not monotone, and it is not easy to generalize to general nonlinear functions. To address these deficiencies, we introduce new gradient-type methods in the framework of the BB method, including a new gradient method via a weak secant equation, an improved Hessian approximation, and scaling of the diagonal updating. The proposed methods approximate the Hessian by a diagonal matrix. Incorporated with monotone strategies, the resulting algorithms belong to the class of monotone gradient methods with global convergence. Numerical results suggest that for non-quadratic minimization problems the new methods clearly outperform the BB method.
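
The baseline the book starts from, the BB two-point stepsize gradient method, is short enough to sketch in full: the stepsize s@s / s@y approximates the inverse Hessian by a scalar multiple of the identity, which is exactly the limitation the description points out. An illustrative sketch with assumed names, not the book's proposed diagonal-updating methods:

```python
import numpy as np

def bb_gradient(grad, x0, iters=100, alpha0=1e-3):
    """Barzilai-Borwein two-point stepsize gradient method.  The
    stepsize (s@s)/(s@y), with s = x_k - x_{k-1} and y = g_k - g_{k-1},
    comes from a scalar-times-identity secant approximation of the
    Hessian; the resulting iteration is typically nonmonotone."""
    x = x0.astype(float)
    g = grad(x)
    alpha = alpha0                       # small safeguarded first step
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-16:                   # safeguard: keep the stepsize positive
            alpha = (s @ s) / sy         # BB1 stepsize
        x, g = x_new, g_new
    return x
```

On an ill-conditioned convex quadratic the objective values oscillate (the nonmonotonicity mentioned above) while the iterates still converge to the minimizer.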