A Scaled Gradient Projection Method for Large Scale Optimization

A Scaled Gradient Projection Method for Large Scale Optimization PDF Author: Gerard G. L. Meyer
Publisher:
ISBN:
Category : Control theory
Languages : en
Pages : 26

Book Description
Abstract: "We propose a new parametrized gradient projection algorithm for solving constrained large scale optimization problems and, in particular, discrete optimal control problems with linear constraints. We demonstrate that an appropriate choice of parameters controls the behavior of the proposed algorithm between that of the well-known Frank-Wolfe and Rosen methods. We investigate the identification of those algorithm parameters that result in fast convergence to the solution by allowing many constraints to be added or dropped from the active set at each iteration. We show that an acceleration step based on the Fletcher-Reeves method can be easily added, and numerical results are provided for discrete optimal control problems with a large number (up to 10000) of control variables."

A Frank-Wolfe/gradient Projection Method for Large Scale Optimization

A Frank-Wolfe/gradient Projection Method for Large Scale Optimization PDF Author: Institute for Defense Analyses. Supercomputing Research Center
Publisher:
ISBN:
Category : Algorithms
Languages : en
Pages : 31

Book Description
We further show that in the case of a quadratic objective, a Fletcher-Reeves-type conjugate gradient modification for manifold suboptimization results in the algorithm converging to a nondegenerate solution point in a finite number of iterations. Numerical results, obtained on the Sun4 and on a single processor of the Cray2, are provided for discrete optimal control problems with a large number (up to 10,000) of control variables and compared against existing results.
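The Frank-Wolfe (conditional gradient) step that this report blends with gradient projection replaces the projection by a linear minimization over the feasible set. A minimal sketch for a box-constrained problem is given below; it is a generic illustration with assumed names and the classic 2/(k+2) stepsize, not the report's algorithm, and it omits the Fletcher-Reeves manifold suboptimization described above:

import numpy as np

def frank_wolfe_box(grad, x0, lo, hi, iters=200):
    # Conditional gradient method for min f(x) subject to lo <= x <= hi.
    # The linearized subproblem min_s grad(x)^T s over the box is solved by
    # picking s_i = lo_i where the gradient is positive and s_i = hi_i otherwise.
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        g = grad(x)
        s = np.where(g > 0, lo, hi)        # vertex minimizing the linearization
        gamma = 2.0 / (k + 2.0)            # classic diminishing stepsize
        x = x + gamma * (s - x)
    return x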

Large Scale Optimization

Large Scale Optimization PDF Author: William W. Hager
Publisher: Springer Science & Business Media
ISBN: 1461336325
Category : Mathematics
Languages : en
Pages : 470

Book Description
On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, the Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abroad.

Accurate modeling of scientific problems often leads to the formulation of large scale optimization problems involving thousands of continuous and/or discrete variables. Large scale optimization has seen a dramatic increase in activity in the past decade. This has been a natural consequence of new algorithmic developments and of the increased power of computers. For example, decomposition ideas proposed by G. Dantzig and P. Wolfe in the 1960s are now implementable in distributed processing systems, and today many optimization codes have been implemented on parallel machines.

Improved Scaled and Shifted Conjugate Gradient Methods for Large-scale Unconstrained Optimization

Improved Scaled and Shifted Conjugate Gradient Methods for Large-scale Unconstrained Optimization PDF Author: Amal Ahmed Al-Saidiyah
Publisher:
ISBN:
Category :
Languages : en
Pages : 0

Book Description


Gradient Type Methods for Large Scale Optimization

Gradient Type Methods for Large Scale Optimization PDF Author: Mahboubeh Farid
Publisher: LAP Lambert Academic Publishing
ISBN: 9783844319682
Category :
Languages : en
Pages : 104

Book Description
The focus of this book is on finding the unconstrained minimizer of a function. Specifically, we focus on the Barzilai-Borwein (BB) method, a well-known two-point stepsize gradient method. Owing to its simplicity, low storage requirements, and numerical efficiency, the BB method has received a good deal of attention in the optimization community. Despite these advances, the BB stepsize is computed from a simple approximation of the Hessian as a scalar multiple of the identity; moreover, the method is not monotone, and it is not easy to generalize to general nonlinear functions. To address these deficiencies, we introduce new gradient-type methods in the framework of the BB method, including a new gradient method via the weak secant equation, improved Hessian approximation, and scaling of the diagonal updating. The proposed methods approximate the Hessian by a diagonal matrix. Combined with monotone strategies, the resulting algorithms belong to the class of monotone gradient methods with global convergence. Numerical results suggest that, for non-quadratic minimization problems, the new methods clearly outperform the BB method.
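As a concrete reference for the discussion above, the BB method computes its stepsize from the last two iterates only: with s = x_k - x_{k-1} and y = g_k - g_{k-1}, the BB1 stepsize is alpha_k = (s^T s)/(s^T y), which corresponds to approximating the Hessian by a scalar multiple of the identity. The sketch below is a plain, non-monotone BB iteration with assumed names and an assumed quadratic test problem; it is the baseline the book improves on, not the book's proposed diagonal-updating methods:

import numpy as np

def bb_gradient(grad, x0, iters=100, alpha0=1e-3, eps=1e-8):
    # Plain (non-monotone) Barzilai-Borwein gradient method.
    # The first step uses a small fixed stepsize; afterwards
    # alpha_k = (s^T s)/(s^T y) with s = x_k - x_{k-1}, y = g_k - g_{k-1}.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if abs(sy) > eps else alpha0   # guard near-zero curvature
        x, g = x_new, g_new
        if np.linalg.norm(g) < eps:
            break
    return x

# Illustrative use on a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_min = bb_gradient(lambda x: A @ x - b, np.zeros(3))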

Gradient Methods for Large-scale Nonlinear Optimization

Gradient Methods for Large-scale Nonlinear Optimization PDF Author: Hongchao Zhang
Publisher:
ISBN:
Category :
Languages : en
Pages :

Book Description
Finally, we propose a class of self-adaptive proximal point methods suitable for degenerate optimization problems where multiple minimizers may exist, or where the Hessian may be singular at a local minimizer. Two different acceptance criteria for an approximate solution to the proximal problem are analyzed, and the convergence rates are analogous to those of exact iterates. The second part of this dissertation discusses using gradient methods to solve large-scale box constrained optimization. We first discuss gradient projection methods. Then, an active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. Under the strong second-order sufficient optimality condition, without assuming strict complementarity, the algorithm eventually reduces to unconstrained optimization without restarts. For strongly convex quadratic box constrained optimization, ASA is shown to have finite convergence when a conjugate gradient method is used in the unconstrained optimization step. A specific implementation of ASA is given, which exploits the cyclic Barzilai-Borwein algorithm for the gradient projection step and CG_DESCENT for unconstrained optimization. Numerical experiments using the box constrained problems in the CUTEr and MINPACK test problem libraries show that this new algorithm outperforms benchmark software such as GENCAN, L-BFGS-B, and TRON.
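The two-phase idea described above, switching between a gradient projection phase and unconstrained optimization over the free variables, can be illustrated with a small sketch. The following is not ASA, cyclic BB, or CG_DESCENT; the names, the active-set test, and the fixed stepsize are assumptions made only to show the structure of one "unconstrained phase" step for box constraints:

import numpy as np

def free_variable_step(grad, x, lo, hi, alpha=1e-2, tol=1e-10):
    # Treat a variable as active when it sits at a bound and the negative
    # gradient points outside the box; step only in the free variables,
    # then clip back onto the box for safety.
    g = grad(x)
    at_lo = (x <= lo + tol) & (g > 0)   # pinned at the lower bound
    at_hi = (x >= hi - tol) & (g < 0)   # pinned at the upper bound
    free = ~(at_lo | at_hi)
    d = np.where(free, -g, 0.0)         # descent direction in the free subspace
    x_new = np.clip(x + alpha * d, lo, hi)
    return x_new, free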

A Dual Gradient-projection Method for Large-scale Strictly Convex Quadratic Problems

A Dual Gradient-projection Method for Large-scale Strictly Convex Quadratic Problems PDF Author:
Publisher:
ISBN:
Category :
Languages : en
Pages :

Book Description


Nonlinear Programming

Nonlinear Programming PDF Author: Dimitri Bertsekas
Publisher: Athena Scientific
ISBN: 1886529051
Category : Mathematics
Languages : en
Pages : 1100

Book Description
This book provides a comprehensive and accessible presentation of algorithms for solving continuous optimization problems. It relies on rigorous mathematical analysis, but also aims at an intuitive exposition that makes use of visualization where possible. It places particular emphasis on modern developments and their widespread applications in fields such as large-scale resource allocation problems, signal processing, and machine learning. The 3rd edition brings the book into closer harmony with the companion works Convex Optimization Theory (Athena Scientific, 2009), Convex Optimization Algorithms (Athena Scientific, 2015), Convex Analysis and Optimization (Athena Scientific, 2003), and Network Optimization (Athena Scientific, 1998). These works are complementary in that they deal primarily with convex, possibly nondifferentiable, optimization problems and rely on convex analysis. By contrast, the nonlinear programming book focuses primarily on analytical and computational methods for possibly nonconvex differentiable problems. It relies primarily on calculus and variational analysis, yet it still contains a detailed presentation of duality theory and its uses for both convex and nonconvex problems. This on-line edition contains detailed solutions to all the theoretical book exercises. Among its special features, the book:
- Provides extensive coverage of iterative optimization methods within a unifying framework
- Covers in depth duality theory from both a variational and a geometric point of view
- Provides a detailed treatment of interior point methods for linear programming
- Includes much new material on a number of topics, such as proximal algorithms, alternating direction methods of multipliers, and conic programming
- Focuses on large-scale optimization topics of much current interest, such as first order methods, incremental methods, and distributed asynchronous computation, and their applications in machine learning, signal processing, neural network training, and big data applications
- Includes a large number of examples and exercises
- Was developed through extensive classroom use in first-year graduate courses

Large Scale Linear and Integer Optimization: A Unified Approach

Large Scale Linear and Integer Optimization: A Unified Approach PDF Author: Richard Kipp Martin
Publisher: Springer Science & Business Media
ISBN: 1461549752
Category : Business & Economics
Languages : en
Pages : 739

Book Description
This is a textbook about linear and integer linear optimization. There is a growing need in industries such as airline, trucking, and financial engineering to solve very large linear and integer linear optimization problems. Building these models requires uniquely trained individuals. Not only must they have a thorough understanding of the theory behind mathematical programming, they must have substantial knowledge of how to solve very large models in today's computing environment. The major goal of the book is to develop the theory of linear and integer linear optimization in a unified manner and then demonstrate how to use this theory in a modern computing environment to solve very large real world problems. After presenting introductory material in Part I, Part II of this book is devoted to the theory of linear and integer linear optimization. This theory is developed using two simple, but unifying ideas: projection and inverse projection. Through projection we take a system of linear inequalities and replace some of the variables with additional linear inequalities. Inverse projection, the dual of this process, involves replacing linear inequalities with additional variables. Fundamental results such as weak and strong duality, theorems of the alternative, complementary slackness, sensitivity analysis, finite basis theorems, etc. are all explained using projection or inverse projection. Indeed, a unique feature of this book is that these fundamental results are developed and explained before the simplex and interior point algorithms are presented.
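The projection idea described above is, in its simplest form, Fourier-Motzkin elimination: a variable is removed from a system of linear inequalities by pairing every lower bound on it with every upper bound. The sketch below is a generic illustration of that single elimination step, with assumed names; it is not code from the book:

import numpy as np

def fourier_motzkin_eliminate(A, b, k):
    # Eliminate variable k from the system A x <= b.
    # Rows with A[i, k] > 0 give upper bounds on x_k, rows with A[i, k] < 0
    # give lower bounds, and rows with A[i, k] == 0 are kept unchanged.
    # Each (lower bound, upper bound) pair combines into one inequality
    # in which the coefficient of x_k is zero.
    A, b = np.asarray(A, dtype=float), np.asarray(b, dtype=float)
    pos = [i for i in range(len(b)) if A[i, k] > 0]
    neg = [i for i in range(len(b)) if A[i, k] < 0]
    zero = [i for i in range(len(b)) if A[i, k] == 0]
    rows, rhs = [A[i] for i in zero], [b[i] for i in zero]
    for i in pos:
        for j in neg:
            rows.append(A[i] / A[i, k] - A[j] / A[j, k])
            rhs.append(b[i] / A[i, k] - b[j] / A[j, k])
    return np.array(rows), np.array(rhs)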

Introduction to Optimum Design

Introduction to Optimum Design PDF Author: Jasbir Singh Arora
Publisher: Elsevier
ISBN: 0128183217
Category : Technology & Engineering
Languages : en
Pages : 1121

Book Description
Introduction to Optimum Design, Fifth Edition is the most widely used textbook in engineering optimization and optimum design courses. It is intended for use in a first course on engineering design and optimization at the undergraduate or graduate level within engineering departments of all disciplines, but primarily within mechanical, aerospace and civil engineering. The text presents an organized approach to engineering design optimization in a rigorous yet simplified manner, illustrating various concepts and procedures with simple examples and demonstrating their applicability to engineering design problems. Formulation of a design problem as an optimization problem is emphasized and illustrated throughout the text. Excel and MATLAB are featured as learning and teaching aids. This new edition has been enhanced with new or expanded content in such areas as reliability-based optimization, metamodeling, design of experiments, robust design, nature-inspired metaheuristic search methods, and combinatorial optimization.
- Describes basic concepts of optimality conditions and numerical methods with simple and practical examples, making the material highly teachable and learnable
- Includes applications of optimization methods for structural, mechanical, aerospace, and industrial engineering problems
- Covers practical design examples and introduces students to the use of optimization methods
- Serves the needs of instructors who teach more advanced courses
- Features new or expanded content in such areas as design under uncertainty (reliability-based design optimization), metamodeling (response surface method), design of experiments, nature-inspired metaheuristic search methods, and robust design