A Family of Sparsity-Promoting Gradient Descent Algorithms Based on Sparse Signal Recovery

Author: Ching-Hua Lee
Publisher:
ISBN:
Category:
Languages: en
Pages: 168

Book Description
Sparsity has played an important role in numerous signal processing systems. By leveraging sparse representations of signals, many batch estimation algorithms and methods have been developed that are efficient, robust, and effective for practical engineering problems. However, gradient descent-based approaches, which are less computationally expensive, have become essential to the development of modern machine learning systems, especially deep neural networks (DNNs). This dissertation examines how sparsity principles can be incorporated into gradient-based learning algorithms, in both signal processing and machine learning applications, for improved estimation and optimization performance.

On the signal processing side, we study how to take advantage of sparsity in the system response to improve the convergence rate of the least mean square (LMS) family of adaptive filters, which are derived by applying gradient descent to the mean square error objective function. Based on iterative reweighting sparse signal recovery (SSR) techniques, we propose a novel framework for deriving a class of sparsity-aware LMS algorithms by adopting an affine scaling transformation (AST) methodology in the algorithm design process. The Sparsity-promoting LMS (SLMS) and Sparsity-promoting Normalized LMS (SNLMS) algorithms are introduced, which take advantage of, though do not strictly enforce, any sparsity of the underlying system to speed up convergence. In addition, the reweighting-AST framework is applied to the conjugate gradient (CG) class of adaptive algorithms, which in general exhibit a much higher convergence rate than the LMS family. The resulting Sparsity-promoting CG (SCG) algorithm likewise demonstrates improved convergence characteristics for sparse system identification. Finally, the proposed algorithms are applied to the real-world problem of acoustic feedback reduction in hearing aids.

On the machine learning side, we investigate how to exploit SSR techniques in gradient-based optimization algorithms for learning compact representations in nonlinear estimation tasks, especially with overparameterized models. In particular, the reweighting-AST framework is used to estimate a regularized solution exhibiting desired properties such as sparsity without having to add a regularization penalty to the objective. The resulting algorithms in general have a weighted gradient term in the update equation, where the weighting matrix provides implicit regularization. We start by establishing a general framework that can potentially extend to various regularizers and then focus on sparsity regularization. As notable applications of nonlinear model sparsification, we propose i) Sparsity-promoting Stochastic Gradient Descent (SSGD) algorithms for DNN compression and ii) Sparsity-promoting Kernel LMS (SKLMS) and Sparsity-promoting Kernel NLMS (SKNLMS) algorithms for dictionary pruning in kernel methods.
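The exact SLMS/SNLMS updates are given in the dissertation; the following Python sketch only illustrates the weighted-gradient idea described above, scaling each LMS coefficient update by a reweighting-style diagonal factor built from the current coefficient magnitudes. The scaling rule, step size, and toy sparse system are illustrative assumptions, not the algorithms proposed in the dissertation.

```python
import numpy as np

def sparsity_promoting_lms(x, d, num_taps, mu=0.5, eps=1e-3):
    """Illustrative sparsity-aware LMS for sparse system identification.

    Standard LMS:    w <- w + mu * e * u
    Weighted sketch: w <- w + mu * (s * e * u), where s is a normalized,
    reweighting-style vector of current coefficient magnitudes.  This is
    only a sketch of the weighted-gradient idea, not the exact SLMS/SNLMS
    updates derived in the dissertation.
    """
    w = np.zeros(num_taps)
    for k in range(num_taps - 1, len(x)):
        u = x[k - num_taps + 1:k + 1][::-1]   # regressor [x[k], ..., x[k-N+1]]
        e = d[k] - w @ u                      # a priori estimation error
        s = np.abs(w) + eps                   # reweighting-style magnitudes
        s = s / s.sum()                       # normalized diagonal scaling
        w = w + mu * s * e * u                # weighted-gradient update
    return w

# Toy experiment: identify a sparse FIR system from noisy observations.
rng = np.random.default_rng(0)
h = np.zeros(32)
h[[2, 11, 25]] = [1.0, -0.5, 0.3]            # sparse "unknown" system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = sparsity_promoting_lms(x, d, num_taps=32)
print("largest estimated taps:", np.argsort(np.abs(w_hat))[-3:])
```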

Algorithms for Sparsity-Constrained Optimization

Author: Sohail Bahmani
Publisher: Springer Science & Business Media
ISBN: 3319018817
Category: Technology & Engineering
Languages: en
Pages: 124

Book Description
This thesis demonstrates techniques that provide faster and more accurate solutions to a variety of problems in machine learning and signal processing. The author proposes a greedy algorithm that derives sparse solutions with guarantees of optimality, removing many of the inaccuracies that occurred with previous models.
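The description does not spell out the algorithm itself; as a hedged illustration of greedy sparsity-constrained optimization, the sketch below uses iterative hard thresholding (a gradient step followed by projection onto k-sparse vectors), which is representative of this family but not necessarily the specific method proposed in the thesis.

```python
import numpy as np

def iterative_hard_thresholding(A, y, k, num_iters=1000):
    """Greedy-style solver for  min ||y - A x||^2  subject to  ||x||_0 <= k.

    Gradient step followed by projection onto the set of k-sparse vectors.
    Representative of the greedy sparsity-constrained family, not
    necessarily the specific algorithm proposed in the thesis.
    """
    n = A.shape[1]
    step = 1.0 / np.linalg.norm(A, 2) ** 2    # conservative step size
    x = np.zeros(n)
    for _ in range(num_iters):
        x = x + step * A.T @ (y - A @ x)      # gradient step on the residual
        support = np.argsort(np.abs(x))[-k:]  # indices of the k largest entries
        pruned = np.zeros(n)
        pruned[support] = x[support]
        x = pruned                            # hard-thresholding projection
    return x

# Toy compressed-sensing-style example with a 5-sparse ground truth.
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 200)) / np.sqrt(60)
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = rng.standard_normal(5)
x_hat = iterative_hard_thresholding(A, A @ x_true, k=5)
print("estimated support:", sorted(np.flatnonzero(x_hat).tolist()))
print("true support:     ", sorted(np.flatnonzero(x_true).tolist()))
```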

Optimization with Sparsity-Inducing Penalties

Author: Francis Bach
Publisher:
ISBN: 9781601985101
Category: Computers
Languages: en
Pages: 124

Book Description
Sparse estimation methods are aimed at using or obtaining parsimonious representations of data or models. They were first dedicated to linear variable selection but numerous extensions have now emerged such as structured sparsity or kernel selection. It turns out that many of the related estimation problems can be cast as convex optimization problems by regularizing the empirical risk with appropriate nonsmooth norms. Optimization with Sparsity-Inducing Penalties presents optimization tools and techniques dedicated to such sparsity-inducing penalties from a general perspective. It covers proximal methods, block-coordinate descent, reweighted ℓ2-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provides an extensive set of experiments to compare various algorithms from a computational point of view. The presentation of Optimization with Sparsity-Inducing Penalties is essentially based on existing literature, but the process of constructing a general framework leads naturally to new results, connections and points of view. It is an ideal reference on the topic for anyone working in machine learning and related areas.
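As a minimal example of the proximal machinery the monograph covers, the following sketch implements ISTA (proximal gradient descent with elementwise soft-thresholding) for ℓ1-penalized least squares; the data, penalty level, and iteration count are illustrative choices only.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, num_iters=500):
    """Proximal gradient method (ISTA) for  min 0.5*||y - A x||^2 + lam*||x||_1.

    A minimal sketch of the proximal machinery surveyed in the monograph;
    accelerated (FISTA), block-coordinate, and working-set variants all
    improve on this basic scheme.
    """
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - y)               # gradient of the least-squares term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy sparse regression problem.
rng = np.random.default_rng(2)
A = rng.standard_normal((100, 300))
x_true = np.zeros(300)
x_true[:8] = rng.standard_normal(8)
y = A @ x_true + 0.05 * rng.standard_normal(100)
lam = 0.1 * np.max(np.abs(A.T @ y))            # a common heuristic penalty level
x_hat = ista(A, y, lam)
print("nonzero coefficients in the estimate:", np.count_nonzero(x_hat))
```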

Algorithm Development for Sparse Signal Recovery and Performance Limits Using Multiple-user Information Theory

Author: Yuzhe Jin
Publisher:
ISBN: 9781124705613
Category:
Languages: en
Pages: 224

Book Description


Sparsity Methods for Systems and Control

Author: Masaaki Nagahara
Publisher:
ISBN: 9781680837247
Category:
Languages: en
Pages: 220

Book Description
Sparsity methods have been attracting a lot of attention not only in signal processing, machine learning, and statistics, but also in systems and control. The approach is known as compressed sensing, compressive sampling, sparse representation, or sparse modeling. More recently, sparsity methods have been applied to systems and control to design resource-aware control systems. This book gives a comprehensive guide to sparsity methods for systems and control, from standard sparsity methods in finite-dimensional vector spaces (Part I) to optimal control methods in infinite-dimensional function spaces (Part II). The primary objective of this book is to show how to use sparsity methods for several engineering problems. To this end, the author provides MATLAB programs with which readers can try sparsity methods for themselves; running these programs leads to a deeper understanding of the material. Sparsity Methods for Systems and Control is suitable for graduate-level university courses, though it should also be comprehensible to undergraduate students who have a basic knowledge of linear algebra and elementary calculus. Part II in particular should appeal to professional researchers and engineers who are interested in applying sparsity methods to systems and control.
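The book's companion programs are in MATLAB; as a language-neutral taste of the Part I material, here is a hedged Python sketch of basis pursuit (minimum-ℓ1 recovery from underdetermined linear measurements) written as a linear program. The measurement setup is a made-up toy problem, not an example from the book.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Basis pursuit:  min ||x||_1  subject to  A x = y,  solved as a linear program.

    Split x = u - v with u, v >= 0, so that ||x||_1 = sum(u) + sum(v).
    """
    m, n = A.shape
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v

# Toy underdetermined system with a 4-sparse ground truth.
rng = np.random.default_rng(3)
A = rng.standard_normal((30, 100))
x_true = np.zeros(100)
x_true[rng.choice(100, 4, replace=False)] = [1.0, -2.0, 0.5, 1.5]
x_hat = basis_pursuit(A, A @ x_true)
print("max reconstruction error:", np.max(np.abs(x_hat - x_true)))
```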

Sparsity Pattern Recovery in Compressed Sensing

Author: Galen Reeves
Publisher:
ISBN:
Category:
Languages: en
Pages: 250

Book Description
The problem of recovering sparse signals from a limited number of measurements is now ubiquitous in signal processing, statistics, and machine learning. A natural question of fundamental interest is what can and cannot be recovered in the presence of noise. This thesis provides a sharp characterization for the task of sparsity pattern recovery (also known as support recovery). Using tools from information theory, we find a sharp separation into two problem regimes: one in which the problem is fundamentally noise-limited, and a more interesting one in which the problem is limited by the behavior of the sparse components themselves. This analysis allows us to identify settings where existing computationally efficient algorithms, such as the LASSO or approximate message passing, are optimal, as well as other settings where these algorithms are highly suboptimal. We compare our results to predictions of phase transitions made via the powerful but heuristic replica method, and find that our rigorous bounds confirm some of these predictions. The remainder of the thesis explores extensions of our bounds to various scenarios. We consider specially structured sampling matrices and show how such additional structure can make a key difference, analogous to the role of diversity in wireless communications. Finally, we illustrate how the new bounding techniques introduced in this thesis can be used to establish information-theoretic secrecy results for certain communication channel models that involve eavesdroppers.
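To make the support-recovery task concrete, the snippet below sets up a noisy sparse linear model and estimates the sparsity pattern with the LASSO, one of the efficient algorithms whose optimality regimes the thesis characterizes. The dimensions, noise level, penalty, and threshold are illustrative assumptions; the snippet does not reproduce the thesis's information-theoretic analysis.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Noisy sparse linear model: which of the d components are truly nonzero?
rng = np.random.default_rng(4)
n, d, k = 150, 400, 6
A = rng.standard_normal((n, d))
support_true = np.sort(rng.choice(d, k, replace=False))
x_true = np.zeros(d)
x_true[support_true] = rng.choice([-1.0, 1.0], k) * rng.uniform(0.5, 1.5, k)
y = A @ x_true + 0.05 * rng.standard_normal(n)

# LASSO estimate of the support; the penalty and the 0.1 threshold are
# illustrative choices, not tuned or theoretically optimal values.
lasso = Lasso(alpha=0.1, fit_intercept=False).fit(A, y)
support_hat = np.flatnonzero(np.abs(lasso.coef_) > 0.1)
print("true support:     ", support_true.tolist())
print("estimated support:", support_hat.tolist())
```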

Sparse Recovery

Author: Meng Wang
Publisher:
ISBN:
Category:
Languages: en
Pages: 182

Book Description


Sparse Modeling for Image and Vision Processing

Author: Julien Mairal
Publisher: Now Publishers
ISBN: 9781680830088
Category: Computers
Languages: en
Pages: 216

Book Description
Sparse Modeling for Image and Vision Processing offers a self-contained view of sparse modeling for visual recognition and image processing. More specifically, it focuses on applications where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts.
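As a minimal, hedged sketch of data-adapted dictionary learning in this spirit, the snippet below learns an overcomplete dictionary on scikit-learn's built-in 8x8 digit images (standing in for the image patches used in practice) and encodes each image as a sparse combination of the learned atoms; all parameter choices are illustrative.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import MiniBatchDictionaryLearning

# Learn a dictionary adapted to the data, then represent each sample as a
# sparse combination of its atoms.  The 8x8 digit images shipped with
# scikit-learn stand in for real image patches.
X = load_digits().data                      # (1797, 64) images as vectors
X = X - X.mean(axis=0)                      # center the data

dico = MiniBatchDictionaryLearning(
    n_components=100,                       # overcomplete: 100 atoms for 64-dim data
    alpha=1.0,                              # sparsity level used during learning
    batch_size=64,
    random_state=0,
)
codes = dico.fit(X).transform(X)            # sparse codes, shape (1797, 100)
print("average nonzeros per code:", np.count_nonzero(codes, axis=1).mean())
```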

Statistical Learning with Sparsity

Author: Trevor Hastie
Publisher: CRC Press
ISBN: 1498712177
Category: Business & Economics
Languages: en
Pages: 354

Book Description
Discover New Methods for Dealing with High-Dimensional Data. A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.
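A small, self-contained illustration of the book's central object: as the lasso penalty grows, the fitted model keeps fewer and fewer nonzero coefficients. The synthetic data below (only 5 of 50 predictors matter) are an assumption made purely for this demonstration.

```python
import numpy as np
from sklearn.linear_model import lasso_path

# Synthetic regression problem in which only the first 5 predictors matter.
rng = np.random.default_rng(5)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, -1.0, 0.5]
y = X @ beta + rng.standard_normal(n)

# The lasso path: sparsity of the fitted model as a function of the penalty.
alphas, coefs, _ = lasso_path(X, y, n_alphas=20)
for a, c in zip(alphas, coefs.T):
    print(f"alpha={a:8.4f}   nonzero coefficients={np.count_nonzero(c)}")
```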

The Sparse Fourier Transform

Author: Haitham Hassanieh
Publisher: Morgan & Claypool
ISBN: 1947487051
Category: Computers
Languages: en
Pages: 279

Book Description
The Fourier transform is one of the most fundamental tools for computing the frequency representation of signals. It plays a central role in signal processing, communications, audio and video compression, medical imaging, genomics, astronomy, and many other areas. Because of its widespread use, fast algorithms for computing the Fourier transform can benefit a large number of applications. The fastest algorithm for computing the Fourier transform is the Fast Fourier Transform (FFT), which runs in near-linear time, making it an indispensable tool for many applications. Today, however, the runtime of the FFT is no longer fast enough, especially for big data problems where each dataset can be a few terabytes. Hence, faster algorithms that run in sublinear time, i.e., that do not even sample all the data points, have become necessary. This book addresses this problem by developing Sparse Fourier Transform algorithms and building practical systems that use these algorithms to solve key problems in six different applications: wireless networks; mobile systems; computer graphics; medical imaging; biochemistry; and digital circuits. This is a revised version of the thesis that won the 2016 ACM Doctoral Dissertation Award.
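The sublinear-time algorithms themselves are the subject of the book and are not reproduced here; the sketch below only illustrates the assumption they exploit, namely that a signal with k dominant frequencies is essentially determined by its k largest FFT coefficients. It uses the ordinary full FFT and a synthetic signal, so it is not sublinear in any sense.

```python
import numpy as np

# Build a signal that is (approximately) k-sparse in the frequency domain,
# then reconstruct it from only its k largest FFT coefficients.
n, k = 4096, 5
rng = np.random.default_rng(6)
freqs = rng.choice(n, k, replace=False)
amps = rng.uniform(1.0, 3.0, k)
t = np.arange(n)
x = sum(a * np.exp(2j * np.pi * f * t / n) for a, f in zip(amps, freqs))
x += 0.01 * rng.standard_normal(n)            # small noise

X = np.fft.fft(x) / n
top_k = np.argsort(np.abs(X))[-k:]            # k largest coefficients
X_sparse = np.zeros(n, dtype=complex)
X_sparse[top_k] = X[top_k]
x_rec = np.fft.ifft(X_sparse) * n             # rebuild from k coefficients only

print("recovered frequencies:", sorted(top_k.tolist()), "true:", sorted(freqs.tolist()))
print("relative reconstruction error:", np.linalg.norm(x - x_rec) / np.linalg.norm(x))
```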