Computational methods for sparse solution of linear inverse problems

JA Tropp, SJ Wright - Proceedings of the IEEE, 2010 - ieeexplore.ieee.org
The goal of the sparse approximation problem is to approximate a target signal using a
linear combination of a few elementary signals drawn from a fixed collection. This paper …
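Greedy pursuit is one of the method families such surveys cover; a minimal orthogonal matching pursuit sketch in NumPy (the dictionary `D`, sparsity level `k`, and toy data below are illustrative assumptions, not the paper's notation):

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: greedily select k columns (atoms) of D."""
    r, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ r)))        # atom most correlated with residual
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef               # re-fit on all chosen atoms
    return support, coef

rng = np.random.default_rng(1)
D = rng.standard_normal((100, 20))
D /= np.linalg.norm(D, axis=0)                     # unit-norm atoms
y = 2.0 * D[:, 3] - 1.0 * D[:, 7]                  # exactly 2-sparse target
support, coef = omp(D, y, 2)
print(sorted(support))                             # should recover atoms 3 and 7
```

Re-solving the least-squares fit over the whole selected support (rather than keeping old coefficients) is what distinguishes *orthogonal* matching pursuit from plain matching pursuit.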

Minimization of \ell_{1-2} for Compressed Sensing

P Yin, Y Lou, Q He, J Xin - SIAM Journal on Scientific Computing, 2015 - SIAM
We study minimization of the difference of \ell_1 and \ell_2 norms as a nonconvex and
Lipschitz continuous metric for solving constrained and unconstrained compressed sensing …
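A quick numerical illustration of why \ell_1 - \ell_2 acts as a sparsity metric (a hedged sketch, not the authors' minimization algorithm): the difference is exactly zero on 1-sparse vectors and grows as mass spreads across coordinates.

```python
import numpy as np

def l1_minus_l2(x):
    """Nonconvex sparsity metric ||x||_1 - ||x||_2."""
    return np.linalg.norm(x, 1) - np.linalg.norm(x, 2)

sparse = np.array([3.0, 0.0, 0.0, 0.0])   # 1-sparse: both norms equal 3
dense = np.array([1.5, 1.5, 1.5, 1.5])    # same l1 norm, spread-out mass
print(l1_minus_l2(sparse))                # 0.0
print(l1_minus_l2(dense))                 # 3.0 (= 6 - 3): penalized more
```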

NESTA: A fast and accurate first-order method for sparse recovery

S Becker, J Bobin, EJ Candès - SIAM Journal on Imaging Sciences, 2011 - SIAM
Accurate signal recovery or image reconstruction from indirect and possibly undersampled
data is a topic of considerable interest; for example, the literature in the recent field of …

Fixed point and Bregman iterative methods for matrix rank minimization

S Ma, D Goldfarb, L Chen - Mathematical Programming, 2011 - Springer
The linearly constrained matrix rank minimization problem is widely applicable in many
fields such as control, signal processing and system identification. The tightest convex …
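The shrinkage step at the core of such fixed-point methods soft-thresholds singular values, the matrix analogue of soft-thresholding vector entries; a minimal sketch (function name and toy matrix are illustrative, not from the paper):

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: soft-threshold the singular values of M."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Rank-2 matrix with singular values ~7.07 and ~0.14; tau = 0.5 zeroes the
# small one, so the shrunken matrix is exactly rank 1.
A = 5.0 * np.outer([1., 1.], [1., 0.]) + 0.1 * np.outer([1., -1.], [0., 1.])
print(np.linalg.matrix_rank(svt(A, 0.5)))  # 1
```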

Barzilai-Borwein step size for stochastic gradient descent

C Tan, S Ma, YH Dai, Y Qian - Advances in neural …, 2016 - proceedings.neurips.cc
One of the major issues in stochastic gradient descent (SGD) methods is how to choose an
appropriate step size while running the algorithm. Since the traditional line search technique …

Proximal Newton-type methods for minimizing composite functions

JD Lee, Y Sun, MA Saunders - SIAM Journal on Optimization, 2014 - SIAM
We generalize Newton-type methods for minimizing smooth functions to handle a sum of two
convex functions: a smooth function and a nonsmooth function with a simple proximal …

Templates for convex cone problems with applications to sparse signal recovery

SR Becker, EJ Candès, MC Grant - Mathematical programming …, 2011 - Springer
This paper develops a general framework for solving a variety of convex cone problems that
frequently arise in signal processing, machine learning, statistics, and other fields. The …

A highly efficient semismooth Newton augmented Lagrangian method for solving Lasso problems

X Li, D Sun, KC Toh - SIAM Journal on Optimization, 2018 - SIAM
We develop a fast and robust algorithm for solving large-scale convex composite
optimization models with an emphasis on the \ell_1-regularized least squares regression …

Parallel coordinate descent for l1-regularized loss minimization

JK Bradley, A Kyrola, D Bickson, C Guestrin - arXiv preprint arXiv …, 2011 - arxiv.org
We propose Shotgun, a parallel coordinate descent algorithm for minimizing L1-regularized
losses. Though coordinate descent seems inherently sequential, we prove convergence …
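A sequential version of the per-coordinate update that Shotgun parallelizes, for 0.5||Xw - y||^2 + lam*||w||_1 (a hedged sketch with my own variable names; real Shotgun updates many coordinates concurrently):

```python
import numpy as np

def soft(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cd_lasso(X, y, lam, iters=200):
    """Cyclic coordinate descent for 0.5||Xw - y||^2 + lam*||w||_1."""
    w = np.zeros(X.shape[1])
    col_sq = (X ** 2).sum(axis=0)
    r = y - X @ w                       # residual, maintained incrementally
    for _ in range(iters):
        for j in range(X.shape[1]):
            r += X[:, j] * w[j]         # remove coordinate j's contribution
            w[j] = soft(X[:, j] @ r, lam) / col_sq[j]
            r -= X[:, j] * w[j]         # add back the updated contribution
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[:2] = [2.0, -3.0]
w = cd_lasso(X, X @ w_true, lam=0.1)
print(np.round(w[:2], 2))               # close to [2, -3], rest near zero
```

Maintaining the residual incrementally keeps each coordinate update at O(n) cost instead of recomputing Xw from scratch.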

Forward–backward quasi-Newton methods for nonsmooth optimization problems

L Stella, A Themelis, P Patrinos - Computational Optimization and …, 2017 - Springer
The forward–backward splitting method (FBS) for minimizing a nonsmooth composite
function can be interpreted as a (variable-metric) gradient method over a continuously …
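Plain forward-backward splitting on \ell_1-regularized least squares, the baseline these quasi-Newton variants accelerate (a minimal sketch under my own naming, with the fixed step 1/L from the spectral norm):

```python
import numpy as np

def fbs_lasso(X, y, lam, iters=500):
    """Forward-backward splitting (ISTA) for 0.5||Xw - y||^2 + lam*||w||_1."""
    L = np.linalg.norm(X, 2) ** 2       # Lipschitz constant of the smooth gradient
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        z = w - X.T @ (X @ w - y) / L   # forward (gradient) step on the smooth part
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # backward (prox) step
    return w

rng = np.random.default_rng(2)
X = rng.standard_normal((40, 8))
w_true = np.zeros(8)
w_true[0] = 1.5
w = fbs_lasso(X, X @ w_true, lam=0.05)
print(np.round(w[0], 2))                # close to 1.5
```

The paper's contribution is to replace the fixed 1/L metric with a quasi-Newton variable metric while keeping the same forward-backward structure.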