Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
Implicit bias of gradient descent on linear convolutional networks
We show that gradient descent on full-width linear convolutional networks of depth $L$
converges to a linear predictor related to the $\ell_{2/L}$ bridge penalty in the frequency …
[BOOK][B] An invitation to compressive sensing
This first chapter formulates the objectives of compressive sensing. It introduces the
standard compressive problem studied throughout the book and reveals its ubiquity in many …
Accelerated methods for nonconvex optimization
We present an accelerated gradient method for nonconvex optimization problems with
Lipschitz continuous first and second derivatives. In time $O(\epsilon^{-7/4}\log(1/\epsilon))$, the method …
Robust and explainable autoencoders for unsupervised time series outlier detection
Time series data occurs widely, and outlier detection is a fundamental problem in data
mining with numerous applications. Existing autoencoder-based approaches deliver …
A survey on nonconvex regularization-based sparse and low-rank recovery in signal processing, statistics, and machine learning
In the past decade, sparse and low-rank recovery has drawn much attention in many areas
such as signal/image processing, statistics, bioinformatics, and machine learning. To …
Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed $\ell_q$ Minimization
In this paper, we first study $\ell_q$ minimization and its associated iteratively reweighted
algorithm for recovering sparse vectors. Unlike most existing work, we focus on …
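The snippet above names iteratively reweighted least squares (IRLS) for $\ell_q$ sparse recovery. A minimal sketch of the generic scheme follows; this is not the paper's improved variant, and the function name `irls_lq`, the smoothing schedule for `eps`, and all parameter defaults are illustrative assumptions:

```python
import numpy as np

def irls_lq(A, b, q=0.5, eps=1.0, iters=30):
    """Generic IRLS sketch for min ||x||_q^q subject to Ax = b.

    Iterates the closed-form weighted least-squares solution
    x = W^{-1} A^T (A W^{-1} A^T)^{-1} b with smoothed weights
    w_i = (x_i^2 + eps)^{q/2 - 1}, shrinking eps each round.
    """
    # Least-norm starting point.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        # Inverse weights 1/w_i = (x_i^2 + eps)^{1 - q/2}; eps keeps them finite.
        winv = (x**2 + eps) ** (1.0 - q / 2.0)
        AW = A * winv                       # A @ diag(winv), row-wise broadcast
        # Weighted least-squares update; Ax = b holds by construction.
        x = winv * (A.T @ np.linalg.solve(AW @ A.T, b))
        eps = max(eps / 10.0, 1e-8)         # anneal smoothing, floor for stability
    return x
```

Each iterate satisfies the constraint exactly (since $A W^{-1} A^\top (A W^{-1} A^\top)^{-1} b = b$), while the annealed weights push mass onto a sparse support.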
Smoothing methods for nonsmooth, nonconvex minimization
X Chen - Mathematical programming, 2012 - Springer
We consider a class of smoothing methods for minimization problems where the feasible set
is convex but the objective function is not convex, not differentiable and perhaps not even …
Recent advances in mathematical programming with semi-continuous variables and cardinality constraint
X Sun, X Zheng, D Li - Journal of the Operations Research Society of …, 2013 - Springer
Mathematical programming problems with semi-continuous variables and cardinality
constraint have many applications, including production planning, portfolio selection …
Inference for high-dimensional sparse econometric models
We consider linear, high-dimensional sparse (HDS) regression models in econometrics. The
HDS regression model allows for a large number of regressors, p, which is possibly much …