Computational methods for sparse solution of linear inverse problems
The goal of the sparse approximation problem is to approximate a target signal using a
linear combination of a few elementary signals drawn from a fixed collection. This paper …
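A minimal sketch of one of the greedy pursuit methods this survey covers, Orthogonal Matching Pursuit, assuming the fixed collection is stored as the columns of a matrix A; the names and the plain least-squares refit are illustrative, not the survey's notation.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily pick k columns of A and
    re-fit the coefficients by least squares after each selection."""
    r = y.copy()
    support = []
    for _ in range(k):
        # column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ r)))
        if j not in support:
            support.append(j)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x
```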
Minimization of \ell_{1-2} for Compressed Sensing
We study minimization of the difference of \ell_1 and \ell_2 norms as a nonconvex and
Lipschitz continuous metric for solving constrained and unconstrained compressed sensing …
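The paper handles the nonconvex \ell_1 - \ell_2 term with a difference-of-convex decomposition. A sketch of that outer loop, assuming an ISTA inner solver for the convex subproblems (a simplification of the paper's actual subproblem solvers):

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_minus_l2(A, b, lam, outer=10, inner=200):
    """min 0.5||Ax-b||^2 + lam*(||x||_1 - ||x||_2): linearize the
    concave -||x||_2 part at the current iterate, then solve the
    convex subproblem with proximal-gradient (ISTA) steps."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the data term
    for _ in range(outer):
        nx = np.linalg.norm(x)
        g = x / nx if nx > 0 else np.zeros_like(x)  # subgradient of ||x||_2
        for _ in range(inner):
            grad = A.T @ (A @ x - b) - lam * g
            x = soft(x - grad / L, lam / L)
    return x
```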
NESTA: A fast and accurate first-order method for sparse recovery
Accurate signal recovery or image reconstruction from indirect and possibly undersampled
data is a topic of considerable interest; for example, the literature in the recent field of …
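NESTA combines Nesterov's smoothing of the \ell_1 norm with his accelerated gradient method. A sketch in that spirit, assuming a penalized stand-in for the constrained recovery problem the paper actually solves (which also uses continuation in the smoothing parameter mu):

```python
import numpy as np

def smoothed_l1_grad(x, mu):
    """Gradient of the Nesterov-smoothed (Huber-like) l1 norm."""
    return np.clip(x / mu, -1.0, 1.0)

def nesta_like(A, b, lam, mu=1e-2, iters=500):
    """Accelerated gradient on lam*f_mu(x) + 0.5||Ax-b||^2, where
    f_mu is the smoothed l1 norm with smoothness parameter mu."""
    x = np.zeros(A.shape[1]); y = x.copy(); t = 1.0
    L = lam / mu + np.linalg.norm(A, 2) ** 2   # Lipschitz constant
    for _ in range(iters):
        grad = lam * smoothed_l1_grad(y, mu) + A.T @ (A @ y - b)
        x_new = y - grad / L
        t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```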
Fixed point and Bregman iterative methods for matrix rank minimization
S. Ma, D. Goldfarb, L. Chen. Mathematical Programming, 2011.
The linearly constrained matrix rank minimization problem is widely applicable in many
fields such as control, signal processing and system identification. The tightest convex …
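The tightest convex relaxation the abstract alludes to is the nuclear norm, whose proximal mapping is singular value thresholding. A fixed-point (proximal gradient) sketch for the matrix-completion instance, assuming a 0/1 observation mask; the paper adds approximate SVDs and Bregman iterations on top of this basic loop:

```python
import numpy as np

def svt(Y, tau):
    """Singular value thresholding: the prox of tau*||.||_* ."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def fixed_point_completion(M_obs, mask, mu=1.0, tau=1.0, iters=300):
    """Fixed-point iteration for min mu*||X||_* + 0.5||mask*(X-M)||_F^2:
    gradient step on the observed entries, then shrink singular values."""
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        G = mask * (X - M_obs)            # gradient of the data-fit term
        X = svt(X - tau * G, tau * mu)
    return X
```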
Barzilai-Borwein step size for stochastic gradient descent
One of the major issues in stochastic gradient descent (SGD) methods is how to choose an
appropriate step size while running the algorithm. Since the traditional line search technique …
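The paper's core idea is to set the SGD step size from the Barzilai-Borwein secant formula instead of a hand-tuned schedule. A sketch of the BB1 formula, assuming full gradients for clarity; SGD-BB itself recomputes the step once per epoch from averaged iterates and gradients:

```python
import numpy as np

def bb_step(x_prev, x_curr, g_prev, g_curr, fallback=1e-3):
    """BB1 step size eta = <s,s>/<s,y> with s = x_k - x_{k-1} and
    y = g_k - g_{k-1}: a secant (quasi-Newton) curvature estimate."""
    s, y = x_curr - x_prev, g_curr - g_prev
    denom = s @ y
    return (s @ s) / denom if abs(denom) > 1e-12 else fallback

# toy usage: gradient descent on 0.5||Ax-b||^2 with BB steps
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 10)), rng.standard_normal(50)
x_prev = np.zeros(10); g_prev = A.T @ (A @ x_prev - b)
x = x_prev - 1e-3 * g_prev
for _ in range(100):
    g = A.T @ (A @ x - b)
    eta = bb_step(x_prev, x, g_prev, g)
    x_prev, g_prev = x, g
    x = x - eta * g
```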
Proximal Newton-type methods for minimizing composite functions
We generalize Newton-type methods for minimizing smooth functions to handle a sum of two
convex functions: a smooth function and a nonsmooth function with a simple proximal …
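Each proximal-Newton iteration minimizes a local quadratic model of the smooth part plus the untouched nonsmooth part. A sketch of one step for the \ell_1 case, assuming the scaled subproblem is solved by coordinate descent and that the Hessian has positive diagonal; the paper pairs such steps with a line search and inexact subproblem solves:

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_step(x, grad, H, lam, passes=20):
    """One step for min f(x) + lam*||x||_1: coordinate descent on the
    model grad^T d + 0.5 d^T H d + lam*||x + d||_1."""
    d = np.zeros_like(x)
    for _ in range(passes):
        for i in range(len(x)):
            # partial derivative of the smooth model part at d_i = 0
            c = grad[i] + H[i] @ d - H[i, i] * d[i]
            d[i] = soft(x[i] - c / H[i, i], lam / H[i, i]) - x[i]
    return x + d   # in practice safeguarded by a line search
```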
Templates for convex cone problems with applications to sparse signal recovery
This paper develops a general framework for solving a variety of convex cone problems that
frequently arise in signal processing, machine learning, statistics, and other fields. The …
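The framework's practical face is a solver template into which users plug problem-specific gradient and proximal operators. A generic accelerated template in that plug-in spirit (a stand-in illustration, not the paper's conic formulation, which works through smoothed duals):

```python
import numpy as np

def fista(grad_f, prox_h, L, x0, iters=500):
    """Accelerated proximal gradient: the caller supplies grad_f,
    prox_h(v, t), and a Lipschitz constant L for grad_f."""
    x = x0.copy(); y = x0.copy(); t = 1.0
    for _ in range(iters):
        x_new = prox_h(y - grad_f(y) / L, 1.0 / L)
        t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# e.g. Lasso: fista(lambda v: A.T @ (A @ v - b),
#                   lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0),
#                   np.linalg.norm(A, 2) ** 2, np.zeros(n))
```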
A highly efficient semismooth Newton augmented Lagrangian method for solving Lasso problems
We develop a fast and robust algorithm for solving large-scale convex composite
optimization models with an emphasis on the \ell_1-regularized least squares regression …
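The semismooth Newton machinery works because the proximal mapping of the \ell_1 norm is piecewise linear, so elements of its generalized Jacobian are 0/1 diagonal matrices and the Newton systems inherit the sparsity of the active set. A sketch of those two building blocks (the full method wraps them in an augmented Lagrangian on the dual, not reproduced here):

```python
import numpy as np

def prox_l1(v, t):
    """Proximal mapping of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_l1_jacobian_diag(v, t):
    """Diagonal of one element of the generalized (Clarke) Jacobian
    of prox_l1 at v: 1 where the threshold is inactive, 0 elsewhere."""
    return (np.abs(v) > t).astype(float)
```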
Parallel coordinate descent for l1-regularized loss minimization
We propose Shotgun, a parallel coordinate descent algorithm for minimizing L1-regularized
losses. Though coordinate descent seems inherently sequential, we prove convergence …
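A sketch of the Shotgun update for the Lasso, assuming P coordinates are chosen uniformly at random each round; here they are applied one after another, whereas Shotgun applies them concurrently and the paper's analysis bounds how large P can be before the updates interfere:

```python
import numpy as np

def shotgun_lasso(A, b, lam, P=4, rounds=1000, seed=0):
    """L1-regularized least squares by randomized coordinate descent:
    each round soft-thresholds P randomly chosen coordinates."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)
    r = b - A @ x                      # running residual
    for _ in range(rounds):
        for j in rng.choice(n, size=P, replace=False):
            rho = A[:, j] @ r + col_sq[j] * x[j]
            x_j = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r += A[:, j] * (x[j] - x_j)
            x[j] = x_j
    return x
```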
Forward–backward quasi-Newton methods for nonsmooth optimization problems
The forward–backward splitting method (FBS) for minimizing a nonsmooth composite
function can be interpreted as a (variable-metric) gradient method over a continuously …
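The base iteration being accelerated is plain forward-backward splitting: a gradient (forward) step on the smooth term followed by a proximal (backward) step on the nonsmooth term. A minimal sketch of that base method; the paper's contribution, quasi-Newton directions on the forward-backward envelope, is not reproduced here:

```python
import numpy as np

def forward_backward(grad_f, prox_g, L, x0, iters=500):
    """Forward-backward splitting for min f(x) + g(x) with f smooth
    (L-Lipschitz gradient) and g prox-friendly."""
    x = x0.copy()
    for _ in range(iters):
        x = prox_g(x - grad_f(x) / L, 1.0 / L)  # forward, then backward
    return x
```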