From error bounds to the complexity of first-order descent methods for convex functions

J Bolte, TP Nguyen, J Peypouquet, BW Suter - Mathematical Programming, 2017 - Springer
This paper shows that error bounds can be used as effective tools for deriving complexity
results for first-order descent methods in convex minimization. In a first stage, this objective …
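The idea behind this entry, that an error bound yields a complexity estimate for a descent method, can be illustrated on the simplest possible toy case (not the paper's construction): a positive-definite quadratic satisfies a quadratic error bound, and plain 1/L-step gradient descent then converges linearly.

```python
import numpy as np

# Toy illustration (not the paper's setting): a positive-definite quadratic
# f(x) = 0.5 * x^T A x satisfies a quadratic error bound
#   f(x) - f* >= (mu/2) * dist(x, argmin f)^2,
# and gradient descent with step 1/L then converges linearly.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)            # positive definite, so mu >= 1
L = np.linalg.eigvalsh(A).max()    # Lipschitz constant of the gradient

x = rng.standard_normal(5)
values = []
for _ in range(300):
    values.append(0.5 * x @ A @ x)  # f* = 0 here
    x = x - (A @ x) / L             # gradient step with step size 1/L
```

The gap `values[k]` shrinks geometrically, which is the qualitative behavior the error-bound machinery of the paper quantifies in far greater generality.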

A single-loop smoothed gradient descent-ascent algorithm for nonconvex-concave min-max problems

J Zhang, P Xiao, R Sun, Z Luo - Advances in neural …, 2020 - proceedings.neurips.cc
The nonconvex-concave min-max problem arises in many machine learning applications,
including minimizing a pointwise maximum of a set of nonconvex functions and robust …
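A minimal sketch of the single-loop structure (on a toy strongly-convex–strongly-concave saddle problem, not the paper's nonconvex-concave setting; all step sizes and the smoothing weight `p` are illustrative choices): the x-update descends a smoothed surrogate K(x, y; z) = f(x, y) + (p/2)(x − z)², the y-update ascends f, and the auxiliary point z tracks x by exponential averaging.

```python
# Sketch of single-loop smoothed gradient descent-ascent on the toy problem
#   f(x, y) = 0.5*x**2 + x*y - 0.5*y**2   (saddle point at the origin).
# The x-step descends K(x, y; z) = f(x, y) + (p/2)*(x - z)**2; the y-step
# ascends f; z is an exponential average of the x-iterates. Constants are
# illustrative, not the paper's recommended values.
c, a, beta, p = 0.05, 0.05, 0.1, 1.0
x, y, z = 1.0, 1.0, 1.0
for _ in range(5000):
    x = x - c * ((x + y) + p * (x - z))   # grad_x K = grad_x f + p*(x - z)
    y = y + a * (x - y)                   # grad_y f(x, y) = x - y
    z = z + beta * (x - z)
```

The point of the smoothing term is that the x-subproblem stays well-conditioned even when f is only concave in y; on this toy instance the iterates settle at the saddle point.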

Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3

H Attouch, Z Chbani, H Riahi - ESAIM: Control, Optimisation and …, 2019 - esaim-cocv.org
In a Hilbert space setting ℋ, given Φ: ℋ→ ℝ a convex continuously differentiable function,
and α a positive parameter, we consider the inertial dynamic system with Asymptotic …
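A hedged sketch of the discrete counterpart of this inertial dynamic, x''(t) + (α/t)x'(t) + ∇Φ(x(t)) = 0: accelerated gradient with extrapolation factor k/(k + α). Here α = 2 lies in the subcritical regime α ≤ 3 studied in the paper; the quadratic Φ below is an illustrative choice, not taken from the paper.

```python
import numpy as np

# Discrete analogue of  x'' + (alpha/t) x' + grad Phi(x) = 0:
# inertial extrapolation with factor k/(k + alpha), then a gradient step.
A = np.diag([1.0, 10.0])     # Phi(x) = 0.5 * x^T A x, minimum value 0
s = 0.1                      # step size 1/L with L = 10
alpha = 2.0                  # subcritical: alpha <= 3

x_prev = x = np.array([5.0, 3.0])
for k in range(1, 3001):
    w = x + (k / (k + alpha)) * (x - x_prev)   # inertial extrapolation
    x_prev, x = x, w - s * (A @ w)             # gradient step at w
phi_final = 0.5 * x @ A @ x
```

The paper's contribution is the convergence rate of Φ(x(t)) − min Φ in this α ≤ 3 regime, where the classical O(1/t²) analysis (which needs α > 3) no longer applies.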

Are we there yet? Manifold identification of gradient-related proximal methods

Y Sun, H Jeong, J Nutini… - The 22nd International …, 2019 - proceedings.mlr.press
In machine learning, models that generalize better often generate outputs that lie on a low-
dimensional manifold. Recently, several works have separately shown finite-time manifold …
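A toy illustration of the manifold-identification phenomenon (assumptions: a small synthetic lasso problem solved by plain proximal gradient / ISTA, which is only one of the gradient-related proximal methods the paper covers): after finitely many iterations the zero pattern of the iterates, i.e. the active manifold, stops changing, long before the iterates themselves converge.

```python
import numpy as np

# ISTA on a synthetic lasso problem; we record the support (nonzero pattern)
# of each iterate and observe that it freezes after finitely many steps.
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 10))
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.01 * rng.standard_normal(40)

lam = 4.0
L = np.linalg.norm(X, 2) ** 2        # Lipschitz constant of the smooth part
b = np.zeros(10)
supports = []
for _ in range(500):
    b = soft_threshold(b - X.T @ (X @ b - y) / L, lam / L)
    supports.append(tuple(np.flatnonzero(b)))
```

The stabilized support is the low-dimensional manifold referred to in the snippet; once identified, the method effectively runs on a much smaller problem.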

Gap safe screening rules for sparsity enforcing penalties

E Ndiaye, O Fercoq, J Salmon - Journal of Machine Learning Research, 2017 - jmlr.org
In high dimensional regression settings, sparsity enforcing penalties have proved useful to
regularize the data-fitting term. A recently introduced technique called screening rules …

[BOOK][B] Sparse image and signal processing: Wavelets and related geometric multiscale analysis

JL Starck, F Murtagh, J Fadili - 2015 - books.google.com
This thoroughly updated new edition presents state-of-the-art sparse and multiscale image
and signal processing. It covers linear multiscale geometric transforms, such as wavelet …
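A toy illustration of the sparsity idea at the heart of this entry, using a single-level Haar transform (the simplest member of the wavelet families the book covers): a piecewise-constant signal is compressed into a handful of nonzero detail coefficients.

```python
import numpy as np

# One level of the orthonormal Haar transform: pairwise averages (approx)
# and pairwise differences (detail), both scaled by 1/sqrt(2).
def haar_level(s):
    even, odd = s[0::2], s[1::2]
    return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

signal = np.concatenate([np.full(31, 1.0), np.full(33, 4.0)])  # one jump
approx, detail = haar_level(signal)
```

All detail coefficients vanish except the single one straddling the jump, and the transform is orthonormal, so the signal's energy is preserved exactly; this is the mechanism that makes wavelet coefficients of piecewise-smooth signals sparse.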

Mind the duality gap: safer rules for the lasso

O Fercoq, A Gramfort, J Salmon - … conference on machine …, 2015 - proceedings.mlr.press
Screening rules allow irrelevant variables to be discarded early from the optimization in Lasso
problems, or their derivatives, making solvers faster. In this paper, we propose new versions of …

Convergence rates of inertial forward-backward algorithms

H Attouch, A Cabot - SIAM Journal on Optimization, 2018 - SIAM
In a Hilbert space ℋ, assuming (α_k) to be a general sequence of nonnegative
numbers, we analyze the convergence properties of the inertial forward-backward algorithm …
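A hedged sketch of one inertial forward-backward scheme of the kind this entry analyzes (the familiar (k−1)/(k+2) extrapolation below is one illustrative choice of the sequence (α_k); the paper treats general sequences): a forward gradient step on the smooth part f(b) = 0.5·||y − Xb||², then a backward (proximal) step on g(b) = λ||b||₁.

```python
import numpy as np

# Inertial forward-backward (FISTA-style) iteration on a small lasso problem:
# extrapolate with alpha_k, forward step on f, backward prox step on g.
def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(3)
X = rng.standard_normal((30, 12))
y = X @ rng.standard_normal(12)
lam = 1.0
L = np.linalg.norm(X, 2) ** 2     # Lipschitz constant of grad f

def F(b):                          # full objective f + g
    r = y - X @ b
    return 0.5 * r @ r + lam * np.abs(b).sum()

b_prev = b = np.zeros(12)
vals = []
for k in range(1, 2001):
    w = b + ((k - 1) / (k + 2)) * (b - b_prev)        # inertial extrapolation
    b_prev, b = b, prox_l1(w - X.T @ (X @ w - y) / L, lam / L)
    vals.append(F(b))
```

The choice of (α_k) governs the convergence rate of F(b_k) − min F, which is exactly the question the paper settles for general nonnegative sequences.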

Convergence of a relaxed inertial forward–backward algorithm for structured monotone inclusions

H Attouch, A Cabot - Applied Mathematics & Optimization, 2019 - Springer
In a Hilbert space H, we study the convergence properties of a class of relaxed inertial
forward–backward algorithms. They aim to solve structured monotone inclusions of the form …

Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity

H Attouch, A Cabot - Journal of Differential Equations, 2017 - Elsevier
In a Hilbert space H, we study the asymptotic behavior, as the time variable t goes to +∞, of
nonautonomous gradient-like inertial dynamics, with a time-dependent viscosity coefficient …
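A hedged numeric sketch of the dynamics this entry studies: a semi-implicit Euler discretization of x''(t) + γ(t)x'(t) + ∇f(x(t)) = 0 with the vanishing viscosity γ(t) = α/t, on the illustrative choice f(x) = 0.5·||x||² (both f and the discretization are assumptions of this sketch, not the paper's).

```python
import numpy as np

# Semi-implicit Euler for  x'' + (alpha/t) x' + grad f(x) = 0,  f(x) = 0.5*||x||^2:
# update the velocity first, then the position with the new velocity.
alpha, h = 3.0, 0.05
x = np.array([4.0, -2.0])
v = np.zeros(2)
t = 1.0
for _ in range(40_000):
    v += h * (-(alpha / t) * v - x)   # grad f(x) = x
    x += h * v
    t += h
f_final = 0.5 * x @ x
```

Even though the viscosity coefficient α/t vanishes at infinity, the trajectory still stabilizes at the minimizer; quantifying when and how fast this happens for general time-dependent viscosities is the subject of the paper.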