A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications

HH Bauschke, J Bolte… - Mathematics of Operations …, 2017 - pubsonline.informs.org
The proximal gradient method and its variants are among the most attractive first-order algorithms for
minimizing the sum of two convex functions, one of which is nonsmooth. However, the method requires …
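For context, a minimal sketch of the classical proximal gradient step that this entry refers to, written in NumPy for an l1-regularized least-squares instance; the 1/L step size below assumes the usual global Lipschitz-gradient constant that the cited paper replaces with a Bregman-based descent condition.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, n_iter=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with the proximal gradient method.
    L = np.linalg.norm(A, 2) ** 2                    # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                     # gradient step on the smooth term
        x = soft_threshold(x - grad / L, lam / L)    # proximal step on the nonsmooth term
    return x
```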

Acceleration methods

A d'Aspremont, D Scieur, A Taylor - Foundations and Trends® …, 2021 - nowpublishers.com
This monograph covers some recent advances in a range of acceleration techniques
frequently used in convex optimization. We first use quadratic optimization problems to …

Adaptation, learning, and optimization over networks

AH Sayed - Foundations and Trends® in Machine Learning, 2014 - nowpublishers.com
This work deals with the topic of information processing over graphs. The presentation is
largely self-contained and covers results that relate to the analysis and design of multi-agent …

Adaptive restart for accelerated gradient schemes

B O'Donoghue, E Candès - Foundations of Computational Mathematics, 2015 - Springer
In this paper we introduce a simple heuristic adaptive restart technique that can dramatically
improve the convergence rate of accelerated gradient schemes. The analysis of the …
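A minimal sketch of the function-value restart heuristic discussed in this paper, applied to Nesterov's accelerated gradient method on a smooth convex objective; the callable interface, the fixed step size, and the choice to restart whenever the objective increases are illustrative simplifications.

```python
import numpy as np

def accelerated_gradient_with_restart(f, grad, x0, step, n_iter=1000):
    # Nesterov's accelerated gradient method with a function-value adaptive
    # restart: reset the momentum whenever the objective value increases.
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    f_prev = f(x0)
    for _ in range(n_iter):
        x = y - step * grad(y)                           # gradient step at the extrapolated point
        f_curr = f(x)
        if f_curr > f_prev:                              # objective went up: restart, kill momentum
            y = x_prev.copy()
            t = 1.0
            continue
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)      # momentum / extrapolation step
        x_prev, t, f_prev = x, t_next, f_curr
    return x_prev
```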

Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization

S Ghadimi, G Lan, H Zhang - Mathematical Programming, 2016 - Springer
This paper considers a class of constrained stochastic composite optimization problems
whose objective function is given by the summation of a differentiable (possibly nonconvex) …
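A minimal sketch of a mini-batch proximal stochastic gradient step of the kind analyzed in this line of work, here on an l1-regularized finite-sum least-squares problem; the batch size, constant step size, and uniform sampling are illustrative assumptions rather than the parameter choices of the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def minibatch_prox_sgd(A, b, lam, batch_size=32, step=1e-2, n_iter=2000, seed=0):
    # Minimize (1/n) * sum_i 0.5*(a_i^T x - b_i)^2 + lam*||x||_1 using
    # mini-batch stochastic gradients followed by a proximal step.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_iter):
        idx = rng.choice(n, size=batch_size, replace=False)    # sample a mini-batch
        g = A[idx].T @ (A[idx] @ x - b[idx]) / batch_size       # mini-batch gradient estimate
        x = soft_threshold(x - step * g, step * lam)            # proximal (composite) step
    return x
```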

Phase retrieval from coded diffraction patterns

EJ Candès, X Li, M Soltanolkotabi - Applied and Computational Harmonic …, 2015 - Elsevier
This paper considers the question of recovering the phase of an object from intensity-only
measurements, a problem which naturally appears in X-ray crystallography and related …

First order methods beyond convexity and Lipschitz gradient continuity with applications to quadratic inverse problems

J Bolte, S Sabach, M Teboulle, Y Vaisbourd - SIAM Journal on Optimization, 2018 - SIAM
We focus on nonconvex and nonsmooth minimization problems with a composite objective,
where the differentiable part of the objective is freed from the usual and restrictive global …
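For reference, the Bregman proximal-gradient (NoLips-type) update used in this line of work, written for a composite objective f + g with g differentiable and h a kernel generating the Bregman distance; the notation is generic rather than copied from the paper.

```latex
% Bregman distance generated by the kernel h:
\[
  D_h(x, y) \;=\; h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle .
\]
% Bregman proximal gradient step for minimizing f(x) + g(x):
\[
  x^{k+1} \;\in\; \operatorname*{argmin}_{x}
  \Bigl\{ f(x) + \langle \nabla g(x^k),\, x - x^k \rangle
          + \tfrac{1}{\lambda}\, D_h(x, x^k) \Bigr\}.
\]
% Taking h(x) = \tfrac{1}{2}\|x\|^2 recovers the classical proximal gradient step.
```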

Dual averaging method for regularized stochastic learning and online optimization

L Xiao - Advances in Neural Information Processing …, 2009 - proceedings.neurips.cc
We consider regularized stochastic learning and online optimization problems, where the
objective function is the sum of two convex terms: one is the loss function of the learning …
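A minimal sketch of the regularized dual averaging (RDA) update for l1-regularized online learning in the spirit of this paper; the logistic loss, the gamma parameter, and the simple pass over the data are illustrative assumptions.

```python
import numpy as np

def l1_rda_logistic(X, y, lam=0.01, gamma=1.0):
    # Regularized dual averaging with l1 regularization on a stream of
    # (x_t, y_t) pairs with logistic loss: keep a running average of
    # gradients and solve the regularized averaged model in closed form.
    n, d = X.shape
    w = np.zeros(d)
    g_avg = np.zeros(d)
    for t in range(1, n + 1):
        x_t, y_t = X[t - 1], y[t - 1]                 # labels assumed in {-1, +1}
        g = -y_t * x_t / (1.0 + np.exp(y_t * (x_t @ w)))   # logistic-loss gradient at w
        g_avg += (g - g_avg) / t                      # running average of gradients
        # Closed-form l1-RDA step: averaged-gradient entries below lam give zero weights.
        w = -(np.sqrt(t) / gamma) * np.sign(g_avg) * np.maximum(np.abs(g_avg) - lam, 0.0)
    return w
```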

Efficiency of minimizing compositions of convex functions and smooth maps

D Drusvyatskiy, C Paquette - Mathematical Programming, 2019 - Springer
We consider global efficiency of algorithms for minimizing a sum of a convex function and a
composition of a Lipschitz convex function with a smooth map. The basic algorithm we rely …
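For reference, the prox-linear step underlying this line of analysis, written for a composite objective g(x) + h(c(x)) with h convex and Lipschitz and c smooth; the notation is generic and the step parameter t is assumed to scale with the relevant Lipschitz constants.

```latex
% Prox-linear step for minimizing g(x) + h(c(x)):
% linearize the smooth map c, keep g and h intact, and add a proximal term.
\[
  x^{k+1} \;=\; \operatorname*{argmin}_{x}
  \Bigl\{ g(x) + h\bigl(c(x^k) + \nabla c(x^k)(x - x^k)\bigr)
          + \tfrac{t}{2}\,\|x - x^k\|^2 \Bigr\}.
\]
```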

[BOOK][B] Variational methods in imaging

O Scherzer, M Grasmair, H Grossauer, M Haltmeier… - 2009 - Springer
Imaging is an interdisciplinary research area with profound applications in many areas of
science, engineering, technology, and medicine. The most primitive form of imaging is visual …