A forward-backward splitting method for monotone inclusions without cocoercivity
Y Malitsky, MK Tam - SIAM Journal on Optimization, 2020 - SIAM
In this work, we propose a simple modification of the forward-backward splitting method for
finding a zero in the sum of two monotone operators. Our method converges under the same …
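Malitsky and Tam's forward-reflected-backward scheme replaces the cocoercivity requirement on the forward operator A with plain Lipschitz continuity, iterating roughly x_{k+1} = J_{λB}(x_k − λ(2Ax_k − Ax_{k−1})) with step size λ < 1/(2L). The sketch below is an illustrative toy instance, not the authors' code: the skew-symmetric matrix, step size, and l1 term are assumptions chosen so that A is monotone and Lipschitz but not cocoercive (exactly the setting where classical forward-backward fails).

```python
import math

def soft_threshold(v, t):
    # Resolvent of t * subdifferential of the l1 norm (soft-thresholding).
    return [math.copysign(max(abs(vi) - t, 0.0), vi) for vi in v]

# A(x) = M x with M skew-symmetric: monotone and Lipschitz, but NOT cocoercive.
M = [[0.0, 2.0], [-2.0, 0.0]]
A = lambda x: [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)]
L = 2.0              # operator norm of M = Lipschitz constant of A
lam = 0.9 / (2 * L)  # step size strictly below 1/(2L), per the FRB analysis
gamma = 1.0          # B = subdifferential of gamma * ||.||_1; here 0 solves 0 in Ax + Bx

x_prev, x = [1.0, -1.0], [0.5, 2.0]
for _ in range(2000):
    Ax, Axp = A(x), A(x_prev)
    # Forward-reflected-backward step: one resolvent, one new forward evaluation.
    y = [x[i] - lam * (2 * Ax[i] - Axp[i]) for i in range(2)]
    x_prev, x = x, soft_threshold(y, lam * gamma)

residual = math.dist(x, x_prev)
print(residual)  # fixed-point residual; should approach 0
```

Note the method reuses A(x_{k-1}) instead of a second forward evaluation per iteration, which is what distinguishes it from extragradient-type schemes at the same Lipschitz-only assumption.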
Accelerated Algorithms for Smooth Convex-Concave Minimax Problems with O(1/k^2) Rate on Squared Gradient Norm
In this work, we study the computational complexity of reducing the squared gradient
magnitude for smooth minimax optimization problems. First, we present algorithms with …
Convergence of sequences: A survey
B Franci, S Grammatico - Annual Reviews in Control, 2022 - Elsevier
Convergent sequences of real numbers play a fundamental role in many different problems
in system theory, e.g., in Lyapunov stability analysis, as well as in optimization theory and …
Operator splitting performance estimation: Tight contraction factors and optimal parameter selection
We propose a methodology for studying the performance of common splitting methods
through semidefinite programming. We prove tightness of the methodology and demonstrate …
Solving nonconvex-nonconcave min-max problems exhibiting weak minty solutions
A Böhm - …
… in the continuous version of Nesterov's accelerated gradient method provides, by temporal discretization, fast proximal gradient algorithms …
Accelerated minimax algorithms flock together
Several new accelerated methods in minimax optimization and fixed-point iterations have
recently been discovered, and, interestingly, they rely on a mechanism distinct from …
Two Steps at a Time---Taking GAN Training in Stride with Tseng's Method
Motivated by the training of generative adversarial networks (GANs), we study methods for
solving minimax problems with additional nonsmooth regularizers. We do so by employing …
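Tseng's forward-backward-forward splitting, which this entry builds on, takes a forward-backward step y = J_{λB}(z − λAz) followed by a forward correction z⁺ = y + λ(Az − Ay), and converges whenever A is monotone and L-Lipschitz with λ < 1/L. A minimal sketch under assumed toy data: the bilinear saddle operator (skew, hence monotone) and the l1 regularizer handled by its prox are illustrative choices, not the paper's GAN setup.

```python
import math

def prox_l1(v, t):
    # Proximal map of t * ||.||_1 (soft-thresholding).
    return [math.copysign(max(abs(vi) - t, 0.0), vi) for vi in v]

# Saddle operator of the bilinear min-max problem min_u max_v c*u*v:
# A(u, v) = (c*v, -c*u) is skew, hence monotone and c-Lipschitz.
c = 3.0
A = lambda z: [c * z[1], -c * z[0]]
L = c          # Lipschitz constant of A
lam = 0.9 / L  # Tseng's method allows any step strictly below 1/L
gamma = 0.5    # weight of the nonsmooth l1 regularizer

z = [2.0, -1.5]
for _ in range(2000):
    Az = A(z)
    # Forward-backward step (prox handles the nonsmooth term)...
    y = prox_l1([z[i] - lam * Az[i] for i in range(2)], lam * gamma)
    Ay = A(y)
    # ...then the extra forward correction that removes the cocoercivity need.
    z = [y[i] + lam * (Az[i] - Ay[i]) for i in range(2)]

print(max(abs(zi) for zi in z))  # distance to the saddle point at the origin
```

The price relative to forward-reflected-backward above is a second forward evaluation A(y) per iteration, in exchange for the slightly larger step-size range λ < 1/L.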