Federated minimax optimization: Improved convergence analyses and algorithms

P Sharma, R Panda, G Joshi… - … on Machine Learning, 2022 - proceedings.mlr.press
In this paper, we consider nonconvex minimax optimization, which is gaining prominence in
many modern machine learning applications, such as GANs. Large-scale edge-based …
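
For context, the standard federated minimax template in this line of work (a common formulation given here for orientation, not necessarily the paper's exact statement) has M clients jointly solving

    \min_x \max_y f(x, y) := \frac{1}{M} \sum_{i=1}^{M} f_i(x, y),

where f_i is the local objective held by client i and f is typically nonconvex in x (e.g., nonconvex-concave or nonconvex-PL in y).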

Stochastic gradient descent-ascent: Unified theory and new efficient methods

A Beznosikov, E Gorbunov… - International …, 2023 - proceedings.mlr.press
Stochastic Gradient Descent-Ascent (SGDA) is one of the most prominent
algorithms for solving min-max optimization and variational inequality problems (VIPs) …
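
As a quick illustration of the basic scheme, here is a minimal SGDA sketch in Python on a strongly-convex-strongly-concave toy saddle; the problem, noise model, and step size are illustrative assumptions, not the paper's setting:

    import numpy as np

    # SGDA on the toy saddle f(x, y) = x**2/2 + x*y - y**2/2.
    rng = np.random.default_rng(0)
    x, y, lr = 1.0, 1.0, 0.05
    for _ in range(2000):
        gx = x + y + 0.01 * rng.standard_normal()  # noisy grad_x f
        gy = x - y + 0.01 * rng.standard_normal()  # noisy grad_y f
        x, y = x - lr * gx, y + lr * gy            # descend in x, ascend in y
    # (x, y) ends near the saddle point (0, 0).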

Variance reduction is an antidote to byzantines: Better rates, weaker assumptions and communication compression as a cherry on the top

E Gorbunov, S Horváth, P Richtárik, G Gidel - arXiv preprint arXiv …, 2022 - arxiv.org
Byzantine-robustness has been gaining attention due to growing interest in
collaborative and federated learning. However, many fruitful directions, such as the use of …
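
For intuition, a generic Byzantine-robust aggregation step replaces naive averaging with a robust statistic such as the coordinate-wise median. The Python sketch below is a standard baseline, not this paper's method, which additionally uses variance-reduced gradient estimators:

    import numpy as np

    def robust_aggregate(grads):                  # grads: list of 1-D arrays
        # Coordinate-wise median tolerates a minority of arbitrary vectors.
        return np.median(np.stack(grads), axis=0)

    honest = [np.array([1.0, 2.0]) + 0.1 * np.random.randn(2) for _ in range(8)]
    byzantine = [np.array([1e6, -1e6])] * 2       # adversarial workers
    print(robust_aggregate(honest + byzantine))   # stays near [1, 2]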

Communication compression for byzantine robust learning: New efficient algorithms and improved rates

A Rammal, K Gruntkowska, N Fedin… - International …, 2024 - proceedings.mlr.press
Byzantine robustness is an essential feature of algorithms for certain distributed optimization
problems, typically encountered in collaborative/federated learning. These problems are …
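
A typical compression operator in this literature is Top-k sparsification; the Python sketch below is illustrative (the paper may analyze a different compressor):

    import numpy as np

    def top_k(v, k):
        # Keep the k largest-magnitude coordinates; zero out the rest.
        out = np.zeros_like(v)
        idx = np.argsort(np.abs(v))[-k:]
        out[idx] = v[idx]
        return out

    g = np.array([0.1, -3.0, 0.5, 2.0, -0.2])
    print(top_k(g, 2))   # -> [ 0. -3.  0.  2.  0.]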

Federated minimax optimization with client heterogeneity

P Sharma, R Panda, G Joshi - arXiv preprint arXiv:2302.04249, 2023 - arxiv.org
Minimax optimization has seen a surge in interest with the advent of modern applications
such as GANs, and it is inherently more challenging than simple minimization. The difficulty …
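
The generic Local SGDA template studied in this setting has each client run several descent-ascent steps before the server averages the iterates. The sketch below uses hypothetical quadratic client objectives f_i(x, y) = (x - b_i)**2/2 + x*y - y**2/2 purely for illustration:

    import numpy as np

    def local_sgda(b, rounds=200, local_steps=5, lr=0.05):
        x, y = 1.0, 1.0
        for _ in range(rounds):
            xs, ys = [], []
            for b_i in b:                     # each client starts at the server point
                xi, yi = x, y
                for _ in range(local_steps):
                    xi, yi = xi - lr * (xi - b_i + yi), yi + lr * (xi - yi)
                xs.append(xi); ys.append(yi)
            x, y = np.mean(xs), np.mean(ys)   # server averaging
        return x, y

    # Heterogeneous clients (b_i = 0, 1, 2); the iterates approach the global
    # saddle point (0.5, 0.5). In general, heterogeneity makes local iterates
    # drift apart, which is exactly what such analyses must control.
    print(local_sgda(b=[0.0, 1.0, 2.0]))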

Similarity, compression and local steps: three pillars of efficient communications for distributed variational inequalities

A Beznosikov, M Takáč… - Advances in Neural …, 2024 - proceedings.neurips.cc
Variational inequalities are a broad and flexible class of problems that includes
minimization, saddle point, and fixed point problems as special cases. Therefore, variational …
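
For reference, the (Stampacchia) variational inequality problem is to find z^* \in \mathcal{Z} with

    \langle F(z^*), z - z^* \rangle \ge 0 \quad \forall z \in \mathcal{Z}.

Taking F = \nabla f recovers minimization of f over \mathcal{Z}; for a saddle problem \min_x \max_y g(x, y), taking z = (x, y) and F(z) = (\nabla_x g(x, y), -\nabla_y g(x, y)) recovers the min-max case.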

Compression and data similarity: Combination of two techniques for communication-efficient solving of distributed variational inequalities

A Beznosikov, A Gasnikov - International Conference on Optimization and …, 2022 - Springer
Variational inequalities are an important tool that encompasses minimization, saddle-point,
game, and fixed-point problems. Modern large-scale and computationally expensive practical …
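
The "data similarity" exploited in this line of work is usually second-order (Hessian) similarity: for some \delta \ge 0,

    \|\nabla^2 f_i(x) - \nabla^2 f(x)\| \le \delta \quad \forall x,

so that communication can scale with \delta rather than with the global smoothness constant. This is the standard definition in the area; the paper's exact assumption may differ in detail.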

Fixed-time neurodynamic optimization approach with time-varying coefficients to variational inequality problems and applications

X Ju, X Yang, S Yuan, DWC Ho - Communications in Nonlinear Science …, 2025 - Elsevier
The article presents a novel fixed-time (FT) neurodynamic optimization approach featuring
time-varying coefficients, tailored for variational inequality problems (VIPs). This method …
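
For orientation, a classical projection neurodynamic model for a VIP over a closed convex set \Omega is the ODE

    \dot{z}(t) = \lambda(t) \big( P_\Omega(z(t) - F(z(t))) - z(t) \big),

whose equilibria are exactly the VIP solutions. Fixed-time designs choose the time-varying gain \lambda(t) (and nonlinear activations) so that convergence occurs within a time bound independent of the initial condition. This generic model is given for context only and is not the paper's specific system.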

Effective Method with Compression for Distributed and Federated Cocoercive Variational Inequalities

D Medyakov, G Molodtsov, A Beznosikov - arXiv preprint arXiv …, 2024 - arxiv.org
Variational inequalities, as an effective tool for solving applied problems including machine
learning tasks, have been attracting increasing attention from researchers in recent …
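
For reference, an operator F is \beta-cocoercive if

    \langle F(u) - F(v), u - v \rangle \ge \beta \|F(u) - F(v)\|^2 \quad \forall u, v,

which implies monotonicity and (1/\beta)-Lipschitz continuity. Under this assumption the simple projected forward step z_{k+1} = P_{\mathcal{Z}}(z_k - \gamma F(z_k)) converges for \gamma \in (0, 2\beta), which is what makes cocoercive VIs amenable to cheap, compression-friendly iterations.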

Near-Optimal Distributed Minimax Optimization under the Second-Order Similarity

Q Zhou, H Ye, L Luo - arXiv preprint arXiv:2405.16126, 2024 - arxiv.org
This paper considers distributed convex-concave minimax optimization under
second-order similarity. We propose a stochastic variance-reduced optimistic gradient sliding …
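
One ingredient of the method named here is the optimistic gradient step, which, unlike plain descent-ascent, converges on bilinear saddles. A minimal Python sketch (the toy problem and step size are illustrative assumptions):

    import numpy as np

    def F(z):                # operator (grad_x f, -grad_y f) for f(x, y) = x * y
        x, y = z
        return np.array([y, -x])

    z, z_prev, lr = np.array([1.0, 1.0]), np.array([1.0, 1.0]), 0.1
    for _ in range(500):
        # Optimistic step: extrapolate using the previous operator evaluation.
        z, z_prev = z - lr * (2 * F(z) - F(z_prev)), z
    print(z)                 # approaches the saddle point (0, 0)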