Federated minimax optimization: Improved convergence analyses and algorithms
In this paper, we consider nonconvex minimax optimization, which is gaining prominence in
many modern machine learning applications, such as GANs. Large-scale edge-based …
Stochastic gradient descent-ascent: Unified theory and new efficient methods
Stochastic Gradient Descent-Ascent (SGDA) is one of the most prominent
algorithms for solving min-max optimization and variational inequality problems (VIPs) …
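For intuition, a minimal sketch of the deterministic special case of SGDA (gradient descent-ascent, not the paper's specific variant) on a strongly-convex-strongly-concave saddle function; SGDA replaces the exact partial gradients below with unbiased stochastic estimates:

```python
# Gradient descent-ascent on f(x, y) = x^2/2 - y^2/2 + x*y,
# whose unique saddle point is (0, 0). SGDA would substitute
# stochastic gradient estimates for the exact gradients below.

def gda(x, y, step=0.1, iters=500):
    for _ in range(iters):
        gx = x + y  # df/dx
        gy = x - y  # df/dy
        # simultaneous update: descent in x, ascent in y
        x, y = x - step * gx, y + step * gy
    return x, y

x, y = gda(1.0, 1.0)  # converges toward the saddle point (0, 0)
```

With a small constant step size the iterates contract toward the saddle point, since the update matrix has spectral radius below one for this problem.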
Variance reduction is an antidote to byzantines: Better rates, weaker assumptions and communication compression as a cherry on the top
Byzantine-robustness has been gaining a lot of attention due to the growth of the interest in
collaborative and federated learning. However, many fruitful directions, such as the usage of …
Communication compression for byzantine robust learning: New efficient algorithms and improved rates
Byzantine robustness is an essential feature of algorithms for certain distributed optimization
problems, typically encountered in collaborative/federated learning. These problems are …
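Communication compression in this line of work is typically built on an unbiased compression operator. A standard textbook example (an illustration, not the specific operator from this paper) is rand-k sparsification, which keeps k randomly chosen coordinates and rescales them by d/k so the compressed vector is unbiased:

```python
import random

def rand_k(x, k, rng=random):
    """Unbiased rand-k sparsification: keep k random coordinates of x,
    scaled by d/k so that E[rand_k(x)] = x coordinate-wise."""
    d = len(x)
    kept = set(rng.sample(range(d), k))
    return [(d / k) * v if i in kept else 0.0 for i, v in enumerate(x)]

x = [1.0, 2.0, 3.0, 4.0]
c = rand_k(x, 2)  # exactly 2 nonzeros, each equal to 2 * x[i]
```

Each worker would send only the k surviving coordinates (and their indices), cutting communication from d to k values per round at the cost of extra variance.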
Federated minimax optimization with client heterogeneity
Minimax optimization has seen a surge in interest with the advent of modern applications
such as GANs, and it is inherently more challenging than simple minimization. The difficulty …
Similarity, compression and local steps: three pillars of efficient communications for distributed variational inequalities
Variational inequalities are a broad and flexible class of problems that includes
minimization, saddle point, and fixed point problems as special cases. Therefore, variational …
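For intuition (a textbook sketch, not this paper's algorithm): a variational inequality asks for a point x* in a set C with ⟨F(x*), x − x*⟩ ≥ 0 for all x in C, and for a strongly monotone operator the basic projection method x ← proj_C(x − ηF(x)) converges to x*:

```python
def solve_vi(F, proj, x0, step=0.5, iters=100):
    """Projection method for a variational inequality:
    find x* in C with <F(x*), x - x*> >= 0 for all x in C."""
    x = x0
    for _ in range(iters):
        x = proj(x - step * F(x))
    return x

# Strongly monotone 1-D example: F(x) = x - 1 on C = [0, 2].
# The solution is x* = 1, the minimizer of (x - 1)^2 / 2 over C.
x_star = solve_vi(lambda x: x - 1.0,
                  lambda x: min(max(x, 0.0), 2.0),
                  x0=2.0)
```

Minimization (F = a gradient), saddle-point problems (F = the concatenated descent/ascent field), and fixed-point problems (F(x) = x − T(x)) all fit this template, which is why the VI formulation is so flexible.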
Compression and data similarity: Combination of two techniques for communication-efficient solving of distributed variational inequalities
Variational inequalities are an important tool that encompasses minimization, saddle-point,
game, and fixed-point problems. Modern large-scale and computationally expensive practical …
Fixed-time neurodynamic optimization approach with time-varying coefficients to variational inequality problems and applications
X Ju, X Yang, S Yuan, DWC Ho - Communications in Nonlinear Science …, 2025 - Elsevier
The article presents a novel fixed-time (FT) neurodynamic optimization approach featuring
time-varying coefficients, tailored for variational inequality problems (VIPs). This method …
Effective Method with Compression for Distributed and Federated Cocoercive Variational Inequalities
Variational inequalities, as an effective tool for solving applied problems including machine
learning tasks, have been attracting more and more attention from researchers in recent …
Near-Optimal Distributed Minimax Optimization under the Second-Order Similarity
This paper considers the distributed convex-concave minimax optimization under the
second-order similarity. We propose stochastic variance-reduced optimistic gradient sliding …