Stochastic distributed optimization under average second-order similarity: Algorithms and analysis
We study finite-sum distributed optimization problems involving a master node and $n-1$
local nodes under the popular $\delta$-similarity and $\mu$-strong convexity conditions …
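For context, the average ("second-order") $\delta$-similarity condition named in this title is commonly stated as follows (a standard formulation; the paper's exact constants may differ):
\[
\frac{1}{n}\sum_{i=1}^{n}\bigl\|\nabla^2 f_i(x) - \nabla^2 f(x)\bigr\|^2 \le \delta^2 \qquad \text{for all } x,
\]
where $f = \frac{1}{n}\sum_{i=1}^{n} f_i$ and $\|\cdot\|$ is the operator norm; intuitively, the local Hessians deviate from the global Hessian by at most $\delta$ on average.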
Smooth monotone stochastic variational inequalities and saddle point problems: A survey
This paper is a survey of methods for solving smooth, (strongly) monotone stochastic
variational inequalities. To begin with, we present the deterministic foundation from which …
Two losses are better than one: Faster optimization using a cheaper proxy
We present an algorithm for minimizing an objective with hard-to-compute gradients by
using a related, easier-to-access function as a proxy. Our algorithm is based on approximate …
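The approximate proximal-point idea described in this abstract can be illustrated on a toy problem. The sketch below is a minimal illustration under assumed names (`proxy_prox_step`, the quadratic instance, and the plain gradient-descent inner solver are all illustrative choices, not the paper's actual method): each outer step evaluates the expensive gradient once and then minimizes a proxy-based model.

```python
import numpy as np

def proxy_prox_step(grad_f, grad_g, x, eta, inner_steps=50, inner_lr=0.05):
    """One approximate proximal-point step using a cheap proxy gradient grad_g.

    Approximately minimizes the shifted model
        g(y) + <grad_f(x) - grad_g(x), y> + ||y - x||^2 / (2 * eta)
    by plain gradient descent; grad_f is evaluated only once per outer step.
    """
    correction = grad_f(x) - grad_g(x)  # the single expensive gradient call
    y = x.copy()
    for _ in range(inner_steps):
        y = y - inner_lr * (grad_g(y) + correction + (y - x) / eta)
    return y

# Toy instance: f(x) = x^T A x / 2 plays the "expensive" objective,
# g(x) = 1.1 * ||x||^2 / 2 is a cheap proxy with similar curvature.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
grad_f = lambda x: A @ x      # pretend this is costly to evaluate
grad_g = lambda x: 1.1 * x    # cheap surrogate gradient

x = np.array([5.0, -3.0])
for _ in range(30):
    x = proxy_prox_step(grad_f, grad_g, x, eta=1.0)
# x ends up close to the minimizer of f at the origin
```

The closer the proxy's curvature is to the target's, the more each cheap inner solve resembles a true proximal step on f, which is what drives the speedup.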
Method with batching for stochastic finite-sum variational inequalities in non-Euclidean setting
A Pichugin, M Pechin, A Beznosikov, V Novitskii… - Chaos, Solitons & …, 2024 - Elsevier
Variational inequalities are a universal optimization paradigm that incorporates classical
minimization and saddle point problems. Nowadays, more and more tasks require to …
Faster federated optimization under second-order similarity
A Khaled, C Jin - arXiv, 2022
… efficient optimization algorithms, it is crucial to account for communication
constraints--a significant challenge in modern Federated Learning. The best-known …
Similarity, compression and local steps: three pillars of efficient communications for distributed variational inequalities
Variational inequalities are a broad and flexible class of problems that includes
minimization, saddle point, and fixed point problems as special cases. Therefore, variational …
Stochastic proximal point methods for monotone inclusions under expected similarity
Monotone inclusions have a wide range of applications, including minimization,
saddle-point, and equilibrium problems. We introduce new stochastic algorithms, with or without …
Local methods with adaptivity via scaling
The rapid development of machine learning and deep learning has introduced increasingly
complex optimization challenges that must be addressed. Indeed, training modern …
Accelerated Stochastic ExtraGradient: Mixing Hessian and gradient similarity to reduce communication in distributed and federated learning
Modern realities and trends in learning require ever greater generalization ability from
models, which leads to an increase in both model size and training sample size. It is already …
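As a concrete illustration of the extragradient idea named in this title, here is a textbook sketch on a bilinear toy saddle-point problem (this shows the basic method only, not the paper's accelerated, communication-efficient variant):

```python
import numpy as np

# Extragradient on the bilinear saddle point f(x, y) = x * y, where plain
# simultaneous gradient descent-ascent spirals outward and diverges.
def F(z):
    x, y = z
    return np.array([y, -x])  # (df/dx, -df/dy): a monotone operator

z = np.array([1.0, 1.0])
eta = 0.3
for _ in range(200):
    z_half = z - eta * F(z)    # extrapolation ("look-ahead") step
    z = z - eta * F(z_half)    # update using the look-ahead operator value
# z ends up close to the unique saddle point (0, 0)
```

The extra evaluation of F at the look-ahead point is what tames the rotational component of the operator; the same two-call structure is the starting point for the communication-reduction techniques surveyed in these abstracts.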