FedNest: Federated bilevel, minimax, and compositional optimization

DA Tarzanagh, M Li… - … on Machine Learning, 2022 - proceedings.mlr.press
Standard federated optimization methods successfully apply to stochastic problems with
single-level structure. However, many contemporary ML problems, including adversarial …

Resource allocation in heterogeneously-distributed joint radar-communications under asynchronous Bayesian tracking framework

L Wu, KV Mishra, MRB Shankar… - IEEE Journal on …, 2022 - ieeexplore.ieee.org
Optimal allocation of shared resources is key to delivering the promise of jointly operating radar
and communications systems. In this paper, unlike prior works which examine synergistic …

Decentralized local stochastic extra-gradient for variational inequalities

A Beznosikov, P Dvurechenskii… - Advances in …, 2022 - proceedings.neurips.cc
We consider distributed stochastic variational inequalities (VIs) on unbounded domains with
the problem data that is heterogeneous (non-IID) and distributed across many devices. We …

Distributed saddle-point problems: Lower bounds, near-optimal and robust algorithms

A Beznosikov, V Samokhin, A Gasnikov - arxiv preprint arxiv:2010.13112, 2020 - arxiv.org
This paper focuses on the distributed optimization of stochastic saddle point problems. The
first part of the paper is devoted to lower bounds for the centralized and decentralized …

Nonconvex-nonconcave min-max optimization with a small maximization domain

DM Ostrovskii, B Barazandeh, M Razaviyayn - arxiv preprint arxiv …, 2021 - arxiv.org
We study the problem of finding approximate first-order stationary points in optimization
problems of the form $\min_{x\in X}\max_{y\in Y} f(x, y)$, where the sets $X, Y$ are …

Dissecting adaptive methods in GANs

S Jelassi, D Dobre, A Mensch, Y Li, G Gidel - arxiv preprint arxiv …, 2022 - arxiv.org
Adaptive methods are a crucial component widely used for training generative adversarial
networks (GANs). While there has been some work to pinpoint the "marginal value of …

Adam is no better than normalized SGD: Dissecting how adaptivity improves GAN performance

S Jelassi, A Mensch, G Gidel, Y Li - 2021 - openreview.net
Adaptive methods are widely used for training generative adversarial networks (GANs). While
there has been some work to pinpoint the marginal value of adaptive methods in …

A decentralized adaptive momentum method for solving a class of min-max optimization problems

B Barazandeh, T Huang, G Michailidis - Signal Processing, 2021 - Elsevier
Min-max saddle point games have recently been intensely studied, due to their wide range
of applications, including training Generative Adversarial Networks (GANs). However, most …

Accelerated Stochastic Min-Max Optimization Based on Bias-corrected Momentum

H Cai, SA Alghunaim, AH Sayed - arxiv preprint arxiv:2406.13041, 2024 - arxiv.org
Lower-bound analyses for nonconvex strongly-concave minimax optimization problems
have shown that stochastic first-order algorithms require at least $\mathcal{O}(\varepsilon$ …

Adaptive step-size methods for compressed SGD

AM Subramaniam, A Magesh… - ICASSP 2023-2023 …, 2023 - ieeexplore.ieee.org
Compressed Stochastic Gradient Descent (SGD) algorithms have been proposed to address
the communication bottleneck in distributed and decentralized optimization problems such …