Stochastic model-based minimization of weakly convex functions

D Davis, D Drusvyatskiy - SIAM Journal on Optimization, 2019 - SIAM
We consider a family of algorithms that successively sample and minimize simple stochastic
models of the objective function. We show that under reasonable conditions on …
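The snippet describes methods that, at each iteration, sample a data point and minimize a simple model of the objective around the current iterate. A minimal sketch of one such update is the stochastic proximal-point step for absolute-loss regression, `f(x; (a, b)) = |a·x − b|`, whose proximal subproblem has a closed form; the problem instance and all names below are illustrative choices, not taken from the paper.

```python
import numpy as np

def stochastic_prox_point_step(x, a, b, alpha):
    """One stochastic proximal-point step for f(x; (a, b)) = |a.x - b|:
    x+ = argmin_y |a.y - b| + ||y - x||^2 / (2 * alpha).
    Closed form: move along a until the sampled residual is zeroed
    or the step budget alpha is exhausted."""
    r = a @ x - b
    step = np.sign(r) * min(alpha, abs(r) / (a @ a))
    return x - step * a

rng = np.random.default_rng(0)
x_true = rng.standard_normal(5)
x = np.zeros(5)
for k in range(2000):
    a = rng.standard_normal(5)
    b = a @ x_true                       # noiseless sample of the model
    x = stochastic_prox_point_step(x, a, b, alpha=1.0 / np.sqrt(k + 1))
print(np.linalg.norm(x - x_true))        # small: iterates recover x_true
```

Unlike plain SGD on the same loss, the exact model minimization never overshoots the sampled residual, which is one source of the stability these methods are known for.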

Weakly-convex–concave min–max optimization: provable algorithms and applications in machine learning

H Rafique, M Liu, Q Lin, T Yang - Optimization Methods and …, 2022 - Taylor & Francis
Min–max problems have broad applications in machine learning, including learning with
non-decomposable loss and learning with robustness to data distribution. Convex–concave …

Stochastic methods for composite and weakly convex optimization problems

JC Duchi, F Ruan - SIAM Journal on Optimization, 2018 - SIAM
We consider minimization of stochastic functionals that are compositions of a (potentially)
nonsmooth convex function h and smooth function c and, more generally, stochastic weakly …

Biased stochastic first-order methods for conditional stochastic optimization and applications in meta learning

Y Hu, S Zhang, X Chen, N He - Advances in Neural …, 2020 - proceedings.neurips.cc
Conditional stochastic optimization covers a variety of applications ranging from invariant
learning and causal inference to meta-learning. However, constructing unbiased gradient …

The importance of better models in stochastic optimization

H Asi, JC Duchi - Proceedings of the National Academy of …, 2019 - National Acad Sciences
Standard stochastic optimization methods are brittle, sensitive to stepsize choice and other
algorithmic parameters, and they exhibit instability outside of well-behaved families of …
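One of the "better models" behind this robustness argument is a truncated linear model for nonnegative losses, which clips the update so it never steps past the model's zero crossing and yields the closed-form step `x − min(alpha, loss/||g||²)·g`. A sketch under my own choice of toy problem (squared loss, synthetic data):

```python
import numpy as np

def truncated_model_step(x, loss, grad, alpha):
    """Step of a truncated-model method for a nonnegative loss:
    minimize max(loss + <grad, y - x>, 0) + ||y - x||^2 / (2 * alpha).
    The min(...) clip prevents stepping past the model's zero."""
    g2 = grad @ grad
    if g2 == 0.0:
        return x
    return x - min(alpha, loss / g2) * grad

rng = np.random.default_rng(1)
x_true = rng.standard_normal(4)
x = np.zeros(4)
for k in range(3000):
    a = rng.standard_normal(4)
    r = a @ (x - x_true)
    loss, grad = 0.5 * r * r, r * a               # sampled loss and gradient
    x = truncated_model_step(x, loss, grad, alpha=10.0)
print(np.linalg.norm(x - x_true))
```

Note the deliberately huge `alpha=10.0`: the truncation caps the effective step at `loss/||g||²`, so the method still converges where plain SGD with the same stepsize would diverge, which is exactly the stability phenomenon the snippet alludes to.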

Subgradient methods for sharp weakly convex functions

D Davis, D Drusvyatskiy, KJ MacPhee… - Journal of Optimization …, 2018 - Springer
Subgradient methods converge linearly on a convex function that grows sharply away from
its solution set. In this work, we show that the same is true for sharp functions that are only …

Weakly convex optimization over Stiefel manifold using Riemannian subgradient-type methods

X Li, S Chen, Z Deng, Q Qu, Z Zhu… - SIAM Journal on …, 2021 - SIAM
We consider a class of nonsmooth optimization problems over the Stiefel manifold, in which
the objective function is weakly convex in the ambient Euclidean space. Such problems are …
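A Riemannian subgradient step has three parts: take a Euclidean subgradient, project it onto the tangent space of the manifold, and retract back. A minimal sketch on the unit sphere (the Stiefel manifold St(3,1)), using a toy problem of my own choosing, recovering the normal of a plane by minimizing the weakly convex objective `f(x) = mean_i |b_i·x|`:

```python
import numpy as np

rng = np.random.default_rng(5)
# 100 points lying in the span of e1, e2; over the unit sphere,
# f(x) = mean_i |b_i . x| is minimized at the plane normal x = ±e3.
B = rng.standard_normal((100, 3))
B[:, 2] = 0.0

def riemannian_subgrad_step(x, B, alpha):
    """Riemannian subgradient step on the sphere St(3,1):
    Euclidean subgradient -> tangent-space projection -> retraction."""
    g = B.T @ np.sign(B @ x) / len(B)   # Euclidean subgradient of f at x
    g_tan = g - (x @ g) * x             # project onto tangent space at x
    y = x - alpha * g_tan
    return y / np.linalg.norm(y)        # retract back onto the sphere

x = rng.standard_normal(3)
x /= np.linalg.norm(x)
alpha = 0.1
for _ in range(1000):
    x = riemannian_subgrad_step(x, B, alpha)
    alpha *= 0.995                      # geometrically decaying stepsize
print(abs(x[2]))                        # close to 1: x reached the normal
```

The geometrically decaying stepsize schedule is the kind analyzed for sharp problems of this type; a constant stepsize would instead stall at a noise floor proportional to the stepsize.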

Stochastic subgradient method converges at the rate $O(k^{-1/4})$ on weakly convex functions

D Davis, D Drusvyatskiy - arxiv preprint arxiv:1802.02988, 2018 - arxiv.org
We prove that the proximal stochastic subgradient method, applied to a weakly convex
problem, drives the gradient of the Moreau envelope to zero at the rate $ O (k^{-1/4}) $. As a …
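The Moreau envelope $\varphi_\lambda(x) = \min_y f(y) + \|y - x\|^2/(2\lambda)$ supplies the stationarity measure here: its gradient is $(x - \mathrm{prox}_{\lambda f}(x))/\lambda$, which is small exactly near approximately stationary points of $f$. A sketch on a weakly convex 1-D toy of my choosing, with the prox found by brute-force search purely for illustration:

```python
import numpy as np

def f(x):                    # |x^2 - 1|: weakly convex, nonsmooth, nonconvex
    return abs(x * x - 1.0)

def subgrad(x):              # one subgradient of f
    return np.sign(x * x - 1.0) * 2.0 * x

def moreau_grad(x, lam=0.1):
    """Gradient of the Moreau envelope, the stationarity measure:
    grad phi_lam(x) = (x - prox_{lam f}(x)) / lam.
    The prox is located by a brute-force 1-D grid search."""
    grid = np.linspace(-3.0, 3.0, 60001)
    prox = grid[np.argmin(f(grid) + (grid - x) ** 2 / (2.0 * lam))]
    return (x - prox) / lam

rng = np.random.default_rng(2)
x = 2.5
for k in range(4000):
    g = subgrad(x) + rng.standard_normal()        # noisy subgradient
    x -= 0.5 / np.sqrt(k + 1) * g                 # stochastic subgradient step
print(abs(x), abs(moreau_grad(x)))                # near a minimizer ±1
```

Note that $f$ itself is nonsmooth at the minimizers, so the raw subgradient norm never vanishes there; the envelope gradient is what decays along the iterates, which is why the rate is stated in that metric.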

Convergence of a stochastic gradient method with momentum for non-smooth non-convex optimization

V Mai, M Johansson - International conference on machine …, 2020 - proceedings.mlr.press
Stochastic gradient methods with momentum are widely used in applications and at the core
of optimization subroutines in many popular machine learning libraries. However, their …
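The scheme in question is stochastic heavy ball: accumulate a decaying average of stochastic (sub)gradients and step along it. A self-contained sketch, run on a smooth toy problem of my choosing (the paper's analysis covers the harder weakly convex nonsmooth case):

```python
import numpy as np

def sgd_momentum(grad_fn, x0, steps, lr=0.01, beta=0.9, seed=3):
    """Stochastic heavy-ball iteration:
    v <- beta * v + g(x, sample);  x <- x - lr * v."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        g = grad_fn(x) + 0.1 * rng.standard_normal(x.shape)  # noisy gradient
        v = beta * v + g
        x = x - lr * v
    return x

# toy: minimize ||x||^2 with noisy gradients
x = sgd_momentum(lambda x: 2.0 * x, np.ones(3), steps=2000)
print(np.linalg.norm(x))   # small: iterates settle near the minimizer 0
```

With `beta=0.9` the effective stepsize is roughly `lr / (1 - beta)`, a factor-of-ten amplification worth keeping in mind when tuning `lr`.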

Stochastic bias-reduced gradient methods

H Asi, Y Carmon, A Jambulapati… - Advances in Neural …, 2021 - proceedings.neurips.cc
We develop a new primitive for stochastic optimization: a low-bias, low-cost estimator of the
minimizer $ x_\star $ of any Lipschitz strongly-convex function $ f $. In particular, we use a …
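The low-bias primitive builds on randomized multilevel Monte Carlo: draw a random fidelity level, return a probability-weighted telescoping difference, and the level-wise biases cancel in expectation. A generic sketch of that debiasing idea (applied to a simple plug-in quantity of my choosing, not to the paper's optimization setting):

```python
import numpy as np

def single_term_mlmc(rng, f, base=8, p=0.5, cap=16):
    """One draw of a single-term randomized MLMC estimator of
    lim_j E[f(S_j)], where S_j is the mean of base * 2**j samples.
    Draw a random level J and return Delta_J / P(J), with
    Delta_0 = f(S_0) and Delta_j = f(S_j) - f(S_{j-1}); the two
    levels are coupled by sharing the first half of the draws."""
    j = min(rng.geometric(1 - p) - 1, cap)          # P(J=j) = (1-p) * p**j
    pj = (1 - p) * p ** j if j < cap else p ** cap  # tail mass at the cap
    z = rng.standard_normal(base * 2 ** j) + 1.0    # i.i.d. draws, mean 1
    delta = f(z.mean())
    if j > 0:
        delta -= f(z[: z.size // 2].mean())
    return delta / pj

rng = np.random.default_rng(4)
# f(m) = m^2 is nonlinear, so E[f(S_j)] is biased at every finite level,
# yet the randomized-level estimator averages to the limit (E Z)^2 = 1.
est = np.mean([single_term_mlmc(rng, lambda m: m * m) for _ in range(20000)])
print(est)
```

The cost is random but has a finite mean, since expensive high levels are sampled with geometrically small probability; that cost/bias trade-off is the primitive the snippet's low-bias minimizer estimator exploits.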