Solving stochastic compositional optimization is nearly as easy as solving stochastic optimization

T Chen, Y Sun, W Yin - IEEE Transactions on Signal …, 2021 - ieeexplore.ieee.org
Stochastic compositional optimization generalizes classic (non-compositional) stochastic
optimization to the minimization of compositions of functions. Each composition may …
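The key difficulty in this setting can be shown with a minimal SCGD-style sketch (the toy objective, step sizes, and distribution below are all assumed for illustration; this is not the paper's algorithm): plain SGD on f(g(x; ξ)) is biased because E[f'(g(x; ξ))] ≠ f'(E[g(x; ξ)]), so compositional methods track a running estimate of the inner expectation.

```python
import random

# Toy instance (assumed): minimize_x f(E[g(x; xi)]) with f(y) = y**2 and
# g(x; xi) = xi * x, where E[xi] = 2, so the true objective is (2x)**2
# and the minimizer is x* = 0.
def scgd(x0=1.0, steps=2000, alpha=0.01, beta=0.1, seed=0):
    rng = random.Random(seed)
    x, y = x0, 0.0
    for _ in range(steps):
        xi = rng.gauss(2.0, 1.0)            # sample of the inner randomness
        y = (1 - beta) * y + beta * xi * x  # running estimate of E[g(x; xi)]
        x -= alpha * xi * (2 * y)           # chain rule: g'(x; xi) * f'(y)
    return x

print(scgd())  # a value near the minimizer x* = 0
```

The auxiliary sequence y is what distinguishes compositional methods from ordinary SGD; without it, each step would use a biased gradient.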

Hybrid variance-reduced SGD algorithms for minimax problems with nonconvex-linear function

Q Tran-Dinh, D Liu, LM Nguyen - Advances in Neural …, 2020 - proceedings.neurips.cc
We develop a novel and single-loop variance-reduced algorithm to solve a class of
stochastic nonconvex-convex minimax problems involving a nonconvex-linear objective …
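The minimax template such methods target can be sketched with bare-bones stochastic gradient descent-ascent (a toy strongly-convex-strongly-concave instance, assumed for illustration; the paper's hybrid variance-reduced estimator is not shown here):

```python
import random

# Toy saddle problem (assumed): min_x max_y E[(x - xi) * y] + x**2/2 - y**2/2
# with xi ~ N(0, 1); the saddle point is (x*, y*) = (0, 0).
def sgda(steps=5000, alpha=0.01, seed=4):
    rng = random.Random(seed)
    x = y = 1.0
    for _ in range(steps):
        xi = rng.gauss(0.0, 1.0)
        gx = y + x         # stochastic partial gradient in x
        gy = (x - xi) - y  # stochastic partial gradient in y
        x -= alpha * gx    # descent step on the minimization variable
        y += alpha * gy    # ascent step on the maximization variable
    return x, y
```

Variance-reduced variants replace the raw gradient samples gx, gy with recursive estimators to improve the oracle complexity.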

Stochastic Gauss-Newton algorithms for nonconvex compositional optimization

Q Tran-Dinh, N Pham… - … Conference on Machine …, 2020 - proceedings.mlr.press
We develop two new stochastic Gauss-Newton algorithms for solving a class of non-convex
stochastic compositional optimization problems frequently arising in practice. We consider …
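For intuition, here is a plain deterministic Gauss-Newton iteration on a tiny nonlinear least-squares problem (an assumed toy instance; the paper's algorithms randomize this linearize-then-solve template with stochastic estimators):

```python
# Toy nonlinear least squares (assumed): minimize (1/2) * r(x)**2 with
# r(x) = x**2 - 2; the minimizer is sqrt(2), where the residual vanishes.
def gauss_newton(x0=1.0, steps=20):
    x = x0
    for _ in range(steps):
        r = x * x - 2.0  # residual
        J = 2.0 * x      # Jacobian (here a scalar derivative) of the residual
        x -= r / J       # solve the linearized least-squares subproblem
    return x
```

Each step linearizes the inner map and solves the resulting least-squares subproblem exactly, which is the structure the stochastic variants exploit.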

Momentum-based variance-reduced proximal stochastic gradient method for composite nonconvex stochastic optimization

Y Xu, Y Xu - Journal of Optimization Theory and Applications, 2023 - Springer
Stochastic gradient methods (SGMs) have been extensively used for solving stochastic
problems or large-scale machine learning problems. Recent works employ various …

Hybrid SGD algorithms to solve stochastic composite optimization problems with application in sparse portfolio selection problems

ZP Yang, Y Zhao - Journal of Computational and Applied Mathematics, 2024 - Elsevier
In this paper, we study stochastic composite problems where the objective can be the
composition of an outer single-valued function and an inner vector-valued mapping. In this …

Adaptive primal-dual stochastic gradient method for expectation-constrained convex stochastic programs

Y Yan, Y Xu - Mathematical Programming Computation, 2022 - Springer
Stochastic gradient methods (SGMs) have been widely used for solving stochastic
optimization problems. A majority of existing works assume no constraints or easy-to-project …
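A minimal primal-dual stochastic gradient sketch for an expectation-constrained toy problem (the instance and step size are assumed; the paper's adaptive step-size rules are omitted):

```python
import random

# Toy problem (assumed): minimize x**2 subject to E[xi - x] <= 0 with
# xi ~ N(1, 0.1), i.e. effectively x >= 1; the solution is x* = 1.
def primal_dual(steps=20000, alpha=0.005, seed=3):
    rng = random.Random(seed)
    x, lam = 0.0, 0.0
    for _ in range(steps):
        xi = rng.gauss(1.0, 0.1)
        x -= alpha * (2 * x - lam)              # primal descent on the Lagrangian
        lam = max(0.0, lam + alpha * (xi - x))  # dual ascent, projected to lam >= 0
    return x, lam
```

The dual variable lam accumulates constraint violation from samples alone, so no projection onto the (expectation-defined) feasible set is ever required.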

Streamlining in the Riemannian Realm: Efficient Riemannian Optimization with Loopless Variance Reduction

Y Demidovich, G Malinovsky, P Richtárik - arXiv preprint arXiv:2403.06677, 2024 - arxiv.org
In this study, we investigate stochastic optimization on Riemannian manifolds, focusing on
the crucial variance reduction mechanism used in both Euclidean and Riemannian settings …
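The "loopless" idea can be sketched in the Euclidean setting (assumed toy least-squares instance): SVRG's fixed-length inner loop is replaced by a coin flip that refreshes the full-gradient reference point with probability p each iteration.

```python
import random

# Toy finite-sum least squares (assumed): minimize (1/2n) * sum_i (x - a_i)**2,
# minimized at the mean of a.
def loopless_svrg(a, steps=4000, alpha=0.1, p=0.1, seed=2):
    rng = random.Random(seed)
    n = len(a)
    x = w = 0.0
    full_grad = sum(w - ai for ai in a) / n      # full gradient at reference w
    for _ in range(steps):
        i = rng.randrange(n)
        v = (x - a[i]) - (w - a[i]) + full_grad  # SVRG-style control variate
        x -= alpha * v
        if rng.random() < p:                     # loopless refresh (coin flip)
            w = x
            full_grad = sum(w - ai for ai in a) / n
    return x

print(loopless_svrg([1.0, 2.0, 3.0, 4.0]))  # converges to the mean, 2.5
```

Removing the nested loop simplifies both the implementation and the analysis, which is what makes the scheme attractive to port to Riemannian manifolds (where the update additionally involves retractions and vector transport).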

Hybrid variance-reduced SGD algorithms for nonconvex-concave minimax problems

Q Tran-Dinh, D Liu, LM Nguyen - arXiv preprint arXiv:2006.15266, 2020 - arxiv.org
We develop a novel and single-loop variance-reduced algorithm to solve a class of
stochastic nonconvex-convex minimax problems involving a nonconvex-linear objective …

Linearly-convergent FISTA variant for composite optimization with duality

C Garner, S Zhang - Journal of Scientific Computing, 2023 - Springer
Many large-scale optimization problems can be expressed as composite optimization
models. Accelerated first-order methods such as the fast iterative shrinkage–thresholding …
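The FISTA iteration being accelerated can be sketched on an assumed toy one-dimensional lasso instance, whose exact solution is given by soft-thresholding in closed form:

```python
# Toy composite problem (assumed): minimize (1/2) * (x - b)**2 + lam * |x|,
# whose exact solution is soft-thresholding: sign(b) * max(|b| - lam, 0).
def soft_threshold(z, t):
    return (1.0 if z >= 0 else -1.0) * max(abs(z) - t, 0.0)

def fista(b=3.0, lam=1.0, L=1.0, steps=100):
    x, y, t = 0.0, 0.0, 1.0
    for _ in range(steps):
        grad = y - b                                  # gradient of the smooth part
        x_new = soft_threshold(y - grad / L, lam / L) # proximal (shrinkage) step
        t_new = (1 + (1 + 4 * t * t) ** 0.5) / 2      # Nesterov momentum sequence
        y = x_new + ((t - 1) / t_new) * (x_new - x)   # extrapolation point
        x, t = x_new, t_new
    return x
```

Standard FISTA attains an O(1/k^2) objective rate on such composite problems; the linearly-convergent variant in the paper exploits additional structure via duality.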

Riemannian Stochastic Gradient Method for Nested Composition Optimization

D Zhang, SD Tajbakhsh - arXiv preprint arXiv:2207.09350, 2022 - arxiv.org
This work considers optimization of compositions of functions in a nested form over
Riemannian manifolds, where each function contains an expectation. This type of problem …
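In commonly used notation (symbols assumed here for illustration), the nested problem reads:

```latex
\min_{x \in \mathcal{M}} \; f_1\bigl(f_2(\cdots f_N(x)\cdots)\bigr),
\qquad f_i(y) \;=\; \mathbb{E}_{\xi_i}\!\left[ F_i(y;\xi_i) \right],
```

where $\mathcal{M}$ is a Riemannian manifold and each layer $f_i$ is accessible only through samples of $\xi_i$, so stochastic gradients of the composition are biased for the same reason as in the Euclidean compositional setting.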