Solving stochastic compositional optimization is nearly as easy as solving stochastic optimization
Stochastic compositional optimization generalizes classic (non-compositional) stochastic
optimization to the minimization of compositions of functions. Each composition may …
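The nested structure this entry refers to can be sketched as follows (a standard formulation with generic symbols, not taken from the paper itself): the objective composes an outer function with the expectation of an inner random map,

```latex
\min_{x \in \mathbb{R}^d} \; F(x) \;=\; f\bigl(\mathbb{E}_{v}[\,g(x; v)\,]\bigr).
```

Plain SGD is biased here: by the chain rule, $\nabla F(x) = \nabla \mathbb{E}_{v}[g(x;v)]^{\top} \nabla f\bigl(\mathbb{E}_{v}[g(x;v)]\bigr)$, but a single-sample estimate evaluates $\nabla f$ at $g(x;v)$ rather than at the expectation, and the two differ whenever $f$ is nonlinear.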
Hybrid variance-reduced SGD algorithms for minimax problems with nonconvex-linear function
We develop a novel and single-loop variance-reduced algorithm to solve a class of
stochastic nonconvex-convex minimax problems involving a nonconvex-linear objective …
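A common template for the nonconvex-linear class mentioned here (generic symbols, assumed for illustration):

```latex
\min_{x} \max_{y \in \mathcal{Y}} \;
\mathbb{E}_{\xi}\bigl[f(x;\xi)\bigr] \;+\; \langle G(x),\, y \rangle \;-\; h(y),
```

where the coupling is linear in $y$, $f$ may be nonconvex in $x$, and $h$ is convex. Maximizing out $y$ yields $\mathbb{E}_{\xi}[f(x;\xi)] + h^{*}(G(x))$ via the Fenchel conjugate, which links this minimax class to compositional optimization.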
Stochastic Gauss-Newton algorithms for nonconvex compositional optimization
Q Tran-Dinh, N Pham… - … Conference on Machine …, 2020 - proceedings.mlr.press
We develop two new stochastic Gauss-Newton algorithms for solving a class of non-convex
stochastic compositional optimization problems frequently arising in practice. We consider …
Momentum-based variance-reduced proximal stochastic gradient method for composite nonconvex stochastic optimization
Stochastic gradient methods (SGMs) have been extensively used for solving stochastic
problems or large-scale machine learning problems. Recent works employ various …
Hybrid SGD algorithms to solve stochastic composite optimization problems with application in sparse portfolio selection problems
ZP Yang, Y Zhao - Journal of Computational and Applied Mathematics, 2024 - Elsevier
In this paper, we study stochastic composite problems where the objective can be the
composition of an outer single-valued function and an inner vector-valued mapping. In this …
Adaptive primal-dual stochastic gradient method for expectation-constrained convex stochastic programs
Stochastic gradient methods (SGMs) have been widely used for solving stochastic
optimization problems. A majority of existing works assume no constraints or easy-to-project …
Streamlining in the Riemannian Realm: Efficient Riemannian Optimization with Loopless Variance Reduction
In this study, we investigate stochastic optimization on Riemannian manifolds, focusing on
the crucial variance reduction mechanism used in both Euclidean and Riemannian settings …
Hybrid variance-reduced SGD algorithms for nonconvex-concave minimax problems
We develop a novel and single-loop variance-reduced algorithm to solve a class of
stochastic nonconvex-convex minimax problems involving a nonconvex-linear objective …
Linearly-convergent FISTA variant for composite optimization with duality
Many large-scale optimization problems can be expressed as composite optimization
models. Accelerated first-order methods such as the fast iterative shrinkage–thresholding …
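The composite model alluded to takes the standard form (generic notation, assumed for illustration):

```latex
\min_{x \in \mathbb{R}^n} \; F(x) \;=\; f(x) \;+\; g(x),
```

with $f$ smooth ($L$-Lipschitz gradient) and $g$ convex with an inexpensive proximal operator. FISTA alternates a gradient step on $f$, a proximal step on $g$, and a Nesterov momentum extrapolation, achieving the accelerated $O(1/k^2)$ rate in the convex case; variants under strong-convexity-type assumptions can reach linear convergence.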
Riemannian Stochastic Gradient Method for Nested Composition Optimization
D Zhang, SD Tajbakhsh - arXiv preprint arXiv:2207.09350, 2022 - arxiv.org
This work considers optimization of composition of functions in a nested form over
Riemannian manifolds where each function contains an expectation. This type of problem …