The “Black-Box” Optimization Problem: Zero-Order Accelerated Stochastic Method via Kernel Approximation
In this paper, we study the standard formulation of an optimization problem in which the
computation of the gradient is not available. Such a problem can be classified as a “black box” …
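For intuition, the kernel-approximation idea can be sketched as follows: a two-point finite difference along a random direction is reweighted by a kernel so that the estimator can exploit higher-order smoothness. The sketch below uses illustrative names and parameters (not taken from the paper) and assumes a Legendre-type kernel K with E[r K(r)] = 1 for r uniform on [-1, 1].

```python
import numpy as np

def kernel_zo_gradient(f, x, h, kernel, rng):
    """Sketch of a kernel-smoothed two-point zero-order gradient estimate.

    f      : (noisy) objective value oracle, callable on a point in R^d
    x      : current iterate
    h      : smoothing radius
    kernel : weight K(r) on [-1, 1] with E[r K(r)] = 1, e.g. K(r) = 3r
    rng    : numpy random generator
    """
    d = x.size
    e = rng.normal(size=d)
    e /= np.linalg.norm(e)            # uniform direction on the unit sphere
    r = rng.uniform(-1.0, 1.0)        # scalar draw that the kernel reweights
    # central finite difference along e, reweighted by the kernel value
    return d / (2.0 * h) * (f(x + h * r * e) - f(x - h * r * e)) * kernel(r) * e

# toy usage: noisy quadratic with K(r) = 3r, the lowest-order Legendre-type kernel
rng = np.random.default_rng(0)
f = lambda z: 0.5 * float(z @ z) + 1e-3 * rng.normal()
g = kernel_zo_gradient(f, np.ones(5), h=1e-2, kernel=lambda r: 3.0 * r, rng=rng)
```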
Accelerated zero-order SGD method for solving the black box optimization problem under “overparametrization” condition
This paper is devoted to solving a convex stochastic optimization problem in an
overparameterization setup for the case where the original gradient computation is not …
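A minimal sketch of how such a method can be organized, assuming a generic two-point gradient-free estimate plugged into a textbook Nesterov extrapolation loop; the step size and momentum schedule below are placeholders, not the tuned parameters of the paper.

```python
import numpy as np

def two_point_grad(f, x, tau, rng):
    """Generic two-point zero-order gradient estimate (illustrative, not the paper's oracle)."""
    d = x.size
    e = rng.normal(size=d)
    e /= np.linalg.norm(e)
    return d / (2.0 * tau) * (f(x + tau * e) - f(x - tau * e)) * e

def accelerated_zo_sgd(f, x0, steps, lr, tau, rng):
    """Nesterov-style loop driven by the zero-order estimate above."""
    x = x0.copy()
    y = x0.copy()
    for k in range(steps):
        g = two_point_grad(f, y, tau, rng)          # gradient surrogate at the lookahead point
        x_next = y - lr * g                         # gradient-type step
        y = x_next + k / (k + 3.0) * (x_next - x)   # standard extrapolation
        x = x_next
    return x
```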
Stochastic adversarial noise in the “black box” optimization problem
A. Lobanov - International Conference on Optimization and …, 2023 - Springer
This paper is devoted to the study of a stochastic convex black box optimization
problem, where the black box setting means that the gradient-free oracle only …
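A common way to model such an oracle (the paper's exact assumptions may differ, e.g. bounding the noise in second moment rather than almost surely) is

```latex
\widetilde{f}(x) = f(x,\xi) + \delta(x), \qquad |\delta(x)| \le \Delta,
```

so only the corrupted values $\widetilde{f}$ are observed at queried points, and a finite-difference gradient estimate with step $h$ inherits an extra bias on the order of $\Delta/h$.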
Acceleration exists! Optimization problems when oracle can only compare objective function values
Frequently, the burgeoning field of black-box optimization encounters challenges due to a
limited understanding of the mechanisms of the objective function. To address such …
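To see why comparisons alone can suffice, note that comparing two symmetric perturbations already reveals the sign of a directional derivative (up to a discretization error), which is enough for a normalized-gradient-style step. The sketch below illustrates this idea with generic parameters; it is not the accelerated scheme from the paper.

```python
import numpy as np

def comparison_step(compare, x, gamma, tau, rng):
    """One descent step using only an oracle that compares objective values.

    compare(a, b) -> True if f(a) <= f(b); no function values are revealed.
    gamma : step size, tau : perturbation radius (both illustrative).
    """
    d = x.size
    e = rng.normal(size=d)
    e /= np.linalg.norm(e)
    # comparing x + tau*e with x - tau*e estimates sign(<grad f(x), e>) up to O(tau) error
    sign = -1.0 if compare(x + tau * e, x - tau * e) else 1.0
    return x - gamma * sign * e   # move against the estimated directional-derivative sign
```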
Highly smooth zeroth-order methods for solving optimization problems under the PL condition
In this paper, we study the black box optimization problem under the Polyak–Łojasiewicz
(PL) condition, assuming that the objective function is not just smooth, but has higher …
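For reference, the PL condition asks that for some $\mu > 0$ the gradient norm dominates the suboptimality gap,

```latex
\frac{1}{2}\,\|\nabla f(x)\|^{2} \;\ge\; \mu\,\bigl(f(x) - f^{*}\bigr) \quad \text{for all } x,
```

which yields linear convergence rates for gradient-type methods without requiring convexity.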
Gradient-free algorithm for saddle point problems under overparametrization
This paper focuses on solving a stochastic saddle point problem (SPP) under an
overparameterized regime for the case when the gradient computation is impractical. As an …
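For context, the stochastic saddle point problem has the form

```latex
\min_{x}\;\max_{y}\; \mathbb{E}_{\xi}\,\bigl[f(x, y, \xi)\bigr],
```

and the overparameterized (interpolation-type) regime referred to here is typically the setting in which the stochastic gradient noise vanishes at the solution, which is what permits the stronger guarantees.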
The Order Oracle: a New Concept in The Black Box Optimization Problems
Frequently, the burgeoning field of black-box optimization encounters challenges due to a
limited understanding of the mechanisms of the objective function. In this paper, we provide …
Reduced Network Cumulative Constraint Violation for Distributed Bandit Convex Optimization under Slater Condition
This paper studies the distributed bandit convex optimization problem with time-varying
inequality constraints, where the goal is to minimize network regret and cumulative …
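The two performance measures referred to here are, in their standard single-learner form (the network versions in the paper additionally average over agents),

```latex
\mathrm{Reg}(T) = \sum_{t=1}^{T} f_t(x_t) - \min_{x \in \mathcal{X}} \sum_{t=1}^{T} f_t(x),
\qquad
\mathrm{CCV}(T) = \sum_{t=1}^{T} \bigl\| [\,g_t(x_t)\,]_{+} \bigr\|,
```

where the $g_t$ are the time-varying inequality constraints and $[\cdot]_{+}$ denotes the componentwise positive part.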
On quasi-convex smooth optimization problems by a comparison oracle
Frequently, when dealing with many machine learning models, optimization problems
appear to be challenging due to a limited understanding of the constructions and …
Power of (L0, L1)-Smoothness in Stochastic Convex Optimization: First- and Zero-Order Algorithms
This paper is devoted to the study of stochastic optimization problems under the generalized
smoothness assumption. By considering the unbiased gradient oracle in Stochastic Gradient …
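For reference, one common formulation of this generalized (L0, L1)-smoothness assumption, stated for twice-differentiable f (the paper may use an equivalent gradient-based variant), is

```latex
\|\nabla^{2} f(x)\| \;\le\; L_0 + L_1\,\|\nabla f(x)\|,
```

which recovers standard $L$-smoothness when $L_1 = 0$.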