The “Black-Box” Optimization Problem: Zero-Order Accelerated Stochastic Method via Kernel Approximation

A Lobanov, N Bashirov, A Gasnikov - Journal of Optimization Theory and …, 2024 - Springer
In this paper, we study the standard formulation of an optimization problem when the
computation of the gradient is not available. Such a problem can be classified as a “black box” …
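
For orientation, “kernel approximation” in this line of work usually refers to a randomized two-point gradient estimate weighted by a smoothing kernel, so that higher-order smoothness of the objective can be exploited. The sketch below is a generic illustration under assumed notation (objective f, smoothing radius h, kernel K), not the authors' exact estimator:

    import numpy as np

    def kernel_zo_gradient(f, x, h=1e-2, rng=None):
        """Generic kernel-smoothed two-point gradient estimate (illustrative sketch).

        f : zeroth-order oracle returning a (possibly noisy) function value
        x : current point, shape (d,)
        h : smoothing radius
        """
        rng = np.random.default_rng() if rng is None else rng
        K = lambda r: 1.5 * r        # a valid kernel for smoothness order 2:
                                     # zero mean, unit first moment on [-1, 1]
        d = x.shape[0]
        e = rng.standard_normal(d)
        e /= np.linalg.norm(e)       # direction uniform on the unit sphere
        r = rng.uniform(-1.0, 1.0)   # scalar smoothing variable
        # two function evaluations only -- no gradient access
        return d / (2.0 * h) * (f(x + h * r * e) - f(x - h * r * e)) * K(r) * e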

Accelerated zero-order SGD method for solving the black box optimization problem under “overparametrization” condition

A Lobanov, A Gasnikov - International Conference on Optimization and …, 2023 - Springer
This paper is devoted to solving a convex stochastic optimization problem in an
overparameterization setup for the case where the original gradient computation is not …
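
As context (an assumption on my part, since the abstract is truncated): the overparameterized regime in this line of work is typically formalized through an interpolation-type condition, under which the stochastic gradient noise vanishes at the solution, e.g.

    \sigma_*^2 := \mathbb{E}_{\xi}\,\|\nabla f(x^*, \xi)\|^2 = 0 \quad \text{at a minimizer } x^*.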

Stochastic adversarial noise in the “black box” optimization problem

A Lobanov - International Conference on Optimization and …, 2023 - Springer
This paper is devoted to the study of the solution of a stochastic convex black box
optimization problem, where the black box problem means that the gradient-free oracle only …

Acceleration exists! Optimization problems when oracle can only compare objective function values

A Lobanov, A Gasnikov, A Krasnov - The Thirty-eighth Annual …, 2024 - openreview.net
Frequently, the burgeoning field of black-box optimization encounters challenges due to a
limited understanding of the mechanisms of the objective function. To address such …
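
To illustrate the oracle model only (the accelerated method itself is not reproduced here): a comparison oracle answers “is f(a) < f(b)?” and nothing more. A classical way to use such an oracle is a golden-section line search, sketched below with a hypothetical helper is_better:

    import numpy as np

    def comparison_line_search(is_better, x, direction, lo=0.0, hi=1.0, iters=30):
        """Golden-section search over step sizes using only pairwise comparisons.

        is_better(a, b) -> True if f(a) < f(b); this is the only access to f
        (a comparison oracle in the sense of this paper's setting).
        """
        phi = (np.sqrt(5.0) - 1.0) / 2.0   # inverse golden ratio
        a, b = lo, hi
        for _ in range(iters):
            t1 = b - phi * (b - a)
            t2 = a + phi * (b - a)
            if is_better(x + t1 * direction, x + t2 * direction):
                b = t2
            else:
                a = t1
        return x + 0.5 * (a + b) * direction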

Highly smooth zeroth-order methods for solving optimization problems under the PL condition

AV Gasnikov, AV Lobanov, FS Stonyakin - … Mathematics and Mathematical …, 2024 - Springer
In this paper, we study the black box optimization problem under the Polyak–Łojasiewicz
(PL) condition, assuming that the objective function is not just smooth, but has higher …
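
For reference, the Polyak–Łojasiewicz (PL) condition mentioned in the title is the standard inequality

    \tfrac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^*\bigr) \qquad \text{for all } x,

which yields linear convergence of gradient-type methods without requiring convexity.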

Gradient-free algorithm for saddle point problems under overparametrization

E Statkevich, S Bondar, D Dvinskikh, A Gasnikov… - Chaos, Solitons & …, 2024 - Elsevier
This paper focuses on solving a stochastic saddle point problem (SPP) under an
overparameterized regime for the case when the gradient computation is impractical. As an …
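
For context, a stochastic saddle point problem has the generic form

    \min_{x}\;\max_{y}\; f(x, y) := \mathbb{E}_{\xi}\bigl[f(x, y; \xi)\bigr],

and the gradient-free setting means that, presumably, only (noisy) values of f(x, y; ξ) can be queried.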

The Order Oracle: A New Concept in the Black Box Optimization Problems

A Lobanov, A Gasnikov, A Krasnov - arXiv preprint arXiv:2402.09014, 2024 - arxiv.org
Frequently, the burgeoning field of black-box optimization encounters challenges due to a
limited understanding of the mechanisms of the objective function. In this paper, we provide …

Reduced Network Cumulative Constraint Violation for Distributed Bandit Convex Optimization under Slater Condition

K Zhang, X Yi, J Ding, M Cao, KH Johansson… - arXiv preprint arXiv …, 2024 - arxiv.org
This paper studies the distributed bandit convex optimization problem with time-varying
inequality constraints, where the goal is to minimize network regret and cumulative …

On quasi-convex smooth optimization problems by a comparison oracle

AV Gasnikov, MS Alkousa, AV Lobanov… - arXiv preprint arXiv …, 2024 - arxiv.org
Frequently, when dealing with many machine learning models, optimization problems
appear to be challenging due to a limited understanding of the constructions and …
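
For reference, a function f is quasi-convex when every sublevel set is convex, equivalently

    f\bigl(\lambda x + (1 - \lambda) y\bigr) \;\le\; \max\{f(x),\, f(y)\} \qquad \forall\, x, y,\ \lambda \in [0, 1].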

Power of (L0, L1)-Smoothness in Stochastic Convex Optimization: First- and Zero-Order Algorithms

A Lobanov, A Gasnikov - arXiv preprint arXiv:2501.18198, 2025 - arxiv.org
This paper is devoted to the study of stochastic optimization problems under the generalized
smoothness assumption. By considering the unbiased gradient oracle in Stochastic Gradient …
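
The generalized smoothness referred to here is, on my reading of the truncated abstract, the (L0, L1)-smoothness condition, which for a twice-differentiable f is usually stated as

    \|\nabla^2 f(x)\| \;\le\; L_0 + L_1\,\|\nabla f(x)\|,

recovering standard L-smoothness when L_1 = 0.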