A primer on zeroth-order optimization in signal processing and machine learning: Principals, recent advances, and applications

S Liu, PY Chen, B Kailkhura, G Zhang… - IEEE Signal …, 2020 - ieeexplore.ieee.org
Zeroth-order (ZO) optimization is a subset of gradient-free optimization that emerges in many
signal processing and machine learning (ML) applications. It is used for solving optimization …
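The core idea surveyed in this primer — optimizing with function evaluations only, no gradients — can be sketched as follows. This is a minimal illustration, not code from the cited paper; the function names, step sizes, and the quadratic test problem are my own choices.

```python
import numpy as np

def zo_gradient_descent(f, x0, mu=1e-3, lr=0.1, steps=300, rng=None):
    """Minimize a black-box f using only function evaluations.

    Each step forms a two-point zeroth-order gradient estimate
        g ~ (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u,   u ~ N(0, I),
    and takes a plain gradient-descent step with it.
    """
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        u = rng.standard_normal(x.size)
        g = (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
        x -= lr * g
    return x

# Black-box quadratic with known minimizer c; only f-values are queried.
c = np.array([1.0, -2.0, 3.0])
f = lambda x: float(np.sum((x - c) ** 2))
x_star = zo_gradient_descent(f, np.zeros(3), steps=2000, rng=0)
```

The two-point estimate is an unbiased estimator of a smoothed gradient, which is why the plain descent loop above still converges in expectation despite never touching derivatives.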

Min-max optimization without gradients: Convergence and applications to black-box evasion and poisoning attacks

S Liu, S Lu, X Chen, Y Feng, K Xu… - International …, 2020 - proceedings.mlr.press
In this paper, we study the problem of constrained min-max optimization in a black-box
setting, where the desired optimizer cannot access the gradients of the objective function but …

Distributed learning of fully connected neural networks using independent subnet training

B Yuan, CR Wolfe, C Dun, Y Tang, A Kyrillidis… - Proceedings of the …, 2022 - par.nsf.gov
Distributed machine learning (ML) can bring more computational resources to bear than
single-machine learning, thus enabling reductions in training time. Distributed learning …

The power of first-order smooth optimization for black-box non-smooth problems

A Gasnikov, A Novitskii, V Novitskii… - arXiv preprint arXiv …, 2022 - arxiv.org
Gradient-free/zeroth-order methods for black-box convex optimization have been
extensively studied in the last decade with the main focus on oracle calls complexity. In this …

A unified solution for privacy and communication efficiency in vertical federated learning

G Wang, B Gu, Q Zhang, X Li… - Advances in Neural …, 2024 - proceedings.neurips.cc
Abstract Vertical Federated Learning (VFL) is a collaborative machine learning paradigm
that enables multiple participants to jointly train a model on their private data without sharing …

Global convergence rate analysis of a generic line search algorithm with noise

AS Berahas, L Cao, K Scheinberg - SIAM Journal on Optimization, 2021 - SIAM
In this paper, we develop convergence analysis of a modified line search method for
objective functions whose value is computed with noise and whose gradient estimates are …

Finite difference gradient approximation: To randomize or not?

K Scheinberg - INFORMS Journal on Computing, 2022 - pubsonline.informs.org
We discuss two classes of methods of approximating gradients of noisy black box functions—
the classical finite difference method and recently popular randomized finite difference …
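The two classes of estimators contrasted in this snippet can be sketched side by side. This is an illustrative comparison under my own parameter choices, not an implementation from the cited article.

```python
import numpy as np

def coordinate_fd(f, x, h=1e-6):
    """Classical finite differences: one forward evaluation per coordinate
    (d + 1 function calls total for a d-dimensional x)."""
    fx = f(x)
    g = np.empty_like(x)
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = h
        g[j] = (f(x + step) - fx) / h
    return g

def randomized_fd(f, x, h=1e-6, num_dirs=2000, rng=None):
    """Randomized finite differences: average directional-derivative
    estimates along random Gaussian directions instead of coordinate axes."""
    rng = np.random.default_rng(rng)
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(num_dirs):
        u = rng.standard_normal(x.size)
        g += (f(x + h * u) - fx) / h * u
    return g / num_dirs

# Smooth test function with known gradient cos(x).
f = lambda x: float(np.sum(np.sin(x)))
x = np.linspace(-1.0, 1.0, 4)
g_coord = coordinate_fd(f, x)
g_rand = randomized_fd(f, x, num_dirs=4000, rng=1)
```

The trade-off the article studies is visible even here: the coordinate-wise estimate is nearly exact after d + 1 calls, while the randomized estimate is noisy per direction and needs many samples to reach comparable accuracy, though each sample costs only one extra evaluation regardless of dimension.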

Direct training of SNN using local zeroth order method

B Mukhoty, V Bojkovic, W de Vazelhes… - Advances in …, 2023 - proceedings.neurips.cc
Spiking neural networks are becoming increasingly popular for their low energy requirement
in real-world tasks with accuracy comparable to traditional ANNs. SNN training algorithms …

Practical feature inference attack in vertical federated learning during prediction in artificial Internet of Things

R Yang, J Ma, J Zhang, S Kumari… - IEEE Internet of …, 2023 - ieeexplore.ieee.org
The emergence of edge computing guarantees the combination of the Internet of Things
(IoT) and artificial intelligence (AI). The vertical federated learning (VFL) framework, usually …

On the numerical performance of finite-difference-based methods for derivative-free optimization

HJM Shi, MQ Xuan, F Oztoprak… - … Methods and Software, 2023 - Taylor & Francis
The goal of this paper is to investigate an approach for derivative-free optimization that has
not received sufficient attention in the literature and is yet one of the simplest to implement …