A primer on zeroth-order optimization in signal processing and machine learning: Principals, recent advances, and applications
Zeroth-order (ZO) optimization is a subset of gradient-free optimization that emerges in many
signal processing and machine learning (ML) applications. It is used for solving optimization …
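As a quick illustration of the kind of estimator the ZO methods surveyed in this primer build on, below is a minimal sketch of a two-point randomized gradient estimate that uses only function evaluations; the function name, smoothing radius mu, and direction count are illustrative choices, not values from the paper.

```python
import numpy as np

def zo_gradient_estimate(f, x, mu=1e-3, num_directions=20, rng=None):
    """Two-point randomized zeroth-order gradient estimate of f at x.

    For each random direction u, (f(x + mu*u) - f(x)) / mu approximates the
    directional derivative; averaging u-weighted differences over several
    directions approximates the full gradient using function values only.
    """
    rng = np.random.default_rng() if rng is None else rng
    grad = np.zeros_like(x)
    fx = f(x)
    for _ in range(num_directions):
        u = rng.standard_normal(x.size)
        grad += (f(x + mu * u) - fx) / mu * u
    return grad / num_directions

# Usage: zeroth-order gradient descent on a black-box quadratic.
f = lambda x: np.sum((x - 1.0) ** 2)
x = np.zeros(5)
for _ in range(200):
    x -= 0.05 * zo_gradient_estimate(f, x)
```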
Min-max optimization without gradients: Convergence and applications to black-box evasion and poisoning attacks
In this paper, we study the problem of constrained min-max optimization in a black-box
setting, where the desired optimizer cannot access the gradients of the objective function but …
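For intuition about this black-box min-max setting, the sketch below runs zeroth-order projected gradient descent-ascent, estimating both gradients from function values only; it is a hedged illustration of the general idea rather than the authors' algorithm, and the step size, smoothing radius, and box constraint are assumed values.

```python
import numpy as np

def zo_grad(f, x, mu=1e-3, k=10, rng=None):
    """Randomized finite-difference gradient estimate (function values only)."""
    rng = np.random.default_rng() if rng is None else rng
    g, fx = np.zeros_like(x), f(x)
    for _ in range(k):
        u = rng.standard_normal(x.size)
        g += (f(x + mu * u) - fx) / mu * u
    return g / k

def zo_min_max(phi, x0, y0, steps=300, lr=0.02, box=1.0):
    """Zeroth-order projected gradient descent-ascent on min_x max_y phi(x, y)."""
    x, y = x0.copy(), y0.copy()
    for _ in range(steps):
        gx = zo_grad(lambda x_: phi(x_, y), x)   # estimated gradient in x
        gy = zo_grad(lambda y_: phi(x, y_), y)   # estimated gradient in y
        x = np.clip(x - lr * gx, -box, box)      # descent step, projected onto the box
        y = np.clip(y + lr * gy, -box, box)      # ascent step, projected onto the box
    return x, y

# Toy strongly-convex-strongly-concave saddle problem.
phi = lambda x, y: float(x @ x - y @ y + x @ y)
x_star, y_star = zo_min_max(phi, np.ones(3), -np.ones(3))
```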
Distributed learning of fully connected neural networks using independent subnet training
Distributed machine learning (ML) can bring more computational resources to bear than
single-machine learning, thus enabling reductions in training time. Distributed learning …
The power of first-order smooth optimization for black-box non-smooth problems
A Gasnikov, A Novitskii, V Novitskii… - arXiv preprint arXiv …, 2022 - arxiv.org
Gradient-free/zeroth-order methods for black-box convex optimization have been
extensively studied in the last decade with the main focus on oracle calls complexity. In this …
A unified solution for privacy and communication efficiency in vertical federated learning
Vertical Federated Learning (VFL) is a collaborative machine learning paradigm
that enables multiple participants to jointly train a model on their private data without sharing …
Global convergence rate analysis of a generic line search algorithm with noise
In this paper, we develop convergence analysis of a modified line search method for
objective functions whose value is computed with noise and whose gradient estimates are …
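A common way to make a backtracking line search robust to bounded function-value noise is to relax the sufficient-decrease test by the noise level; the sketch below illustrates that idea under assumed constants (c1, tau, and the 2*eps_f relaxation) and is not the specific method analyzed in the paper.

```python
import numpy as np

def noisy_backtracking(f, x, d, g, eps_f, alpha0=1.0, c1=1e-4, tau=0.5, max_iter=30):
    """Backtracking line search with an Armijo condition relaxed by the
    function-value noise level eps_f.

    f     : noisy objective (returns f(x) + noise, |noise| <= eps_f)
    x, d  : current iterate and (approximate) descent direction
    g     : gradient estimate at x
    The sufficient-decrease test is loosened by 2*eps_f so that noise alone
    cannot cause an otherwise acceptable step to be rejected.
    """
    fx = f(x)
    alpha = alpha0
    slope = float(g @ d)
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c1 * alpha * slope + 2.0 * eps_f:
            return alpha
        alpha *= tau
    return alpha  # fall back to the smallest trial step

# Usage on a noisy quadratic.
rng = np.random.default_rng(0)
eps_f = 1e-3
f = lambda x: np.sum(x ** 2) + rng.uniform(-eps_f, eps_f)
x = np.array([2.0, -1.0])
g = 2 * x                      # exact gradient used as the estimate here
alpha = noisy_backtracking(f, x, -g, g, eps_f)
x_new = x - alpha * g
```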
Finite difference gradient approximation: To randomize or not?
K Scheinberg - INFORMS Journal on Computing, 2022 - pubsonline.informs.org
We discuss two classes of methods of approximating gradients of noisy black box functions—
the classical finite difference method and recently popular randomized finite difference …
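The two estimator classes being compared can be stated compactly; the sketch below implements a classical coordinate-wise forward difference and a randomized (Gaussian-direction) finite difference, with the increment h and sample count chosen only for illustration.

```python
import numpy as np

def fd_gradient(f, x, h=1e-5):
    """Classical coordinate-wise forward finite-difference gradient:
    one extra function evaluation per coordinate."""
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def randomized_fd_gradient(f, x, h=1e-5, num_samples=20, rng=None):
    """Randomized finite-difference (Gaussian smoothing) gradient estimate:
    averages directional differences along random Gaussian directions."""
    rng = np.random.default_rng() if rng is None else rng
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(num_samples):
        u = rng.standard_normal(x.size)
        g += (f(x + h * u) - fx) / h * u
    return g / num_samples

# Compare both estimators on a smooth test function.
f = lambda x: np.sum(np.sin(x))
x = np.linspace(0.0, 1.0, 8)
print(fd_gradient(f, x), randomized_fd_gradient(f, x))
```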
Direct training of SNN using local zeroth order method
Spiking neural networks (SNNs) are becoming increasingly popular for their low energy requirements in real-world tasks, with accuracy comparable to traditional ANNs. SNN training algorithms …
Practical feature inference attack in vertical federated learning during prediction in artificial Internet of Things
R Yang, J Ma, J Zhang, S Kumari… - IEEE Internet of …, 2023 - ieeexplore.ieee.org
The emergence of edge computing enables the combination of the Internet of Things
(IoT) and artificial intelligence (AI). The vertical federated learning (VFL) framework, usually …
On the numerical performance of finite-difference-based methods for derivative-free optimization
The goal of this paper is to investigate an approach for derivative-free optimization that has
not received sufficient attention in the literature and is yet one of the simplest to implement …
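The approach in question, driving a standard gradient method with finite-difference gradient estimates, is indeed simple to write down; a minimal sketch follows, with the difference interval, step size, and stopping tolerance as assumed values rather than the paper's recommendations.

```python
import numpy as np

def fd_gradient_descent(f, x0, h=1e-6, lr=1e-3, max_iter=20000, tol=1e-8):
    """Derivative-free optimization via plain gradient descent on
    forward finite-difference gradient estimates of f."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        fx = f(x)
        g = np.zeros_like(x)
        for i in range(x.size):            # one extra evaluation per coordinate
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        if np.linalg.norm(g) < tol:        # stop when the estimated gradient is tiny
            break
        x -= lr * g
    return x

# Minimize the Rosenbrock function using only function evaluations.
rosen = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
print(fd_gradient_descent(rosen, [-1.2, 1.0]))
```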