AsGrad: A sharp unified analysis of asynchronous-SGD algorithms

R Islamov, M Safaryan… - … Conference on Artificial …, 2024 - proceedings.mlr.press
We analyze asynchronous-type algorithms for distributed SGD in the heterogeneous setting,
where each worker has its own computation and communication speeds, as well as data …

Noiseless privacy-preserving decentralized learning

S Biswas, M Even, AM Kermarrec… - The 25th Privacy …, 2024 - hal.science
Decentralized learning (DL) enables collaborative learning without a server and without
training data leaving the users' devices. However, the models shared in DL can still be used …

Assessment of Water Hydrochemical Parameters Using Machine Learning Tools

I Malashin, V Nelyub, A Borodulin, A Gantimurov… - Sustainability, 2025 - mdpi.com
Access to clean water is a fundamental human need, yet millions of people worldwide still
lack access to safe drinking water. Traditional water quality assessments, though reliable …

Decentralized Sporadic Federated Learning: A Unified Methodology with Generalized Convergence Guarantees

S Zehtabi, DJ Han, R Parasnis… - arXiv preprint arXiv …, 2024 - arxiv.org
Decentralized Federated Learning (DFL) has received significant recent research attention,
capturing settings where both model updates and model aggregations, the two key FL …

Boosting Asynchronous Decentralized Learning with Model Fragmentation

S Biswas, AM Kermarrec, A Marouani, R Pires… - arXiv preprint arXiv …, 2024 - arxiv.org
Decentralized learning (DL) is an emerging technique that allows nodes on the web to
collaboratively train machine learning models without sharing raw data. Dealing with …

Dual-Delayed Asynchronous SGD for Arbitrarily Heterogeneous Data

X Wang, Y Sun, HT Wai, J Zhang - arXiv preprint arXiv:2405.16966, 2024 - arxiv.org
We consider the distributed learning problem with data dispersed across multiple workers
under the orchestration of a central server. Asynchronous Stochastic Gradient Descent …

Asynchronous SGD with stale gradient dynamic adjustment for deep learning training

T Tan, H **e, Y **a, X Shi, M Shang - Information Sciences, 2024 - Elsevier
Asynchronous stochastic gradient descent (ASGD) is a computationally efficient algorithm
that speeds up deep learning training and plays an important role in distributed deep …