AsGrad: A sharp unified analysis of asynchronous-SGD algorithms
We analyze asynchronous-type algorithms for distributed SGD in the heterogeneous setting,
where each worker has its own computation and communication speeds, as well as data …
Noiseless privacy-preserving decentralized learning
Decentralized learning (DL) enables collaborative learning without a server and without
training data leaving the users' devices. However, the models shared in DL can still be used …
Assessment of Water Hydrochemical Parameters Using Machine Learning Tools
I Malashin, V Nelyub, A Borodulin, A Gantimurov… - Sustainability, 2025 - mdpi.com
Access to clean water is a fundamental human need, yet millions of people worldwide still
lack access to safe drinking water. Traditional water quality assessments, though reliable …
Decentralized Sporadic Federated Learning: A Unified Methodology with Generalized Convergence Guarantees
Decentralized Federated Learning (DFL) has received significant recent research attention,
capturing settings where both model updates and model aggregations--the two key FL …
Boosting Asynchronous Decentralized Learning with Model Fragmentation
Decentralized learning (DL) is an emerging technique that allows nodes on the web to
collaboratively train machine learning models without sharing raw data. Dealing with …
Dual-Delayed Asynchronous SGD for Arbitrarily Heterogeneous Data
We consider the distributed learning problem with data dispersed across multiple workers
under the orchestration of a central server. Asynchronous Stochastic Gradient Descent …
Asynchronous SGD with stale gradient dynamic adjustment for deep learning training
Asynchronous stochastic gradient descent (ASGD) is a computationally efficient algorithm
that speeds up deep learning training and plays an important role in distributed deep …