The evolution of distributed systems for graph neural networks and their origin in graph processing and deep learning: A survey
Graph neural networks (GNNs) are an emerging research field. This specialized deep
neural network architecture is capable of processing graph structured data and bridges the …
A field guide to federated optimization
Federated learning and analytics are a distributed approach for collaboratively learning
models (or statistics) from decentralized data, motivated by and designed for privacy …
Federated learning with buffered asynchronous aggregation
Scalability and privacy are two critical concerns for cross-device federated learning (FL)
systems. In this work, we identify that synchronous FL cannot scale efficiently beyond a few …
Advances and open problems in federated learning
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …
FreeLB: Enhanced adversarial training for natural language understanding
Adversarial training, which minimizes the maximal risk for label-preserving input
perturbations, has proved to be effective for improving the generalization of language …
Machine learning at the wireless edge: Distributed stochastic gradient descent over-the-air
We study federated machine learning (ML) at the wireless edge, where power- and
bandwidth-limited wireless devices with local datasets carry out distributed stochastic …
Sharper convergence guarantees for asynchronous SGD for distributed and federated learning
We study the asynchronous stochastic gradient descent algorithm, for distributed training
over $n$ workers that might be heterogeneous. In this algorithm, workers compute …
SAFA: A semi-asynchronous protocol for fast federated learning with low overhead
Federated learning (FL) has attracted increasing attention as a promising approach to
driving a vast number of end devices with artificial intelligence. However, it is very …
Cooperative SGD: A unified framework for the design and analysis of communication-efficient SGD algorithms
Communication-efficient SGD algorithms, which allow nodes to perform local updates and
periodically synchronize local models, are highly effective in improving the speed and …
Cooperative SGD: A unified framework for the design and analysis of local-update SGD algorithms
When training machine learning models using stochastic gradient descent (SGD) with a
large number of nodes or massive edge devices, the communication cost of synchronizing …