Federated learning for 6G: Applications, challenges, and opportunities
Standard machine-learning approaches involve the centralization of training data in a data
center, where centralized machine-learning algorithms can be applied for data analysis and …
Variance reduced ProxSkip: Algorithm, theory and application to federated learning
We study distributed optimization methods based on the {\em local training (LT)} paradigm,
i.e., methods which achieve communication efficiency by performing richer local gradient …
FedDC: Federated learning with non-iid data via local drift decoupling and correction
Federated learning (FL) allows multiple clients to collectively train a high-performance
global model without sharing their private data. However, the key challenge in federated …
Fine-tuning global model via data-free knowledge distillation for non-iid federated learning
Federated Learning (FL) is an emerging distributed learning paradigm under privacy
constraint. Data heterogeneity is one of the main challenges in FL, which results in slow …
Towards understanding biased client selection in federated learning
Federated learning is a distributed optimization paradigm that enables a large number of
resource-limited client nodes to cooperatively train a model without data sharing. Previous …
Personalized federated learning with theoretical guarantees: A model-agnostic meta-learning approach
Abstract In Federated Learning, we aim to train models across multiple computing units
(users), while users can only communicate with a common central server, without …
Federated learning based on dynamic regularization
We propose a novel federated learning method for distributively training neural network
models, where the server orchestrates cooperation between a subset of randomly chosen …
Tackling the objective inconsistency problem in heterogeneous federated optimization
In federated learning, heterogeneity in the clients' local datasets and computation speeds
results in large variations in the number of local updates performed by each client in each …
ProxSkip: Yes! Local gradient steps provably lead to communication acceleration! Finally!
We introduce ProxSkip—a surprisingly simple and provably efficient method for minimizing
the sum of a smooth ($ f $) and an expensive nonsmooth proximable ($\psi $) function. The …
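The ProxSkip abstract above states its problem as minimizing the sum of a smooth $f$ and an expensive proximable $\psi$, with most iterations skipping the prox. A minimal NumPy sketch of that idea, assuming the Bernoulli-gated prox step with a control variate `h` as in the paper's Algorithm 1; the quadratic $f$, the $\ell_1$ choice of $\psi$, and all step-size and probability values here are illustrative, not the paper's experiments:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proxskip(grad_f, prox_psi, x0, gamma, p, iters, rng):
    """Gradient steps on f every iteration; the expensive prox on psi
    is applied only with probability p, corrected by control variate h."""
    x = x0.copy()
    h = np.zeros_like(x0)
    for _ in range(iters):
        x_hat = x - gamma * (grad_f(x) - h)
        if rng.random() < p:            # rare prox step (step size gamma/p)
            x_new = prox_psi(x_hat - (gamma / p) * h, gamma / p)
            h = h + (p / gamma) * (x_new - x_hat)
        else:                           # cheap skipped step
            x_new = x_hat
        x = x_new
    return x

# Toy problem: f(x) = 0.5*||x - a||^2, psi(x) = lam*||x||_1,
# whose exact minimizer is soft_threshold(a, lam) = [1.5, 0, 0.5].
a = np.array([2.0, -0.3, 1.0])
lam = 0.5
rng = np.random.default_rng(0)
x_star = proxskip(lambda x: x - a,
                  lambda v, t: soft_threshold(v, lam * t),
                  np.zeros(3), gamma=0.5, p=0.2, iters=2000, rng=rng)
print(np.round(x_star, 3))  # close to [1.5, 0, 0.5]
```

In the federated reading, the prox encodes the consensus constraint, so skipping it with probability $p$ corresponds to communicating only on a fraction of rounds while still training locally every round.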
Personalized federated learning with moreau envelopes
Federated learning (FL) is a decentralized and privacy-preserving machine learning
technique in which a group of clients collaborate with a server to learn a global model …