Distributed learning in wireless networks: Recent progress and future challenges
The next generation of wireless networks will enable many machine learning (ML) tools and
applications to efficiently analyze various types of data collected by edge devices for …
Communication-efficient distributed learning: An overview
Distributed learning is envisioned as the bedrock of next-generation intelligent networks,
where intelligent agents, such as mobile devices, robots, and sensors, exchange information …
Communication-efficient federated learning
Federated learning (FL) enables edge devices, such as Internet of Things devices (e.g.,
sensors), servers, and institutions (e.g., hospitals), to collaboratively train a machine learning …
Decentralized federated averaging
Federated averaging (FedAvg) is a communication-efficient algorithm for distributed training
with an enormous number of clients. In FedAvg, clients keep their data locally for privacy …
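The snippet above summarizes FedAvg: clients train locally on private data, and a server averages their models. A minimal sketch of one such scheme, under illustrative assumptions (a linear least-squares model, synthetic clients, and the names `local_update` / `fedavg_round` are ours, not from the paper):

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local gradient descent on a linear model (illustrative)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w -= lr * grad
    return w

def fedavg_round(global_w, clients):
    """Average client models, weighted by local dataset size; raw data never leaves a client."""
    updates = [local_update(global_w, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    return sum(s * w for s, w in zip(sizes / sizes.sum(), updates))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ true_w))  # noiseless local data for clarity

w = np.zeros(2)
for _ in range(50):
    w = fedavg_round(w, clients)
print(w)  # converges toward true_w
```

Only model weights cross the network each round, which is the privacy and communication argument the abstract alludes to.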
A survey of federated learning for edge computing: Research problems and solutions
Federated Learning is a machine learning scheme in which a shared prediction model can
be collaboratively learned by a number of distributed nodes using their locally stored data. It …
Fairness-aware agnostic federated learning
Federated learning is an emerging framework that builds centralized machine learning
models with training data distributed across multiple devices. Most of the previous works …
Adaptive quantization of model updates for communication-efficient federated learning
Communication of model updates between client nodes and the central aggregating server
is a major bottleneck in federated learning, especially in bandwidth-limited settings and high …
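The entry above concerns quantizing model updates to cut upload bandwidth. As a hedged sketch (not the paper's adaptive scheme), unbiased stochastic uniform quantization illustrates the basic bits-versus-error trade-off; the `quantize` helper is ours:

```python
import numpy as np

def quantize(v, bits, rng):
    """Unbiased stochastic uniform quantization of a vector to 2**bits levels."""
    levels = 2 ** bits - 1
    vmin, vmax = float(v.min()), float(v.max())
    scale = (vmax - vmin) / levels if vmax > vmin else 1.0
    normalized = (v - vmin) / scale
    lower = np.floor(normalized)
    # round up with probability equal to the fractional part -> unbiased in expectation
    q = lower + (rng.random(v.shape) < (normalized - lower))
    return vmin + q * scale

rng = np.random.default_rng(0)
update = np.linspace(-1.0, 1.0, 1000)  # stand-in for a model update vector
errors = {b: float(np.abs(quantize(update, b, rng) - update).max()) for b in (2, 4, 8)}
print(errors)  # max error shrinks as the bit budget grows
```

An adaptive variant, as the title suggests, would tune the bit budget per round or per client rather than fixing it in advance.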
Socialized learning: A survey of the paradigm shift for edge intelligence in networked systems
Amidst the robust impetus from artificial intelligence (AI) and big data, edge intelligence (EI)
has emerged as a nascent computing paradigm, synthesizing AI with edge computing (EC) …
Communication-efficient distributed deep learning: A comprehensive survey
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …
1-bit Adam: Communication-efficient large-scale training with Adam's convergence speed
Scalable training of large models (like BERT and GPT-3) requires careful optimization
rooted in model design, architecture, and system capabilities. From a system standpoint …