Fedict: Federated multi-task distillation for multi-access edge computing

Z Wu, S Sun, Y Wang, M Liu, Q Pan… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
The growing interest in intelligent services and privacy protection for mobile devices has
given rise to the widespread application of federated learning in Multi-access Edge …

Federated Learning on Non-iid Data via Local and Global Distillation

X Zheng, S Ying, F Zheng, J Yin… - … Conference on Web …, 2023 - ieeexplore.ieee.org
Most existing federated learning algorithms are based on the vanilla FedAvg scheme.
However, with the increase of data complexity and the number of model parameters, the …

FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation

L Chen, W Zhang, C Dong, D Zhao, X Zeng, S Qiao… - Entropy, 2024 - mdpi.com
Federated learning allows multiple parties to train models while jointly protecting user
privacy. However, traditional federated learning requires each client to have the same model …

Communication efficient federated learning via channel-wise dynamic pruning

B Tao, C Chen, H Chen - 2023 IEEE Wireless Communications …, 2023 - ieeexplore.ieee.org
Federated Learning (FL) has received widespread attention in 5G mobile edge networks (MENs)
as it enables collaborative training of deep learning models without disclosing users' private …

Federated Distillation: A Survey

L Li, J Gou, B Yu, L Du, Z Yi, D Tao - arXiv preprint arXiv:2404.08564, 2024 - arxiv.org
Federated Learning (FL) seeks to train a model collaboratively without sharing private
training data from individual clients. Despite its promise, FL encounters challenges such as …

Advances in Robust Federated Learning: A Survey with Heterogeneity Considerations

C Chen, T Liao, X Deng, Z Wu… - IEEE Transactions on …, 2025 - ieeexplore.ieee.org
In the field of heterogeneous federated learning (FL), the key challenge is to efficiently and
collaboratively train models across multiple clients with different data distributions, model …

LDGAN: Latent Determined Ensemble Helps Removing IID Data Assumption and Cross-node Sampling in Distributed GANs

W Wang, Z Wu, X Xiang, Y Li - 2022 26th International …, 2022 - ieeexplore.ieee.org
Generative Adversarial Networks (GANs) have received a lot of attention due to their
powerful generative ability, and many related studies have been carried out. Among them …

An Adaptive Aggregation Method for Federated Learning via Meta Controller

T Shen, Z Li, Z Zhao, D Zhu, Z Lv, S Zhang… - Proceedings of the 6th …, 2024 - dl.acm.org
Federated learning (FL) has emerged as a novel machine learning setting that enables
collaborative training of deep models on decentralized clients with privacy constraints. In …

Knowledge Distillation Enables Federated Learning: A Data-free Federated Aggregation Scheme

J Huang, Y Zhang, R Bi, J Lin… - 2024 International Joint …, 2024 - ieeexplore.ieee.org
Applying knowledge distillation (KD) in federated learning (FL) can transfer model
knowledge between clients' local models and the global model, which helps to improve the …

SDWD: Style Diversity Weighted Distance Evaluates the Intra-Class Data Diversity of Distributed GANs

W Wang, Z Wu, M Zhang, Y Li - 2023 IEEE International …, 2023 - ieeexplore.ieee.org
Due to the distributed storage of massive data, efficient deployment of Generative
Adversarial Networks (GANs) in distributed scenarios has become a hot topic. This paper …