Diffusion Model Empowered Efficient Data Distillation Method for Cloud-Edge Collaboration

Z Chai, Y Lin, Z Gao, X Yu, Z **e - IEEE Transactions on …, 2025 - ieeexplore.ieee.org
The application of AI generative models demands substantial amounts of data, which not
only increases training time and memory consumption but also poses challenges to …

Peak-Controlled Logits Poisoning Attack in Federated Distillation

Y Tang, A Zhang, Z Wu, B Gao, T Wen, Y Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
Federated Distillation (FD) offers an innovative approach to distributed machine learning,
leveraging knowledge distillation for efficient and flexible cross-device knowledge transfer …

Beyond Model Scale Limits: End-Edge-Cloud Federated Learning with Self-Rectified Knowledge Agglomeration

Z Wu, S Sun, Y Wang, M Liu, K Xu, Q Pan… - arXiv preprint arXiv …, 2025 - arxiv.org
The rise of End-Edge-Cloud Collaboration (EECC) offers a promising paradigm for Artificial
Intelligence (AI) model training across end devices, edge servers, and cloud data centers …

Knowledge Distillation in Federated Edge Learning: A Survey

Z Wu, S Sun, Y Wang, M Liu, X Jiang, R Li, B Gao - Authorea Preprints, 2024 - techrxiv.org
The increasing demand for intelligent services coupled with privacy protection of mobile and
Internet of Things (IoT) devices motivates the widespread adoption of Federated Edge …