Diffusion Model Empowered Efficient Data Distillation Method for Cloud-Edge Collaboration
Z Chai, Y Lin, Z Gao, X Yu, Z **e - IEEE Transactions on …, 2025 - ieeexplore.ieee.org
The application of AI-generated models demands substantial amounts of data, which not
only increases training time and memory consumption but also poses challenges to …
Peak-Controlled Logits Poisoning Attack in Federated Distillation
Federated Distillation (FD) offers an innovative approach to distributed machine learning,
leveraging knowledge distillation for efficient and flexible cross-device knowledge transfer …
Beyond Model Scale Limits: End-Edge-Cloud Federated Learning with Self-Rectified Knowledge Agglomeration
The rise of End-Edge-Cloud Collaboration (EECC) offers a promising paradigm for Artificial
Intelligence (AI) model training across end devices, edge servers, and cloud data centers …
Knowledge Distillation in Federated Edge Learning: A Survey
The increasing demand for intelligent services coupled with privacy protection of mobile and
Internet of Things (IoT) devices motivates the widespread adoption of Federated Edge …