Socialized learning: A survey of the paradigm shift for edge intelligence in networked systems

X Wang, Y Zhao, C Qiu, Q Hu… - … Surveys & Tutorials, 2024 - ieeexplore.ieee.org
Driven by strong momentum from artificial intelligence (AI) and big data, edge intelligence (EI)
has emerged as a nascent computing paradigm that synthesizes AI with edge computing (EC) …

Digital twin enhanced federated reinforcement learning with lightweight knowledge distillation in mobile networks

X Zhou, X Zheng, X Cui, J Shi, W Liang… - IEEE Journal on …, 2023 - ieeexplore.ieee.org
High-speed mobile networks offer great potential for many future intelligent applications,
such as autonomous vehicles in smart transportation systems. Such networks provide the …

A cooperative vehicle-infrastructure system for road hazards detection with edge intelligence

C Chen, G Yao, L Liu, Q Pei, H Song… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Road hazards (RH) have always been the cause of many serious traffic accidents. These
have posed a threat to the safety of drivers, passengers, and pedestrians, and have also …

Applications of knowledge distillation in remote sensing: A survey

Y Himeur, N Aburaed, O Elharrouss, I Varlamis… - Information …, 2024 - Elsevier
With the ever-growing complexity of models in the field of remote sensing (RS), there is an
increasing demand for solutions that balance model accuracy with computational efficiency …

Online knowledge distillation via mutual contrastive learning for visual recognition

C Yang, Z An, H Zhou, F Zhuang, Y Xu… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Teacher-free online Knowledge Distillation (KD) aims to train an ensemble of student
models collaboratively so that they distill knowledge from one another. Although existing …

Knowledge condensation distillation

C Li, M Lin, Z Ding, N Lin, Y Zhuang, Y Huang… - … on Computer Vision, 2022 - Springer
Knowledge Distillation (KD) transfers knowledge from a high-capacity teacher
network to strengthen a smaller student. Existing methods focus on excavating the …
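The teacher-to-student transfer that these KD entries describe can be sketched minimally as a temperature-softened KL divergence between teacher and student outputs; the function names here are illustrative, and the temperature T with the T² scaling follow the standard formulation from Hinton et al. rather than any one of the surveyed papers:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" in non-target classes.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, T=4.0):
    # KL(teacher || student) on temperature-softened predictions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# Identical logits incur zero distillation loss; mismatched logits a positive one.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
print(kd_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0.0)  # → True
```

In full training, this term is typically mixed with the ordinary cross-entropy on ground-truth labels via a weighting coefficient.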

Adaptive hierarchy-branch fusion for online knowledge distillation

L Gong, S Lin, B Zhang, Y Shen, K Li, R Qiao… - Proceedings of the …, 2023 - ojs.aaai.org
Online Knowledge Distillation (OKD) is designed to alleviate the dilemma that a
high-capacity pre-trained teacher model is unavailable. However, the existing methods …

KD-LightNet: A lightweight network based on knowledge distillation for industrial defect detection

J Liu, H Li, F Zuo, Z Zhao, S Lu - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
At present, deep-learning-based methods perform well on public object detection
tasks. However, there are still two problems to be solved for industrial defect detection: 1) …

REAF: Remembering enhancement and entropy-based asymptotic forgetting for filter pruning

X Zhang, W **e, Y Li, K Jiang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Neurologically, filter pruning is a procedure of forgetting and remembering recovering.
Prevailing methods directly forget less important information from an unrobust baseline at …

Pathological image classification via embedded fusion mutual learning

G Li, G Wu, G Xu, C Li, Z Zhu, Y Ye, H Zhang - … Signal Processing and …, 2023 - Elsevier
Deep learning models have been widely used in pathological image classification.
However, most studies employ complex but inefficient neural networks to implement this …