Socialized learning: A survey of the paradigm shift for edge intelligence in networked systems
Amidst the robust impetus from artificial intelligence (AI) and big data, edge intelligence (EI)
has emerged as a nascent computing paradigm, synthesizing AI with edge computing (EC) …
Digital twin enhanced federated reinforcement learning with lightweight knowledge distillation in mobile networks
High-speed mobile networks offer great potential for many future intelligent applications,
such as autonomous vehicles in smart transportation systems. Such networks provide the …
A cooperative vehicle-infrastructure system for road hazards detection with edge intelligence
Road hazards (RH) have long been a cause of serious traffic accidents. These
have posed a threat to the safety of drivers, passengers, and pedestrians, and have also …
Applications of knowledge distillation in remote sensing: A survey
With the ever-growing complexity of models in the field of remote sensing (RS), there is an
increasing demand for solutions that balance model accuracy with computational efficiency …
Online knowledge distillation via mutual contrastive learning for visual recognition
Teacher-free online Knowledge Distillation (KD) aims to collaboratively train an ensemble of
student models that distill knowledge from one another. Although existing …
Knowledge condensation distillation
Knowledge Distillation (KD) transfers the knowledge from a high-capacity teacher
network to strengthen a smaller student. Existing methods focus on excavating the …
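The teacher-to-student transfer that these KD entries build on can be sketched with the classic temperature-softened distillation loss (Hinton et al.); this is a generic illustration in NumPy, not the specific method of any paper listed above, and all function names here are illustrative:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()                      # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 as in the standard formulation.
    p = softmax(teacher_logits, T)       # soft targets from the teacher
    q = softmax(student_logits, T)       # student predictions
    return T**2 * float(np.sum(p * (np.log(p) - np.log(q))))

# The loss vanishes when the student reproduces the teacher's logits
# and grows as the two distributions diverge.
print(kd_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))   # → 0.0
print(kd_loss([0.1, 0.2, 0.3], [2.0, 0.5, -1.0]) > 0)  # → True
```

In training, this term is typically mixed with the ordinary cross-entropy on hard labels via a weighting coefficient.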
Adaptive hierarchy-branch fusion for online knowledge distillation
Online Knowledge Distillation (OKD) is designed to alleviate the dilemma that the
high-capacity pre-trained teacher model is not available. However, the existing methods …
Kd-lightnet: A lightweight network based on knowledge distillation for industrial defect detection
J Liu, H Li, F Zuo, Z Zhao, S Lu - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
At present, deep-learning-based methods perform well in public object detection
tasks. However, two problems remain to be solved for industrial defect detection: 1) …
Reaf: Remembering enhancement and entropy-based asymptotic forgetting for filter pruning
Neurologically, filter pruning is a procedure of forgetting followed by remembering (recovery).
Prevailing methods directly forget less important information from an unrobust baseline at …
Pathological image classification via embedded fusion mutual learning
Deep learning models have been widely used in pathological image classification.
However, most research employs complex but inefficient neural networks to implement this …