Distilling knowledge via knowledge review
Knowledge distillation transfers knowledge from a teacher network to a student
network, with the goal of greatly improving the student network's performance. Previous …
NTIRE 2023 challenge on efficient super-resolution: Methods and results
This paper reviews the NTIRE 2023 challenge on efficient single-image super-resolution
with a focus on the proposed solutions and results. The aim of this challenge is to devise a …
Decoupled knowledge distillation
State-of-the-art distillation methods are mainly based on distilling deep features from
intermediate layers, while the significance of logit distillation is greatly overlooked. To …
Anomaly detection via reverse distillation from one-class embedding
Knowledge distillation (KD) achieves promising results on the challenging problem
of unsupervised anomaly detection (AD). The representation discrepancy of anomalies in …
Point-to-voxel knowledge distillation for lidar semantic segmentation
This article addresses the problem of distilling knowledge from a large teacher model to a
slim student network for LiDAR semantic segmentation. Directly employing previous …
Knowledge distillation from a stronger teacher
Unlike existing knowledge distillation methods, which focus on baseline settings where the
teacher models and training strategies are not as strong and competitive as state-of-the-art …
Focal and global knowledge distillation for detectors
Knowledge distillation has been successfully applied to image classification.
However, object detection is much more sophisticated and most knowledge distillation …
Bridging the gap between object and image-level representations for open-vocabulary detection
Existing open-vocabulary object detectors typically enlarge their vocabulary sizes by
leveraging different forms of weak supervision. This helps generalize to novel objects at …
Knowledge distillation with the reused teacher classifier
Knowledge distillation aims to compress a powerful yet cumbersome teacher model
into a lightweight student model without much sacrifice of performance. For this purpose …
Cross-image relational knowledge distillation for semantic segmentation
Current Knowledge Distillation (KD) methods for semantic segmentation often
guide the student to mimic the teacher's structured information generated from individual …