Distilling knowledge via knowledge review

P Chen, S Liu, H Zhao, J Jia - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Knowledge distillation transfers knowledge from the teacher network to the student
one, with the goal of greatly improving the performance of the student network. Previous …

Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks

L Wang, KJ Yoon - IEEE transactions on pattern analysis and …, 2021 - ieeexplore.ieee.org
In recent years, deep neural models have been successful in almost every field, even
solving the most complex problems. However, these models are huge in size, with …

Decoupled knowledge distillation

B Zhao, Q Cui, R Song, Y Qiu… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
State-of-the-art distillation methods are mainly based on distilling deep features from
intermediate layers, while the significance of logit distillation is greatly overlooked. To …
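The logit distillation this entry contrasts with feature-based methods can be illustrated by the classic softened-softmax KD loss. This is a generic sketch, not the paper's decoupled formulation; the temperature `T` and weight `alpha` are illustrative hyperparameters, not values from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic logit distillation: KL divergence between temperature-softened
    teacher and student distributions, combined with cross-entropy on labels."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # T^2 factor keeps the soft term's gradient magnitude comparable to CE.
    kl = np.mean(np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)),
                        axis=1)) * T * T
    p_hard = softmax(student_logits, 1.0)
    ce = -np.mean(np.log(p_hard[np.arange(len(labels)), labels] + 1e-12))
    return alpha * kl + (1 - alpha) * ce
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the cross-entropy term remains.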

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …

Unifying voxel-based representation with transformer for 3d object detection

Y Li, Y Chen, X Qi, Z Li, J Sun… - Advances in Neural …, 2022 - proceedings.neurips.cc
In this work, we present a unified framework for multi-modality 3D object detection, named
UVTR. The proposed method aims to unify multi-modality representations in the voxel space …

Focal and global knowledge distillation for detectors

Z Yang, Z Li, X Jiang, Y Gong, Z Yuan… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation has been applied successfully to image classification.
However, object detection is much more sophisticated, and most knowledge distillation …

Knowledge distillation from a stronger teacher

T Huang, S You, F Wang, C Qian… - Advances in Neural …, 2022 - proceedings.neurips.cc
Unlike existing knowledge distillation methods, which focus on baseline settings where the
teacher models and training strategies are not as strong and competitive as state-of-the-art …

Cross-image relational knowledge distillation for semantic segmentation

C Yang, H Zhou, Z An, X Jiang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Current Knowledge Distillation (KD) methods for semantic segmentation often
guide the student to mimic the teacher's structured information generated from individual …
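The "structured information" such relational methods transfer can be sketched as matching pairwise similarity matrices between pixel embeddings. This is a simplified single-batch illustration, not the paper's cross-image formulation; the function name and MSE matching objective are assumptions for the sketch:

```python
import numpy as np

def relational_kd_loss(emb_s, emb_t):
    """Sketch of relational (structured) distillation: build pixel-to-pixel
    cosine-similarity matrices for student and teacher embeddings and match
    them with MSE. emb_s, emb_t: (P, C) arrays of P pixel embeddings."""
    def sim(x):
        x = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-12)
        return x @ x.T  # (P, P) cosine similarities between all pixel pairs
    return np.mean((sim(emb_s) - sim(emb_t)) ** 2)
```

Because only relative similarities are matched, the student is free to use a different embedding space than the teacher as long as the pairwise structure agrees.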

Masked generative distillation

Z Yang, Z Li, M Shao, D Shi, Z Yuan, C Yuan - European Conference on …, 2022 - Springer
Knowledge distillation has been applied to various tasks successfully. Current
distillation algorithms usually improve students' performance by imitating the output of the …
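The masked-generation idea can be sketched as: randomly drop spatial positions of the student's feature map, then regress the teacher's full feature map from what remains. This is a heavily simplified sketch; the single linear projection `W` stands in for the paper's convolutional generation block, and the shapes and mask scheme are assumptions:

```python
import numpy as np

def masked_generative_distill_loss(feat_s, feat_t, W, mask_ratio=0.5, rng=None):
    """Simplified masked feature distillation: zero out random spatial
    positions of the student features, project the masked map through a
    generator W, and regress the teacher's full feature map with MSE.
    feat_s, feat_t: (N, H*W, C) arrays; W: (C, C) projection."""
    rng = rng or np.random.default_rng(0)
    n, hw, c = feat_s.shape
    # Binary spatial mask: 0 drops a position, 1 keeps it.
    keep = (rng.random((n, hw, 1)) > mask_ratio).astype(feat_s.dtype)
    generated = (feat_s * keep) @ W  # generator must reconstruct masked spots
    return np.mean((generated - feat_t) ** 2)
```

Forcing the generator to reconstruct teacher features at masked positions pushes the student's remaining features to carry more context than plain feature mimicking would.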

Neural feature fusion fields: 3d distillation of self-supervised 2d image representations

V Tschernezki, I Laina, D Larlus… - … Conference on 3D …, 2022 - ieeexplore.ieee.org
We present Neural Feature Fusion Fields (N3F), a method that improves dense 2D image
feature extractors when the latter are applied to the analysis of multiple images …