Effective whole-body pose estimation with two-stages distillation

Z Yang, A Zeng, C Yuan, Y Li - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Whole-body pose estimation localizes the human body, hand, face, and foot keypoints in an
image. This task is challenging due to multi-scale body parts, fine-grained localization for …

Focal and global knowledge distillation for detectors

Z Yang, Z Li, X Jiang, Y Gong, Z Yuan… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation has been applied to image classification successfully.
However, object detection is much more sophisticated and most knowledge distillation …

Distilling object detectors via decoupled features

J Guo, K Han, Y Wang, H Wu… - Proceedings of the …, 2021 - openaccess.thecvf.com
Knowledge distillation is a widely used paradigm for inheriting information from a
complicated teacher network to a compact student network and maintaining the strong …

General instance distillation for object detection

X Dai, Z Jiang, Z Wu, Y Bao, Z Wang… - Proceedings of the …, 2021 - openaccess.thecvf.com
In recent years, knowledge distillation has proven to be an effective solution for model
compression. This approach enables lightweight student models to acquire the knowledge …
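The entries above all build on the classic soft-target formulation of knowledge distillation. As a minimal illustration (a plain-Python sketch of Hinton-style temperature-scaled distillation, not code from any of the cited papers; function names and the temperature value are illustrative):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Soft-target distillation loss: KL(teacher || student) at temperature T,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s))
    return temperature ** 2 * kl
```

In practice this loss is added to the student's ordinary task loss with a weighting factor; the detection papers listed here differ mainly in *where* (features, boxes, ranks) rather than *whether* they apply such a divergence.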

Overcoming catastrophic forgetting in incremental object detection via elastic response distillation

T Feng, M Wang, H Yuan - … of the IEEE/CVF conference on …, 2022 - openaccess.thecvf.com
Traditional object detectors are ill-equipped for incremental learning. However, fine-tuning
directly on a well-trained detection model with only new data will lead to catastrophic …

Localization distillation for dense object detection

Z Zheng, R Ye, P Wang, D Ren… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation (KD) has demonstrated a powerful capability for learning
compact models in object detection. Previous KD methods for object detection mostly focus …
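As the title suggests, localization distillation transfers box-location knowledge rather than only classification scores. A hedged sketch, assuming each box edge is predicted as a discrete distribution over bins (GFL-style); the bin count, temperature, and function name are illustrative assumptions, not the paper's implementation:

```python
import math

def _softmax(logits, t=1.0):
    """Softmax with temperature t."""
    m = max(logits)
    exps = [math.exp((z - m) / t) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def localization_distillation_loss(student_edge_logits, teacher_edge_logits, t=10.0):
    """Sum of KL divergences between teacher and student per-edge bin
    distributions over the four box edges (left, top, right, bottom)."""
    loss = 0.0
    for s_logits, t_logits in zip(student_edge_logits, teacher_edge_logits):
        p_t = _softmax(t_logits, t)
        p_s = _softmax(s_logits, t)
        loss += sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s))
    return loss
```

The key difference from classification KD is that the distilled distributions describe *where* the box edges lie, so the student inherits the teacher's localization ambiguity as well as its point estimate.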

PKD: General distillation framework for object detectors via pearson correlation coefficient

W Cao, Y Zhang, J Gao, A Cheng… - Advances in Neural …, 2022 - proceedings.neurips.cc
Knowledge distillation (KD) is a widely used technique to train compact models in
object detection. However, there is still a lack of study on how to distill between …

Distilling object detectors with feature richness

Z Du, R Zhang, M Chang, S Liu… - Advances in …, 2021 - proceedings.neurips.cc
In recent years, large-scale deep models have achieved great success, but the huge
computational complexity and massive storage requirements make it a great challenge to …

Knowledge distillation for object detection via rank mimicking and prediction-guided feature imitation

G Li, X Li, Y Wang, S Zhang, Y Wu… - Proceedings of the AAAI …, 2022 - ojs.aaai.org
Knowledge Distillation (KD) is a widely used technique for transferring information from
cumbersome teacher models to compact student models, consequently realizing model …

Adaptive knowledge distillation for lightweight remote sensing object detectors optimizing

Y Yang, X Sun, W Diao, H Li, Y Wu… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Lightweight object detectors are currently gaining popularity in remote sensing.
In general, it is hard for lightweight detectors to achieve competitive performance compared …