Effective whole-body pose estimation with two-stages distillation
Whole-body pose estimation localizes the human body, hand, face, and foot keypoints in an
image. This task is challenging due to multi-scale body parts, fine-grained localization for …
Focal and global knowledge distillation for detectors
Knowledge distillation has been applied to image classification successfully.
However, object detection is much more sophisticated and most knowledge distillation …
Distilling object detectors via decoupled features
Knowledge distillation is a widely used paradigm for inheriting information from a
complicated teacher network to a compact student network and maintaining the strong …
General instance distillation for object detection
In recent years, knowledge distillation has been proved to be an effective solution for model
compression. This approach can make lightweight student models acquire the knowledge …
Overcoming catastrophic forgetting in incremental object detection via elastic response distillation
Traditional object detectors are ill-equipped for incremental learning. However, fine-tuning
directly on a well-trained detection model with only new data will lead to catastrophic …
Localization distillation for dense object detection
Knowledge distillation (KD) has witnessed its powerful capability in learning
compact models in object detection. Previous KD methods for object detection mostly focus …
PKD: General distillation framework for object detectors via Pearson correlation coefficient
Knowledge distillation (KD) is a widely-used technique to train compact models in
object detection. However, there is still a lack of study on how to distill between …
Distilling object detectors with feature richness
In recent years, large-scale deep models have achieved great success, but the huge
computational complexity and massive storage requirements make it a great challenge to …
Knowledge distillation for object detection via rank mimicking and prediction-guided feature imitation
Knowledge Distillation (KD) is a widely-used technology to inherit information from
cumbersome teacher models to compact student models, consequently realizing model …
Adaptive knowledge distillation for lightweight remote sensing object detectors optimizing
Lightweight object detectors are currently gaining popularity in remote sensing.
In general, it is hard for lightweight detectors to achieve competitive performance compared …