Visual tuning

BXB Yu, J Chang, H Wang, L Liu, S Wang… - ACM Computing …, 2024 - dl.acm.org
Fine-tuning visual models has been widely shown to achieve promising performance on many
downstream visual tasks. With the rapid development of pre-trained visual foundation …

Knowledge distillation from a stronger teacher

T Huang, S You, F Wang, C Qian… - Advances in Neural …, 2022 - proceedings.neurips.cc
Unlike existing knowledge distillation methods, which focus on baseline settings where the
teacher models and training strategies are not as strong and competitive as state-of-the-art …
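
For context, the "baseline settings" this snippet alludes to usually mean the classic logit-matching objective of Hinton et al.: match softened teacher and student output distributions with a KL divergence. A minimal PyTorch sketch (the function name and temperature value are illustrative, not taken from this paper):

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Classic logit distillation: KL divergence between softened teacher
    and student distributions, scaled by T^2 so gradient magnitudes stay
    comparable across temperatures."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_student, p_teacher,
                    reduction="batchmean") * temperature ** 2
```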

Focal and global knowledge distillation for detectors

Z Yang, Z Li, X Jiang, Y Gong, Z Yuan… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation has been applied to image classification successfully.
However, object detection is much more sophisticated, and most knowledge distillation …

Masked generative distillation

Z Yang, Z Li, M Shao, D Shi, Z Yuan, C Yuan - European Conference on …, 2022 - Springer
Knowledge distillation has been applied to various tasks successfully. Current
distillation algorithms usually improve students' performance by imitating the output of the …
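
The "imitating the output of the teacher" the snippet mentions is the standard feature-imitation setup; below is a hedged sketch of a masked generative variant, assuming student and teacher features share a channel count (real pipelines typically add a 1x1 projection when they differ). The class name, mask ratio, and two-conv generator are assumptions for illustration, not the paper's exact design:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedGenerativeDistill(nn.Module):
    """Sketch: randomly mask spatial positions of the student feature,
    then ask a small generation block to reconstruct the full teacher
    feature, penalized with an MSE loss."""
    def __init__(self, channels, mask_ratio=0.5):
        super().__init__()
        self.mask_ratio = mask_ratio
        self.generator = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, feat_student, feat_teacher):
        n, c, h, w = feat_student.shape
        # Keep each spatial position with probability (1 - mask_ratio).
        keep = (torch.rand(n, 1, h, w, device=feat_student.device)
                > self.mask_ratio).float()
        recon = self.generator(feat_student * keep)
        return F.mse_loss(recon, feat_teacher)
```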

Knowledge diffusion for distillation

T Huang, Y Zhang, M Zheng, S You… - Advances in …, 2023 - proceedings.neurips.cc
The representation gap between teacher and student is an emerging topic in knowledge
distillation (KD). To reduce the gap and improve the performance, current methods often …

Knowledge distillation via the target-aware transformer

S Lin, H Xie, B Wang, K Yu, X Chang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation has become a de facto standard for improving the performance of
small neural networks. Most previous works propose to regress the representational …
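
The feature-regression baseline that "most previous works" refers to can be sketched as a learned projection plus an L2 penalty. This is the baseline the snippet contrasts against, not the paper's target-aware transformer, and the names are illustrative:

```python
import torch.nn as nn
import torch.nn.functional as F

class FeatureRegressionKD(nn.Module):
    """Baseline feature regression: project the student feature to the
    teacher's channel width, then minimize the L2 distance."""
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        self.proj = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, feat_student, feat_teacher):
        return F.mse_loss(self.proj(feat_student), feat_teacher)
```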

Channel-wise knowledge distillation for dense prediction

C Shu, Y Liu, J Gao, Z Yan… - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
Knowledge distillation (KD) has proven to be a simple and effective tool for training
compact dense prediction models. Lightweight student networks are trained with extra …
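
A channel-wise distillation loss of the kind the title describes can be sketched as follows: each channel's activation map is normalized into a distribution over spatial positions, and the student is matched to the teacher with a KL divergence per channel. This assumes matched feature shapes and omits details of the published method:

```python
import torch
import torch.nn.functional as F

def channelwise_kd_loss(feat_student, feat_teacher, temperature=4.0):
    """Sketch of channel-wise distillation for dense prediction: softmax
    over spatial positions (per channel, with temperature), then KL
    divergence from teacher to student, averaged over batch and channels."""
    n, c, h, w = feat_student.shape
    s = feat_student.view(n, c, h * w)
    t = feat_teacher.view(n, c, h * w)
    log_p_s = F.log_softmax(s / temperature, dim=-1)
    p_t = F.softmax(t / temperature, dim=-1)
    kl = (p_t * (torch.log(p_t + 1e-8) - log_p_s)).sum(-1)
    return (temperature ** 2) * kl.mean()
```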

When object detection meets knowledge distillation: A survey

Z Li, P Xu, X Chang, L Yang, Y Zhang… - … on Pattern Analysis …, 2023 - ieeexplore.ieee.org
Object detection (OD) is a crucial computer vision task that has seen the development of
many algorithms and models over the years. While the performance of current OD models …

Automated knowledge distillation via monte carlo tree search

L Li, P Dong, Z Wei, Y Yang - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
In this paper, we present Auto-KD, the first automated search framework for optimal
knowledge distillation design. Traditional distillation techniques typically require handcrafted …
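
Monte Carlo tree search itself is standard; the UCB1 selection rule at its core looks like the sketch below. The dictionary node format and the distillation-design framing in the comments are hypothetical, not Auto-KD's actual search space:

```python
import math

def uct_select(children, total_visits, c=1.4):
    """UCB1 rule used in MCTS node selection: balance average reward
    (exploitation) against a bonus for rarely visited nodes (exploration)."""
    def ucb(node):
        if node["visits"] == 0:
            return float("inf")  # always try unvisited children first
        exploit = node["reward"] / node["visits"]
        explore = c * math.sqrt(math.log(total_visits) / node["visits"])
        return exploit + explore
    return max(children, key=ucb)

# Hypothetical usage: each child encodes one distillation design choice
# (e.g., which feature layer to distill); reward is validation accuracy.
children = [{"visits": 3, "reward": 2.1}, {"visits": 1, "reward": 0.9}]
print(uct_select(children, total_visits=4))
```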

Consistency-and dependence-guided knowledge distillation for object detection in remote sensing images

Y Chen, M Lin, Z He, K Polat, A Alhudhaif… - Expert Systems with …, 2023 - Elsevier
As one of the challenging tasks in remote sensing (RS), object detection has been
successfully applied in many fields. Convolutional neural networks (CNNs) have recently …