A review of convolutional neural network architectures and their optimizations

S Cong, Y Zhou - Artificial Intelligence Review, 2023 - Springer
Research advances in the typical architectures of convolutional neural
networks (CNNs), as well as their optimizations, are analyzed and elaborated in detail in this …

Visual tuning

BXB Yu, J Chang, H Wang, L Liu, S Wang… - ACM Computing …, 2024 - dl.acm.org
Fine-tuning visual models has been widely shown to deliver promising performance on many
downstream visual tasks. With the rapid development of pre-trained visual foundation …

Gold-YOLO: Efficient object detector via gather-and-distribute mechanism

C Wang, W He, Y Nie, J Guo, C Liu… - Advances in Neural …, 2024 - proceedings.neurips.cc
In recent years, YOLO-series models have emerged as the leading approaches in the area
of real-time object detection. Many studies have pushed the baseline to a higher level by …

YOLOv6: A single-stage object detection framework for industrial applications

C Li, L Li, H Jiang, K Weng, Y Geng, L Li, Z Ke… - arXiv preprint arXiv …, 2022 - arxiv.org
For years, the YOLO series has been the de facto industry-level standard for efficient object
detection. The YOLO community has prospered, enriching its use in a …

VanillaNet: the power of minimalism in deep learning

H Chen, Y Wang, J Guo, D Tao - Advances in Neural …, 2023 - proceedings.neurips.cc
At the heart of foundation models is the philosophy of "more is different", exemplified by the
astonishing success in computer vision and natural language processing. However, the …

Focal and global knowledge distillation for detectors

Z Yang, Z Li, X Jiang, Y Gong, Z Yuan… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation has been applied successfully to image classification.
However, object detection is much more sophisticated, and most knowledge distillation …

Masked generative distillation

Z Yang, Z Li, M Shao, D Shi, Z Yuan, C Yuan - European conference on …, 2022 - Springer
Knowledge distillation has been applied successfully to various tasks. Current
distillation algorithms usually improve students' performance by imitating the output of the …

One-for-All: Bridge the gap between heterogeneous architectures in knowledge distillation

Z Hao, J Guo, K Han, Y Tang, H Hu… - Advances in Neural …, 2023 - proceedings.neurips.cc
Knowledge distillation (KD) has proven to be a highly effective approach for
enhancing model performance through a teacher-student training scheme. However, most …

Knowledge diffusion for distillation

T Huang, Y Zhang, M Zheng, S You… - Advances in …, 2023 - proceedings.neurips.cc
The representation gap between teacher and student is an emerging topic in knowledge
distillation (KD). To reduce the gap and improve performance, current methods often …

MixFormerV2: Efficient fully transformer tracking

Y Cui, T Song, G Wu, L Wang - Advances in neural …, 2023 - proceedings.neurips.cc
Transformer-based trackers have achieved strong accuracy on standard benchmarks.
However, their efficiency remains an obstacle to practical deployment on both GPU and …