Holistic network virtualization and pervasive network intelligence for 6G

X Shen, J Gao, W Wu, M Li, C Zhou… - … Surveys & Tutorials, 2021 - ieeexplore.ieee.org
In this tutorial paper, we look into the evolution and prospect of network architecture and
propose a novel conceptual architecture for the 6th generation (6G) networks. The proposed …

Weakly supervised object localization and detection: A survey

D Zhang, J Han, G Cheng… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
As an emerging and challenging problem in the computer vision community, weakly
supervised object localization and detection plays an important role for developing new …

YOLOv6: A single-stage object detection framework for industrial applications

C Li, L Li, H Jiang, K Weng, Y Geng, L Li, Z Ke… - arXiv preprint arXiv …, 2022 - arxiv.org
For years, the YOLO series has been the de facto industry-level standard for efficient object
detection. The YOLO community has prospered overwhelmingly to enrich its use in a …

2DPASS: 2D priors assisted semantic segmentation on LiDAR point clouds

X Yan, J Gao, C Zheng, C Zheng, R Zhang… - … on Computer Vision, 2022 - Springer
As camera and LiDAR sensors capture complementary information in autonomous driving,
great efforts have been made to conduct semantic segmentation through multi-modality data …

Focal and global knowledge distillation for detectors

Z Yang, Z Li, X Jiang, Y Gong, Z Yuan… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation has been applied to image classification successfully.
However, object detection is much more sophisticated and most knowledge distillation …

Knowledge distillation with the reused teacher classifier

D Chen, JP Mei, H Zhang, C Wang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation aims to compress a powerful yet cumbersome teacher model
into a lightweight student model without much sacrifice of performance. For this purpose …

Masked generative distillation

Z Yang, Z Li, M Shao, D Shi, Z Yuan, C Yuan - European Conference on …, 2022 - Springer
Knowledge distillation has been applied to various tasks successfully. The current
distillation algorithm usually improves students' performance by imitating the output of the …

Knowledge diffusion for distillation

T Huang, Y Zhang, M Zheng, S You… - Advances in …, 2023 - proceedings.neurips.cc
The representation gap between teacher and student is an emerging topic in knowledge
distillation (KD). To reduce the gap and improve the performance, current methods often …

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …

Filtering, distillation, and hard negatives for vision-language pre-training

F Radenovic, A Dubey, A Kadian… - Proceedings of the …, 2023 - openaccess.thecvf.com
Vision-language models trained with contrastive learning on large-scale noisy data are
becoming increasingly popular for zero-shot recognition problems. In this paper we improve …