Adversarial attacks and defenses in machine learning-empowered communication systems and networks: A contemporary survey

Y Wang, T Sun, S Li, X Yuan, W Ni… - … Surveys & Tutorials, 2023 - ieeexplore.ieee.org
Adversarial attacks and defenses in machine learning and deep neural networks (DNNs) have
been gaining significant attention due to the rapidly growing applications of deep learning in …

Logit standardization in knowledge distillation

S Sun, W Ren, J Li, R Wang… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Knowledge distillation involves transferring soft labels from a teacher to a student
using a shared temperature-based softmax function. However, the assumption of a shared …

Curriculum temperature for knowledge distillation

Z Li, X Li, L Yang, B Zhao, R Song, L Luo, J Li… - Proceedings of the …, 2023 - ojs.aaai.org
Most existing distillation methods ignore the flexible role of the temperature in the loss
function and fix it as a hyper-parameter that can be decided by an inefficient grid search. In …

When object detection meets knowledge distillation: A survey

Z Li, P Xu, X Chang, L Yang, Y Zhang… - … on Pattern Analysis …, 2023 - ieeexplore.ieee.org
Object detection (OD) is a crucial computer vision task that has seen the development of
many algorithms and models over the years. While the performance of current OD models …

Localization distillation for dense object detection

Z Zheng, R Ye, P Wang, D Ren… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation (KD) has demonstrated a powerful capability for learning
compact models in object detection. Previous KD methods for object detection mostly focus …

PseCo: Pseudo labeling and consistency training for semi-supervised object detection

G Li, X Li, Y Wang, Y Wu, D Liang, S Zhang - European Conference on …, 2022 - Springer
In this paper, we delve into two key techniques in Semi-Supervised Object Detection
(SSOD), namely pseudo labeling and consistency training. We observe that these two …

PKD: General distillation framework for object detectors via Pearson correlation coefficient

W Cao, Y Zhang, J Gao, A Cheng… - Advances in Neural …, 2022 - proceedings.neurips.cc
Knowledge distillation (KD) is a widely used technique to train compact models in
object detection. However, there is still a lack of study on how to distill between …

CrossKD: Cross-head knowledge distillation for object detection

J Wang, Y Chen, Z Zheng, X Li… - Proceedings of the …, 2024 - openaccess.thecvf.com
Knowledge Distillation (KD) has been validated as an effective model compression
technique for learning compact object detectors. Existing state-of-the-art KD methods for …

Bridging cross-task protocol inconsistency for distillation in dense object detection

L Yang, X Zhou, X Li, L Qiao, Z Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Knowledge distillation (KD) has shown potential for learning compact models in
dense object detection. However, the commonly used softmax-based distillation ignores the …

Near-edge computing aware object detection: A review

A Setyanto, TB Sasongko, MA Fikri, IK Kim - IEEE Access, 2023 - ieeexplore.ieee.org
Object detection is a widely applied approach in addressing many real-world computer
vision challenges. Despite its importance, object detection is computationally intensive and …