A survey on approximate edge AI for energy efficient autonomous driving services

D Katare, D Perino, J Nurmi, M Warnier… - … Surveys & Tutorials, 2023 - ieeexplore.ieee.org
Autonomous driving services depend on active sensing from modules such as cameras,
LiDAR, radar, and communication units. Traditionally, these modules process the sensed …

Revisiting random channel pruning for neural network compression

Y Li, K Adamczewski, W Li, S Gu… - Proceedings of the …, 2022 - openaccess.thecvf.com
Channel (or 3D filter) pruning serves as an effective way to accelerate the inference of
neural networks. There has been a flurry of algorithms that try to solve this practical problem …
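
As a rough illustration of the channel-pruning idea named in the entry above, the minimal NumPy sketch below (my own simplification, not the authors' method or code) removes a randomly chosen subset of output channels, i.e. whole 3D filters, from one convolutional layer and drops the matching input channels of the following layer.

import numpy as np

def random_channel_prune(conv_w, next_w, keep_ratio=0.5, seed=0):
    """Randomly keep a fraction of one conv layer's output channels.

    conv_w : weights of shape (out_ch, in_ch, kH, kW)
    next_w : weights of the following layer, shape (next_out, out_ch, kH, kW)
    Returns pruned copies of both weight tensors.
    """
    rng = np.random.default_rng(seed)
    out_ch = conv_w.shape[0]
    n_keep = max(1, int(out_ch * keep_ratio))
    keep = np.sort(rng.choice(out_ch, size=n_keep, replace=False))
    pruned = conv_w[keep]          # drop whole 3D filters in this layer ...
    next_pruned = next_w[:, keep]  # ... and the matching inputs of the next layer
    return pruned, next_pruned

# Example: prune half of the 64 channels between two 3x3 conv layers.
w1 = np.random.randn(64, 32, 3, 3)
w2 = np.random.randn(128, 64, 3, 3)
p1, p2 = random_channel_prune(w1, w2)
print(p1.shape, p2.shape)  # (32, 32, 3, 3) (128, 32, 3, 3)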

Recdis-snn: Rectifying membrane potential distribution for directly training spiking neural networks

Y Guo, X Tong, Y Chen, L Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
The brain-inspired and event-driven Spiking Neural Network (SNN) aims at mimicking the
synaptic activity of biological neurons, which transmits binary spike signals between network …
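
For readers unfamiliar with the binary-spike mechanism mentioned above, here is a minimal leaky integrate-and-fire neuron in NumPy. It is a textbook sketch of membrane-potential dynamics with a hard reset, not the RecDis-SNN rectification method itself.

import numpy as np

def lif_forward(inputs, tau=2.0, v_th=1.0):
    """Minimal leaky integrate-and-fire neuron over T time steps.

    inputs : array of shape (T,), input current per step
    Returns a binary spike train of shape (T,).
    """
    v = 0.0                        # membrane potential
    spikes = np.zeros_like(inputs)
    for t, x in enumerate(inputs):
        v = v + (x - v) / tau      # leaky integration of the input
        if v >= v_th:              # emit a binary spike at threshold
            spikes[t] = 1.0
            v = 0.0                # hard reset of the membrane potential
    return spikes

print(lif_forward(np.array([0.6, 0.9, 1.6, 0.3, 1.8])))  # [0. 0. 1. 0. 0.]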

Qdrop: Randomly dropping quantization for extremely low-bit post-training quantization

X Wei, R Gong, Y Li, X Liu, F Yu - arxiv preprint arxiv …, 2022 - arxiv.org

[Title and author line of the next entry were lost in extraction; only its abstract snippet remains.]
Developing neuromorphic intelligence on event-based datasets with Spiking Neural
Networks (SNNs) has recently attracted much research attention. However, the limited size …

Data-free knowledge distillation via feature exchange and activation region constraint

S Yu, J Chen, H Han, S Jiang - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Despite the tremendous progress on data-free knowledge distillation (DFKD) based on
synthetic data generation, there are still limitations in diverse and efficient data synthesis. It …
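
To make the data-free setting concrete, the PyTorch sketch below shows one generic DFKD training step: a generator supplies synthetic inputs in place of real data, and only the student is updated to match the frozen teacher's soft predictions. The models, shapes, and hyperparameters are illustrative assumptions, and this is not the feature-exchange method of the paper above.

import torch
import torch.nn as nn
import torch.nn.functional as F

def dfkd_step(generator, teacher, student, opt_s, batch=32, z_dim=100, T=4.0):
    """One generic data-free distillation step (student update only)."""
    z = torch.randn(batch, z_dim)
    x = generator(z).detach()          # synthetic samples, no real data used
    with torch.no_grad():
        t_logits = teacher(x)          # soft targets from the frozen teacher
    s_logits = student(x)
    # Distillation loss: student matches the teacher's softened predictions.
    loss = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                    F.softmax(t_logits / T, dim=1),
                    reduction="batchmean") * T * T
    opt_s.zero_grad()
    loss.backward()
    opt_s.step()
    return loss.item()

# Toy usage with made-up shapes (32-dim "images", 10 classes):
g = nn.Sequential(nn.Linear(100, 128), nn.ReLU(), nn.Linear(128, 32))
t = nn.Linear(32, 10)
s = nn.Linear(32, 10)
print(dfkd_step(g, t, s, torch.optim.Adam(s.parameters(), lr=1e-3)))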

Hard sample matters a lot in zero-shot quantization

H Li, X Wu, F Lv, D Liao, TH Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Zero-shot quantization (ZSQ) is promising for compressing and accelerating deep neural
networks when the data for training full-precision models are inaccessible. In ZSQ, network …
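
One common way ZSQ obtains calibration data without the original training set is to optimize random inputs until the BatchNorm statistics they induce match those stored in the pretrained model. The PyTorch sketch below illustrates that generic recipe only; it is a simplification and not the hard-sample scheme proposed in the paper above.

import torch
import torch.nn as nn

def synthesize_calib_data(model, shape=(8, 3, 32, 32), steps=100, lr=0.1):
    """Optimize random inputs so the BatchNorm statistics they induce match
    the running statistics stored in a pretrained model."""
    model.eval()
    x = torch.randn(shape, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)

    bn_layers = [m for m in model.modules() if isinstance(m, nn.BatchNorm2d)]
    feats = {}
    hooks = [m.register_forward_hook(
                 lambda mod, inp, out, m=m: feats.__setitem__(m, inp[0]))
             for m in bn_layers]

    for _ in range(steps):
        opt.zero_grad()
        model(x)
        loss = 0.0
        for m in bn_layers:
            f = feats[m]
            mean = f.mean(dim=(0, 2, 3))
            var = f.var(dim=(0, 2, 3), unbiased=False)
            loss = loss + ((mean - m.running_mean) ** 2).sum() \
                        + ((var - m.running_var) ** 2).sum()
        loss.backward()
        opt.step()

    for h in hooks:
        h.remove()
    return x.detach()

# Usage (assuming a pretrained CNN with BatchNorm2d layers):
# calib_data = synthesize_calib_data(model)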

Small scale data-free knowledge distillation

H Liu, Y Wang, H Liu, F Sun… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Data-free knowledge distillation is able to utilize the knowledge learned by a large teacher
network to augment the training of a smaller student network without accessing the original …

Data-free knowledge transfer: A survey

Y Liu, W Zhang, J Wang, J Wang - arxiv preprint arxiv:2112.15278, 2021 - arxiv.org
In the last decade, many deep learning models have been well trained and have achieved great
success in various fields of machine intelligence, especially for computer vision and natural …

MQBench: Towards reproducible and deployable model quantization benchmark

Y Li, M Shen, J Ma, Y Ren, M Zhao, Q Zhang… - arxiv preprint arxiv …, 2021 - arxiv.org
Model quantization has emerged as an indispensable technique to accelerate deep
learning inference. While researchers continue to push the frontier of quantization …
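
For reference, the elementary operation such benchmarks exercise is uniform affine quantization followed by dequantization ("fake quantization"). The NumPy sketch below is a textbook version of that operation and does not reflect MQBench's actual API.

import numpy as np

def quantize_dequantize(x, n_bits=8):
    """Uniform affine quantization then dequantization ("fake quant"),
    the basic step simulated by PTQ/QAT pipelines."""
    qmin, qmax = 0, 2 ** n_bits - 1
    scale = (x.max() - x.min()) / (qmax - qmin)
    scale = scale if scale > 0 else 1.0
    zero_point = np.round(qmin - x.min() / scale)
    q = np.clip(np.round(x / scale + zero_point), qmin, qmax)  # integer grid
    return (q - zero_point) * scale                            # back to float

x = np.random.randn(1000).astype(np.float32)
x_q = quantize_dequantize(x, n_bits=4)
print(float(np.abs(x - x_q).max()))  # error shrinks as n_bits grows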