A comprehensive survey on test-time adaptation under distribution shifts

J Liang, R He, T Tan - International Journal of Computer Vision, 2024 - Springer
Machine learning methods strive to acquire a robust model during the training
process that can effectively generalize to test samples, even in the presence of distribution …
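
A representative recipe covered by test-time adaptation surveys is entropy minimization in the style of Tent: the deployed model updates a small set of parameters on each unlabeled test batch. A minimal PyTorch sketch, assuming a pretrained classifier with batch-norm layers (the loop and hyperparameters are illustrative, not the survey's prescription):

```python
import torch

def entropy(logits):
    # Shannon entropy of the softmax prediction, averaged over the batch.
    probs = logits.softmax(dim=1)
    return -(probs * probs.log().clamp(min=-20)).sum(dim=1).mean()

def adapt_step(model, x, optimizer):
    # One Tent-style step: minimize prediction entropy on an unlabeled test
    # batch, typically updating only the batch-norm affine parameters.
    model.train()                    # BN layers use the test-batch statistics
    logits = model(x)
    loss = entropy(logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return logits

# Usually only BN affine parameters are adapted:
# params = [p for m in model.modules() if isinstance(m, torch.nn.BatchNorm2d)
#           for p in (m.weight, m.bias) if p is not None]
# optimizer = torch.optim.SGD(params, lr=1e-3)
```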

Adaptive data-free quantization

B Qian, Y Wang, R Hong… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Data-free quantization (DFQ) recovers the performance of a quantized network (Q) without the
original data, generating fake samples via a generator (G) that learns from the full …
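
The shared recipe in generative DFQ: train G against the frozen full-precision network P so that fake samples match P's stored batch-norm statistics and are classified confidently, then use those samples to calibrate or fine-tune Q. A minimal PyTorch sketch (the conditional generator `G(z, y)` and all hyperparameters are assumptions, not this paper's exact objective):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def fake_data_step(G, P, opt_g, batch=64, z_dim=100, n_cls=10):
    # One generator update: fake samples should (a) reproduce the running BN
    # statistics stored in the full-precision network P and (b) be confidently
    # classified under their sampled labels.
    stats = []
    def hook(m, inp, out):
        x = inp[0]
        stats.append((x.mean(dim=(0, 2, 3)),
                      x.var(dim=(0, 2, 3), unbiased=False), m))
    handles = [m.register_forward_hook(hook) for m in P.modules()
               if isinstance(m, nn.BatchNorm2d)]

    z = torch.randn(batch, z_dim)
    y = torch.randint(0, n_cls, (batch,))
    x_fake = G(z, y)                   # hypothetical conditional generator
    logits = P(x_fake)                 # P stays frozen, in eval mode
    for h in handles:
        h.remove()

    bn_loss = sum(F.mse_loss(mu, m.running_mean) + F.mse_loss(var, m.running_var)
                  for mu, var, m in stats)
    loss = F.cross_entropy(logits, y) + bn_loss
    opt_g.zero_grad(); loss.backward(); opt_g.step()
    return x_fake.detach()             # later used to calibrate/fine-tune Q
```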

IntraQ: Learning synthetic images with intra-class heterogeneity for zero-shot network quantization

Y Zhong, M Lin, G Nan, J Liu… - Proceedings of the …, 2022 - openaccess.thecvf.com
Learning to synthesize data has emerged as a promising direction in zero-shot quantization
(ZSQ), which represents neural networks with low-bit integers without accessing any of the real …
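
"Represents neural networks with low-bit integers" means replacing float weights with a handful of integer levels plus a scale factor. A minimal symmetric uniform quantizer in PyTorch (illustrative; ZSQ methods differ in how the scale and rounding are chosen):

```python
import torch

def uniform_quantize(w, n_bits=4):
    # Symmetric uniform quantization: map float weights to signed low-bit
    # integers, then de-quantize for simulated low-precision inference.
    qmax = 2 ** (n_bits - 1) - 1
    scale = w.abs().max() / qmax
    w_int = torch.clamp(torch.round(w / scale), -qmax - 1, qmax)
    return w_int * scale, w_int, scale

w = torch.randn(64, 3, 3, 3)            # e.g., one conv layer's weights
w_dq, w_int, scale = uniform_quantize(w, n_bits=4)
print((w - w_dq).abs().mean())          # mean quantization error
```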

Data-free quantization via mixed-precision compensation without fine-tuning

J Chen, S Bai, T Huang, M Wang, G Tian, Y Liu - Pattern Recognition, 2023 - Elsevier
Neural network quantization is a very promising solution in the field of model compression,
but the resulting accuracy depends heavily on a training/fine-tuning process and requires the …
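
One way to read "mixed-precision compensation without fine-tuning": spend extra bits where quantization hurts most, decided purely from the weights, with no training loop. The sketch below ranks layers by their quantization MSE at a base bit-width and keeps the most sensitive fraction at higher precision; the criterion and fraction are assumptions, not this paper's compensation scheme:

```python
import torch
import torch.nn as nn

def quant_error(w, n_bits):
    # MSE between a layer's weights and their uniformly quantized version.
    qmax = 2 ** (n_bits - 1) - 1
    scale = w.abs().max() / qmax
    w_q = torch.clamp(torch.round(w / scale), -qmax - 1, qmax) * scale
    return (w - w_q).pow(2).mean().item()

def assign_bitwidths(model, base_bits=4, hi_bits=8, frac=0.25):
    # Rank layers by quantization sensitivity at the base width; the most
    # sensitive fraction gets higher precision, the rest stay low-bit.
    layers = [(n, m) for n, m in model.named_modules()
              if isinstance(m, (nn.Conv2d, nn.Linear))]
    ranked = sorted(layers, key=lambda nm: quant_error(nm[1].weight, base_bits),
                    reverse=True)
    n_hi = max(1, int(frac * len(ranked)))
    return {n: (hi_bits if i < n_hi else base_bits)
            for i, (n, _) in enumerate(ranked)}
```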

Qimera: Data-free quantization with synthetic boundary supporting samples

K Choi, D Hong, N Park, Y Kim… - Advances in Neural …, 2021 - proceedings.neurips.cc
Model quantization is a promising method to compress deep neural
networks, especially for inference on lightweight mobile or edge devices. However, model …
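
Qimera's "boundary supporting samples" come from superposing class embeddings so the generator synthesizes inputs near decision boundaries between classes. A minimal sketch of that superposition (the embedding table and generator interface are hypothetical):

```python
import torch

def boundary_latents(emb, y_a, y_b, alpha=0.5):
    # Superpose two class embeddings so the generator produces samples near
    # the decision boundary between classes y_a and y_b -- the idea behind
    # "boundary supporting samples"; the details are Qimera's.
    return alpha * emb(y_a) + (1 - alpha) * emb(y_b)

emb = torch.nn.Embedding(10, 100)        # hypothetical class-embedding table
y_a = torch.randint(0, 10, (32,))
y_b = torch.randint(0, 10, (32,))
z_mix = boundary_latents(emb, y_a, y_b)  # fed to the generator as its input
```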

Learning to retain while acquiring: Combating distribution-shift in adversarial data-free knowledge distillation

G Patel, KR Mopuri, Q Qiu - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Data-free Knowledge Distillation (DFKD) has gained popularity recently, with the
fundamental idea of transferring knowledge from a Teacher neural network to a …
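
Adversarial DFKD alternates between a generator that seeks samples where the student disagrees with the teacher and a student that closes the gap on those samples; because the generator keeps moving, the pseudo-data distribution shifts over training, which is the failure mode this paper targets. A minimal PyTorch round (names and hyperparameters are assumptions):

```python
import torch
import torch.nn.functional as F

def kl_disagreement(t_logits, s_logits, T=1.0):
    # KL(teacher || student) on temperature-softened predictions: the
    # transfer signal in adversarial data-free knowledge distillation.
    return F.kl_div(F.log_softmax(s_logits / T, dim=1),
                    F.softmax(t_logits / T, dim=1),
                    reduction="batchmean") * T * T

def dfkd_round(G, teacher, student, opt_g, opt_s, z_dim=100, batch=64):
    # Generator step: maximize teacher-student disagreement.
    z = torch.randn(batch, z_dim)
    x = G(z)
    loss_g = -kl_disagreement(teacher(x).detach(), student(x))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    # Student step: close the gap on freshly generated samples.
    x = G(torch.randn(batch, z_dim)).detach()
    loss_s = kl_disagreement(teacher(x).detach(), student(x))
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()
```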

Hard sample matters a lot in zero-shot quantization

H Li, X Wu, F Lv, D Liao, TH Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Zero-shot quantization (ZSQ) is promising for compressing and accelerating deep neural
networks when the data for training full-precision models are inaccessible. In ZSQ, network …
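
The premise: easy synthetic samples carry little signal when fine-tuning the quantized network, so hard ones should dominate the loss. A focal-loss-style reweighting is one generic way to express that (a stand-in for, not a reproduction of, this paper's hard-sample treatment):

```python
import torch
import torch.nn.functional as F

def hardness_weighted_loss(q_logits, labels, gamma=2.0):
    # Upweight hard synthetic samples, i.e., those the quantized network
    # still classifies with low confidence, focal-loss style.
    ce = F.cross_entropy(q_logits, labels, reduction="none")
    p_true = q_logits.softmax(dim=1).gather(1, labels[:, None]).squeeze(1)
    return ((1 - p_true) ** gamma * ce).mean()
```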

SQuant: On-the-fly data-free quantization via diagonal Hessian approximation

C Guo, Y Qiu, J Leng, X Gao, C Zhang, Y Liu… - arXiv preprint arXiv …, 2022 - arxiv.org
Quantization of deep neural networks (DNNs) has proven effective for compressing and
accelerating DNN models. Data-free quantization (DFQ) is a promising approach without the …
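
SQuant rounds weights on the fly by minimizing a diagonal-Hessian-approximated objective rather than plain round-to-nearest, flipping selected elements so accumulated rounding error cancels. A heavily simplified sketch of flip-based compensation under a constant diagonal approximation (not SQuant's exact algorithm):

```python
import torch

def flip_compensated_round(w, scale):
    # Round-to-nearest, then flip the elements with the largest residuals so
    # the summed rounding error per output channel roughly cancels.
    q = torch.round(w / scale)
    for c in range(w.shape[0]):                      # per output channel
        r = w[c] / scale - q[c]                      # signed residuals
        k = int(torch.round(r.sum()).abs().item())   # flips needed
        if k == 0:
            continue
        sign = 1.0 if r.sum() > 0 else -1.0
        idx = torch.topk(sign * r.flatten(), k).indices
        q[c].view(-1)[idx] += sign                   # flip toward the error
    return q * scale
```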

Diverse sample generation: Pushing the limit of generative data-free quantization

H Qin, Y Ding, X Zhang, J Wang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Generative data-free quantization has emerged as a practical compression approach that
quantizes deep neural networks to low bit-widths without accessing real data. This …
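
"Pushing the limit" here is, in part, about avoiding mode collapse: generated batches should span the feature space rather than repeat a few prototypes. A generic pairwise-similarity penalty is one way to write a diversity objective (illustrative, not this paper's loss):

```python
import torch
import torch.nn.functional as F

def diversity_loss(feats):
    # Push generated samples apart in feature space by penalizing pairwise
    # cosine similarity within the batch.
    f = F.normalize(feats.flatten(1), dim=1)
    sim = f @ f.t()
    off_diag = sim - torch.eye(len(f), device=f.device)
    return off_diag.pow(2).sum() / (len(f) * (len(f) - 1))
```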

It's all in the teacher: Zero-shot quantization brought closer to the teacher

K Choi, HY Lee, D Hong, J Yu… - Proceedings of the …, 2022 - openaccess.thecvf.com
Model quantization is considered a promising method to greatly reduce the
resource requirements of deep neural networks. To deal with the performance drop induced …