A comprehensive survey on test-time adaptation under distribution shifts
Machine learning methods strive to acquire a robust model during the training process that can effectively generalize to test samples, even in the presence of distribution …
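Since this survey's subject is test-time adaptation (TTA), a minimal sketch may help fix ideas. The widely used entropy-minimization recipe (in the style of Tent) updates only the affine parameters of normalization layers on unlabeled test batches; everything below is illustrative, not the survey's own code.

import torch
import torch.nn as nn

def entropy(logits):
    # Mean Shannon entropy of the softmax predictions.
    probs = logits.softmax(dim=1)
    return -(probs * probs.log().clamp(min=-1e9)).sum(dim=1).mean()

def adapt_step(model, batch, optimizer):
    # One TTA step: minimize prediction entropy on an unlabeled
    # test batch; no source data and no labels are needed.
    loss = entropy(model(batch))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Typically only normalization-layer affine parameters are adapted:
# params = [p for m in model.modules()
#           if isinstance(m, nn.BatchNorm2d) for p in m.parameters()]
# optimizer = torch.optim.SGD(params, lr=1e-3)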
Adaptive data-free quantization
Data-free quantization (DFQ) recovers the performance of a quantized network (Q) without the original data, instead generating fake samples via a generator (G) that learns from the full …
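As a rough illustration of this generator-based DFQ loop (a generic sketch with assumed placeholder modules G, fp_net, and q_net, not this paper's method): the generator is trained so that the full-precision network classifies its outputs as sampled labels, and the quantized network is then distilled on those fake samples.

import torch
import torch.nn.functional as F

def generator_step(G, fp_net, opt_g, n=64, num_classes=10, z_dim=100):
    # Train G so the full-precision net assigns the drawn labels
    # to the synthesized "fake" samples.
    z = torch.randn(n, z_dim)
    labels = torch.randint(0, num_classes, (n,))
    fake = G(z, labels)
    loss = F.cross_entropy(fp_net(fake), labels)
    opt_g.zero_grad(); loss.backward(); opt_g.step()
    return fake.detach()

def recovery_step(q_net, fp_net, fake, opt_q, T=4.0):
    # Recover Q's accuracy by distilling the full-precision
    # teacher's soft predictions on the fake batch.
    with torch.no_grad():
        t = F.softmax(fp_net(fake) / T, dim=1)
    s = F.log_softmax(q_net(fake) / T, dim=1)
    loss = F.kl_div(s, t, reduction='batchmean') * T * T
    opt_q.zero_grad(); loss.backward(); opt_q.step()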
Intraq: Learning synthetic images with intra-class heterogeneity for zero-shot network quantization
Learning to synthesize data has emerged as a promising direction in zero-shot quantization (ZSQ), which represents neural networks with low-bit integers without accessing any of the real …
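The low-bit integer representation that ZSQ targets is, in the common uniform affine scheme, just a scale and a zero-point per tensor. A self-contained sketch of that scheme (the quantizer itself, not IntraQ's image-synthesis method):

import torch

def quantize_uniform(w, num_bits=4):
    # Uniform affine quantization: map floats to integers in
    # [0, 2^b - 1] via a scale and zero-point, then dequantize.
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (w.max() - w.min()).clamp(min=1e-8) / (qmax - qmin)
    zero_point = torch.round(-w.min() / scale).clamp(qmin, qmax)
    q = torch.round(w / scale + zero_point).clamp(qmin, qmax)
    return (q - zero_point) * scale  # dequantized approximation of w

w = torch.randn(256)
print((w - quantize_uniform(w, num_bits=4)).abs().max())  # 4-bit rounding error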
Data-free quantization via mixed-precision compensation without fine-tuning
Neural network quantization is a promising approach to model compression, but its resulting accuracy depends heavily on a training/fine-tuning process and requires the …
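The "mixed-precision" idea in the title, assigning different bit-widths to different layers, can be illustrated with a simple sensitivity heuristic; this allocation rule is a generic stand-in, not the paper's compensation mechanism.

def allocate_bits(sensitivities, low=4, high=8):
    # Give the most sensitive half of the layers the higher
    # bit-width; a crude per-layer mixed-precision assignment.
    order = sorted(range(len(sensitivities)), key=lambda i: sensitivities[i])
    bits = [low] * len(sensitivities)
    for i in order[len(order) // 2:]:
        bits[i] = high
    return bits

# Hypothetical per-layer quantization errors measured at `low` bits:
sens = [0.02, 0.31, 0.07, 0.54]
print(allocate_bits(sens))  # -> [4, 8, 4, 8]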
Qimera: Data-free quantization with synthetic boundary supporting samples
Model quantization is known as a promising method to compress deep neural networks, especially for inference on lightweight mobile or edge devices. However, model …
Learning to retain while acquiring: Combating distribution-shift in adversarial data-free knowledge distillation
Data-free Knowledge Distillation (DFKD) has recently gained popularity, with the fundamental idea of carrying out knowledge transfer from a Teacher neural network to a …
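In the adversarial variant of DFKD that this work addresses, a generator and the Student play a min-max game: the generator searches for samples on which Teacher and Student disagree, and the Student then learns to match the Teacher there. A rough sketch with placeholder modules G, teacher, and student:

import torch
import torch.nn.functional as F

def disagreement(t_logits, s_logits):
    # KL(teacher || student): large wherever the Student has
    # not yet matched the Teacher.
    return F.kl_div(F.log_softmax(s_logits, dim=1),
                    F.softmax(t_logits, dim=1),
                    reduction='batchmean')

def adversarial_round(G, teacher, student, opt_g, opt_s, z_dim=100, n=64):
    # Generator step: maximize Teacher-Student disagreement.
    fake = G(torch.randn(n, z_dim))
    loss_g = -disagreement(teacher(fake).detach(), student(fake))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    # Student step: minimize disagreement on fresh samples
    # (zero_grad clears the stale gradients from the step above).
    fake = G(torch.randn(n, z_dim)).detach()
    loss_s = disagreement(teacher(fake).detach(), student(fake))
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()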
Hard sample matters a lot in zero-shot quantization
Zero-shot quantization (ZSQ) is promising for compressing and accelerating deep neural
networks when the data for training full-precision models are inaccessible. In ZSQ, network …
Squant: On-the-fly data-free quantization via diagonal hessian approximation
Quantization of deep neural networks (DNNs) has proven effective for compressing and accelerating DNN models. Data-free quantization (DFQ) is a promising approach without the …
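The diagonal Hessian approximation in the title comes from the standard second-order view of quantization: for a weight perturbation δ = ŵ − w, the loss change is roughly ½ δᵀHδ, and with a diagonal H each weight contributes H_ii·δ_i². A hedged sketch that uses this weighting to pick a quantization scale (the grid search is illustrative, not SQuant's on-the-fly algorithm):

import torch

def hessian_weighted_error(w, h_diag, scale, num_bits=4):
    # Diagonal second-order proxy for the loss increase caused by
    # quantizing w at the given scale: sum_i H_ii * (w_i - Q(w_i))^2.
    qmax = 2 ** (num_bits - 1) - 1
    q = torch.round(w / scale).clamp(-qmax - 1, qmax) * scale
    return (h_diag * (w - q).pow(2)).sum().item()

def search_scale(w, h_diag, num_bits=4, steps=40):
    # Choose the scale minimizing the Hessian-weighted error, so
    # sensitive weights (large H_ii) dominate the decision.
    base = (w.abs().max() / (2 ** (num_bits - 1) - 1)).item()
    candidates = [base * f for f in torch.linspace(0.5, 1.2, steps).tolist()]
    return min(candidates,
               key=lambda s: hessian_weighted_error(w, h_diag, s, num_bits))

w = torch.randn(512)
h = torch.rand(512)  # hypothetical diagonal Hessian entries
print(search_scale(w, h))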
Diverse sample generation: Pushing the limit of generative data-free quantization
Generative data-free quantization has emerged as a practical compression approach that quantizes deep neural networks to low bit-widths without accessing the real data. This …
It's all in the teacher: Zero-shot quantization brought closer to the teacher
Model quantization is considered a promising method to greatly reduce the resource requirements of deep neural networks. To deal with the performance drop induced …