Adaptive data-free quantization
Data-free quantization (DFQ) recovers the performance of the quantized network (Q) without the
original data, instead generating fake samples via a generator (G) by learning from full …
Intraq: Learning synthetic images with intra-class heterogeneity for zero-shot network quantization
Learning to synthesize data has emerged as a promising direction in zero-shot quantization
(ZSQ), which represents neural networks with low-bit integers without accessing any of the real …
Data-free quantization via mixed-precision compensation without fine-tuning
Neural network quantization is a very promising solution in the field of model compression,
but its resulting accuracy depends heavily on a training/fine-tuning process and requires the …
Hard sample matters a lot in zero-shot quantization
H Li, X Wu, F Lv, D Liao, TH Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Zero-shot quantization (ZSQ) is promising for compressing and accelerating deep neural
networks when the data for training full-precision models are inaccessible. In ZSQ, network …
Retrospective adversarial replay for continual learning
Continual learning is an emerging research challenge in machine learning that addresses
the problem where models quickly fit the most recently trained-on data but suffer from …
It's all in the teacher: Zero-shot quantization brought closer to the teacher
Model quantization is considered a promising method to greatly reduce the
resource requirements of deep neural networks. To deal with the performance drop induced …
Unified data-free compression: Pruning and quantization without fine-tuning
S Bai, J Chen, X Shen, Y Qian… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Structured pruning and quantization are promising approaches for reducing the inference
time and memory footprint of neural networks. However, most existing methods require the …
Rethinking data-free quantization as a zero-sum game
Data-free quantization (DFQ) recovers the performance of the quantized network (Q) without
accessing the real data, instead generating fake samples via a generator (G) by learning from …
Clamp-vit: Contrastive data-free learning for adaptive post-training quantization of vits
We present CLAMP-ViT, a data-free post-training quantization method for vision
transformers (ViTs). We identify the limitations of recent techniques, notably their inability to …
NIDS-Vis: Improving the generalized adversarial robustness of network intrusion detection system
Network Intrusion Detection Systems (NIDSes) are crucial for securing various
networks from malicious attacks. Recent developments in Deep Neural Networks (DNNs) …