Adaptive data-free quantization

B Qian, Y Wang, R Hong… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Data-free quantization (DFQ) recovers the performance of a quantized network (Q) without the
original data, instead generating fake samples via a generator (G) by learning from full …

IntraQ: Learning synthetic images with intra-class heterogeneity for zero-shot network quantization

Y Zhong, M Lin, G Nan, J Liu… - Proceedings of the …, 2022 - openaccess.thecvf.com
Learning to synthesize data has emerged as a promising direction in zero-shot quantization
(ZSQ), which represents neural networks with low-bit integers without accessing any of the real …

Data-free quantization via mixed-precision compensation without fine-tuning

J Chen, S Bai, T Huang, M Wang, G Tian, Y Liu - Pattern Recognition, 2023 - Elsevier
Neural network quantization is a very promising solution in the field of model compression,
but its resulting accuracy depends heavily on a training/fine-tuning process and requires the …

Hard sample matters a lot in zero-shot quantization

H Li, X Wu, F Lv, D Liao, TH Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Zero-shot quantization (ZSQ) is promising for compressing and accelerating deep neural
networks when the data for training full-precision models are inaccessible. In ZSQ, network …

Retrospective adversarial replay for continual learning

L Kumari, S Wang, T Zhou… - Advances in neural …, 2022 - proceedings.neurips.cc
Continual learning is an emerging research challenge in machine learning that addresses
the problem where models quickly fit the most recently trained-on data but suffer from …

It's all in the teacher: Zero-shot quantization brought closer to the teacher

K Choi, HY Lee, D Hong, J Yu… - Proceedings of the …, 2022 - openaccess.thecvf.com
Model quantization is considered a promising method to greatly reduce the
resource requirements of deep neural networks. To deal with the performance drop induced …

Unified data-free compression: Pruning and quantization without fine-tuning

S Bai, J Chen, X Shen, Y Qian… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Structured pruning and quantization are promising approaches for reducing the inference
time and memory footprint of neural networks. However, most existing methods require the …

Rethinking data-free quantization as a zero-sum game

B Qian, Y Wang, R Hong, M Wang - … of the AAAI conference on artificial …, 2023 - ojs.aaai.org
Data-free quantization (DFQ) recovers the performance of a quantized network (Q) without
accessing the real data, instead generating fake samples via a generator (G) by learning from …

CLAMP-ViT: Contrastive data-free learning for adaptive post-training quantization of ViTs

A Ramachandran, S Kundu, T Krishna - European Conference on …, 2024 - Springer
We present CLAMP-ViT, a data-free post-training quantization method for vision
transformers (ViTs). We identify the limitations of recent techniques, notably their inability to …

NIDS-Vis: Improving the generalized adversarial robustness of network intrusion detection system

K He, DD Kim, MR Asghar - Computers & security, 2024 - Elsevier
Network Intrusion Detection Systems (NIDSes) are crucial for securing various
networks from malicious attacks. Recent developments in Deep Neural Networks (DNNs) …