Adaptive data-free quantization
Data-free quantization (DFQ) recovers the performance of a quantized network (Q) without the
original data, but generates fake samples via a generator (G) by learning from full …
Intraq: Learning synthetic images with intra-class heterogeneity for zero-shot network quantization
Learning to synthesize data has emerged as a promising direction in zero-shot quantization
(ZSQ), which represents neural networks with low-bit integers without accessing any of the real …
Retrospective adversarial replay for continual learning
Continual learning is an emerging research challenge in machine learning that addresses
the problem where models quickly fit the most recently trained-on data but suffer from …
Hard sample matters a lot in zero-shot quantization
H Li, X Wu, F Lv, D Liao, TH Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Zero-shot quantization (ZSQ) is promising for compressing and accelerating deep neural
networks when the data for training full-precision models are inaccessible. In ZSQ, network …
It's all in the teacher: Zero-shot quantization brought closer to the teacher
Abstract Model quantization is considered a promising method to greatly reduce the
resource requirements of deep neural networks. To deal with the performance drop induced …
Unified data-free compression: Pruning and quantization without fine-tuning
S Bai, J Chen, X Shen, Y Qian… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Structured pruning and quantization are promising approaches for reducing the inference
time and memory footprint of neural networks. However, most existing methods require the …
Psaq-vit v2: Toward accurate and general data-free quantization for vision transformers
Data-free quantization can potentially address data privacy and security concerns in model
compression and thus has been widely investigated. Recently, patch similarity aware data …
Rethinking data-free quantization as a zero-sum game
Data-free quantization (DFQ) recovers the performance of a quantized network (Q) without
accessing the real data, but generates fake samples via a generator (G) by learning from …
Clamp-vit: Contrastive data-free learning for adaptive post-training quantization of vits
We present CLAMP-ViT, a data-free post-training quantization method for vision
transformers (ViTs). We identify the limitations of recent techniques, notably their inability to …
ACQ: Improving generative data-free quantization via attention correction
J Li, X Guo, B Dai, G Gong, M Jin, G Chen, W Mao… - Pattern Recognition, 2024 - Elsevier
Data-free quantization aims to achieve model quantization without accessing any authentic
sample. It is significant in an application-oriented context involving data privacy. Converting …