A survey on approximate edge AI for energy efficient autonomous driving services
Autonomous driving services depend on active sensing from modules such as cameras,
LiDAR, radar, and communication units. Traditionally, these modules process the sensed …
Revisiting random channel pruning for neural network compression
Channel (or 3D filter) pruning serves as an effective way to accelerate the inference of
neural networks. There has been a flurry of algorithms that try to solve this practical problem …
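As context for the entry above: a minimal sketch, assuming a PyTorch model, of the random channel pruning baseline the paper revisits. The function name and keep ratio are illustrative; a real pipeline must also slim the next layer's input channels using the returned index.

import torch
import torch.nn as nn

def random_channel_prune(conv: nn.Conv2d, keep_ratio: float = 0.5):
    # Keep a random subset of output channels and copy their weights.
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep = torch.randperm(conv.out_channels)[:n_keep].sort().values
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    with torch.no_grad():
        pruned.weight.copy_(conv.weight[keep])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep])
    return pruned, keep  # keep-index is needed to prune the next layer's inputs

conv = nn.Conv2d(64, 128, 3, padding=1)
slim, kept = random_channel_prune(conv, keep_ratio=0.25)
print(slim)  # Conv2d(64, 32, kernel_size=(3, 3), ...)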
Recdis-snn: Rectifying membrane potential distribution for directly training spiking neural networks
Y Guo, X Tong, Y Chen, L Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
The brain-inspired and event-driven Spiking Neural Network (SNN) aims at mimicking the
synaptic activity of biological neurons, which transmits binary spike signals between network …
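As context for the entry above: a minimal leaky integrate-and-fire (LIF) sketch showing how membrane potential accumulates and emits binary spikes, the mechanism the snippet describes. This is the generic SNN neuron, not the paper's RecDis rectification; the decay and threshold constants are illustrative.

import torch

def lif_forward(inputs: torch.Tensor, decay: float = 0.5, v_th: float = 1.0):
    # inputs: (T, N) input currents over T timesteps; returns binary spikes (T, N).
    v = torch.zeros(inputs.shape[1])
    spikes = []
    for t in range(inputs.shape[0]):
        v = decay * v + inputs[t]      # leaky integration of the input current
        s = (v >= v_th).float()        # fire a binary spike at threshold crossing
        v = v * (1.0 - s)              # hard reset for neurons that fired
        spikes.append(s)
    return torch.stack(spikes)

print(lif_forward(torch.rand(8, 4)))   # 8 timesteps, 4 neurons, 0/1 outputs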
Data-free knowledge distillation via feature exchange and activation region constraint
Despite the tremendous progress on data-free knowledge distillation (DFKD) based on
synthetic data generation, there are still limitations in diverse and efficient data synthesis. It …
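As context for the entry above: a minimal sketch of the generic DFKD loop the snippet builds on, where a generator synthesizes inputs from noise, the student mimics the teacher on them, and the generator is pushed toward samples where the two disagree. All models and hyperparameters are illustrative stand-ins; the paper's feature exchange and activation-region constraint are not shown.

import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(32, 10)).eval()   # stand-in for a pretrained teacher
student = nn.Sequential(nn.Linear(32, 10))
generator = nn.Sequential(nn.Linear(8, 32), nn.Tanh())
opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)

def divergence(x):
    # Teacher/student disagreement on a synthetic batch x.
    return F.kl_div(F.log_softmax(student(x), -1),
                    F.softmax(teacher(x), -1), reduction="batchmean")

for step in range(100):
    # Generator step: seek samples on which student and teacher disagree.
    loss_g = -divergence(generator(torch.randn(64, 8)))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    # Student step: minimize the same divergence on fresh synthetic data.
    loss_s = divergence(generator(torch.randn(64, 8)).detach())
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()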
Hard sample matters a lot in zero-shot quantization
H Li, X Wu, F Lv, D Liao, TH Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Zero-shot quantization (ZSQ) is promising for compressing and accelerating deep neural
networks when the data for training full-precision models are inaccessible. In ZSQ, network …
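As context for the entry above: in ZSQ, calibration data are typically synthesized from the full-precision model itself. Below is a minimal sketch of the common BatchNorm-statistics-matching step, where noise images are optimized until the model's stored running statistics are reproduced; it shows the generic skeleton only, not the paper's hard-sample criterion, and the tiny model and step count are illustrative.

import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1),
                      nn.BatchNorm2d(16), nn.ReLU()).eval()

def bn_stat_loss(model, x):
    # Distance between the batch statistics of x and each BN layer's running stats.
    feats, handles = {}, []
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            handles.append(m.register_forward_hook(
                lambda mod, inp, out: feats.__setitem__(mod, inp[0])))
    model(x)
    loss = torch.zeros(())
    for m, f in feats.items():
        mu, var = f.mean(dim=(0, 2, 3)), f.var(dim=(0, 2, 3))
        loss = loss + (mu - m.running_mean).pow(2).mean() \
                    + (var - m.running_var).pow(2).mean()
    for h in handles:
        h.remove()
    return loss

x = torch.randn(16, 3, 32, 32, requires_grad=True)   # start from pure noise
opt = torch.optim.Adam([x], lr=0.1)
for _ in range(50):
    opt.zero_grad()
    bn_stat_loss(model, x).backward()
    opt.step()
# x now acts as synthetic calibration data for post-training quantization.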
Small scale data-free knowledge distillation
Data-free knowledge distillation is able to utilize the knowledge learned by a large teacher
network to augment the training of a smaller student network without accessing the original …
Data-free knowledge transfer: A survey
In the last decade, many deep learning models have been well trained and have achieved great
success in various fields of machine intelligence, especially in computer vision and natural …
MQBench: Towards reproducible and deployable model quantization benchmark
Model quantization has emerged as an indispensable technique to accelerate deep
learning inference. While researchers continue to push the frontier of quantization …
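As context for the entry above: the primitive that quantization benchmarks exercise is simulated ("fake") quantization, where tensors are rounded onto an integer grid and mapped back to float so the accuracy cost can be measured. A minimal per-tensor sketch follows; the 8-bit asymmetric scheme is an illustrative choice, not MQBench's API.

import torch

def fake_quantize(x: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    # Per-tensor asymmetric uniform quantization followed by dequantization.
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()).clamp(min=1e-8) / (qmax - qmin)
    zero_point = (qmin - x.min() / scale).round()
    q = (x / scale + zero_point).round().clamp(qmin, qmax)   # integer grid
    return (q - zero_point) * scale                          # back to float

w = torch.randn(4, 4)
print((w - fake_quantize(w)).abs().max())   # error is bounded by ~scale / 2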