Deep convolutional neural networks for image classification: A comprehensive review

W Rawat, Z Wang - Neural computation, 2017 - ieeexplore.ieee.org
Convolutional neural networks (CNNs) have been applied to visual tasks since the late
1980s. However, despite a few scattered applications, they were dormant until the mid …

Model compression and acceleration for deep neural networks: The principles, progress, and challenges

Y Cheng, D Wang, P Zhou… - IEEE Signal Processing …, 2018 - ieeexplore.ieee.org
In recent years, deep neural networks (DNNs) have received increased attention, have been
applied to a wide range of applications, and have achieved dramatic accuracy improvements in many …

NeRV: Neural representations for videos

H Chen, B He, H Wang, Y Ren… - Advances in Neural …, 2021 - proceedings.neurips.cc
We propose a novel neural representation for videos (NeRV), which encodes videos in
neural networks. Unlike conventional representations that treat videos as frame sequences …
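
For intuition, here is a minimal sketch of the idea behind such an implicit video representation: a small network is overfit to one video so that it maps a (positionally encoded) frame index to the full frame, and the video is then stored as network weights. The layer sizes, sinusoidal encoding, and MLP-only design are illustrative assumptions, not the architecture from the paper.

```python
# Minimal sketch of an implicit video representation in the spirit of NeRV:
# a network maps a frame index t to a whole frame, so the video lives in the
# network weights rather than in a sequence of stored frames.
# Sizes and the positional encoding are illustrative assumptions.
import torch
import torch.nn as nn

class TinyNeRV(nn.Module):
    def __init__(self, height=32, width=32, channels=3, hidden=256, num_freqs=8):
        super().__init__()
        self.h, self.w, self.c = height, width, channels
        self.num_freqs = num_freqs
        self.net = nn.Sequential(
            nn.Linear(2 * num_freqs, hidden), nn.GELU(),
            nn.Linear(hidden, hidden), nn.GELU(),
            nn.Linear(hidden, height * width * channels), nn.Sigmoid(),
        )

    def encode_t(self, t):
        # Sinusoidal encoding of the normalized frame index t in [0, 1].
        freqs = 2.0 ** torch.arange(self.num_freqs, dtype=torch.float32)
        angles = 2 * torch.pi * freqs * t.unsqueeze(-1)
        return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

    def forward(self, t):
        out = self.net(self.encode_t(t))
        return out.view(-1, self.c, self.h, self.w)

# "Encoding" a video then amounts to overfitting the network to its frames.
model = TinyNeRV()
t = torch.linspace(0, 1, steps=10)   # 10 normalized frame indices
frames = model(t)                    # (10, 3, 32, 32) reconstructed frames
```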

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …
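
For context, a minimal sketch of the classic response-based distillation objective that this literature builds on: the student is trained on the teacher's temperature-softened outputs alongside the usual hard labels. The temperature and weighting values below are illustrative choices.

```python
# Classic soft-target knowledge distillation loss: KL divergence between
# temperature-softened teacher and student distributions, mixed with the
# ordinary cross-entropy on the hard labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(soft_student, soft_targets, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Example: a batch of 8 samples with 10 classes.
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student, teacher, labels)
```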

Patient knowledge distillation for BERT model compression

S Sun, Y Cheng, Z Gan, J Liu - arXiv preprint arXiv:1908.09355, 2019 - arxiv.org
Pre-trained language models such as BERT have proven to be highly effective for natural
language processing (NLP) tasks. However, the high demand for computing resources in …

A survey of model compression and acceleration for deep neural networks

Y Cheng, D Wang, P Zhou, T Zhang - arXiv preprint arXiv:1710.09282, 2017 - arxiv.org
Deep neural networks (DNNs) have recently achieved great success in many visual
recognition tasks. However, existing deep neural network models are computationally …

Multi-level wavelet convolutional neural networks

P Liu, H Zhang, W Lian, W Zuo - IEEE Access, 2019 - ieeexplore.ieee.org
In computer vision, convolutional neural networks (CNNs) often adopt pooling to enlarge the receptive
field, which has the advantage of low computational complexity. However, pooling can cause …
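
For intuition, the sketch below contrasts 2x2 average pooling with a single-level Haar wavelet decomposition: both halve the spatial resolution, but the wavelet transform keeps the discarded detail in extra subbands and is therefore invertible. This illustrates the general motivation only, not the MWCNN architecture; the normalization is chosen so that the LL band coincides with average pooling.

```python
# Pooling vs. a single-level Haar wavelet decomposition.
import numpy as np

def haar_dwt2(x):
    """Single-level 2D Haar decomposition of an (H, W) array with even H, W."""
    a = (x[0::2, :] + x[1::2, :]) / 2.0   # row-pair averages (low-pass)
    d = (x[0::2, :] - x[1::2, :]) / 2.0   # row-pair differences (high-pass)
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0  # approximation band
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0  # detail band
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0  # detail band
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0  # detail band
    return ll, lh, hl, hh

def avg_pool2(x):
    """2x2 average pooling: same resolution as the LL band, details discarded."""
    return (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 4.0

x = np.random.rand(8, 8)
ll, lh, hl, hh = haar_dwt2(x)
assert np.allclose(ll, avg_pool2(x))  # what pooling keeps; the detail bands are what it loses
```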

Enlarging smaller images before inputting into convolutional neural network: zero-padding vs. interpolation

M Hashemi - Journal of Big Data, 2019 - Springer
The input to a machine learning model is a one-dimensional feature vector. However, in
recent learning models, such as convolutional and recurrent neural networks, two- and three …
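
The two options compared in the title can be sketched as follows: either center the small image in a zero-valued canvas of the network's fixed input size, or stretch it by interpolation. Nearest-neighbour resizing stands in for interpolation here for brevity; the function names and sizes are illustrative.

```python
# Two ways to feed a small image to a CNN with a larger, fixed input size.
import numpy as np

def zero_pad(img, out_h, out_w):
    """Center the small image in a zero canvas of the target size."""
    h, w = img.shape[:2]
    canvas = np.zeros((out_h, out_w) + img.shape[2:], dtype=img.dtype)
    top, left = (out_h - h) // 2, (out_w - w) // 2
    canvas[top:top + h, left:left + w] = img
    return canvas

def nearest_resize(img, out_h, out_w):
    """Enlarge the image by nearest-neighbour interpolation."""
    h, w = img.shape[:2]
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows[:, None], cols]

small = np.random.randint(0, 256, (20, 20, 3), dtype=np.uint8)
padded = zero_pad(small, 64, 64)         # original pixels untouched, zeros around them
resized = nearest_resize(small, 64, 64)  # pixels stretched to fill the whole input
```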

BERT-of-Theseus: Compressing BERT by progressive module replacing

C Xu, W Zhou, T Ge, F Wei, M Zhou - arXiv preprint arXiv:2002.02925, 2020 - arxiv.org
In this paper, we propose a novel model compression approach to effectively compress
BERT by progressive module replacing. Our approach first divides the original BERT into …
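
Based only on the snippet above, a rough sketch of progressive module replacing might look like the following: the original model is split into modules, and during training each original module is stochastically swapped for its smaller successor, with the replacement probability raised over time until only successor modules are used. The toy modules and the linear schedule are assumptions for illustration, not details from the paper.

```python
# Sketch of progressive module replacing with toy modules standing in for
# transformer blocks; the curriculum below is an illustrative assumption.
import torch
import torch.nn as nn

class TheseusEncoder(nn.Module):
    def __init__(self, predecessors, successors):
        super().__init__()
        assert len(predecessors) == len(successors)
        self.predecessors = nn.ModuleList(predecessors)  # original (larger) modules
        self.successors = nn.ModuleList(successors)      # compact replacement modules
        self.replace_prob = 0.0                          # raised during training

    def forward(self, x):
        for pred, succ in zip(self.predecessors, self.successors):
            # At inference only the compact successors remain in use.
            use_successor = (not self.training) or (torch.rand(()) < self.replace_prob)
            x = succ(x) if use_successor else pred(x)
        return x

preds = [nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 16)) for _ in range(3)]
succs = [nn.Linear(16, 16) for _ in range(3)]
model = TheseusEncoder(preds, succs)

for step in range(100):
    model.replace_prob = min(1.0, step / 50)  # linear curriculum from 0 to 1
    out = model(torch.randn(4, 16))
```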

Unsupervised deep generative adversarial hashing network

KG Dizaji, F Zheng, N Sadoughi… - Proceedings of the …, 2018 - openaccess.thecvf.com
Unsupervised deep hash functions have not shown satisfactory improvements over their
shallow alternatives and usually require supervised pretraining to avoid getting stuck in …