NTIRE 2023 challenge on efficient super-resolution: Methods and results

Y Li, Y Zhang, R Timofte, L Van Gool… - Proceedings of the …, 2023 - openaccess.thecvf.com
This paper reviews the NTIRE 2023 challenge on efficient single-image super-resolution
with a focus on the proposed solutions and results. The aim of this challenge is to devise a …

Structured pruning for deep convolutional neural networks: A survey

Y He, L Xiao - IEEE Transactions on Pattern Analysis and …, 2023 - ieeexplore.ieee.org
The remarkable performance of deep convolutional neural networks (CNNs) is generally
attributed to their deeper and wider architectures, which can come with significant …

Review of lightweight deep convolutional neural networks

F Chen, S Li, J Han, F Ren, Z Yang - Archives of Computational Methods …, 2024 - Springer
Lightweight deep convolutional neural networks (LDCNNs) are vital components of mobile
intelligence, particularly in mobile vision. Although various heavy networks with increasingly …

LoRAPrune: Pruning meets low-rank parameter-efficient fine-tuning

M Zhang, H Chen, C Shen, Z Yang, L Ou, X Yu… - arXiv preprint arXiv …, 2023 - arxiv.org
Large pre-trained models (LPMs), such as LLaMA and GLM, have shown exceptional
performance across various tasks through fine-tuning. Although low-rank adaptation (LoRA) …

PYRA: Parallel yielding re-activation for training-inference efficient task adaptation

Y Xiong, H Chen, T Hao, Z Lin, J Han, Y Zhang… - … on Computer Vision, 2024 - Springer
Recently, the scale of transformers has grown rapidly, which introduces considerable
challenges in terms of training overhead and inference efficiency in the scope of task …

Structural alignment for network pruning through partial regularization

S Gao, Z Zhang, Y Zhang, F Huang… - Proceedings of the …, 2023 - openaccess.thecvf.com
In this paper, we propose a novel channel pruning method to reduce the computational and
storage costs of Convolutional Neural Networks (CNNs). Many existing one-shot pruning …

Automatic network pruning via Hilbert-Schmidt independence criterion lasso under information bottleneck principle

S Guo, L Zhang, X Zheng, Y Wang… - Proceedings of the …, 2023 - openaccess.thecvf.com
Most existing neural network pruning methods hand-craft their importance criteria and
structures to prune. This creates heavy and unintended dependencies on heuristics and …

Differentiable transportation pruning

Y Li, JC van Gemert, T Hoefler… - Proceedings of the …, 2023 - openaccess.thecvf.com
Deep learning algorithms are increasingly employed at the edge. However, edge devices
are resource constrained and thus require efficient deployment of deep neural networks …

EVC: Towards real-time neural image compression with mask decay

GH Wang, J Li, B Li, Y Lu - arXiv preprint arXiv:2302.05071, 2023 - arxiv.org
Neural image compression has surpassed state-of-the-art traditional codecs (H.266/VVC)
in rate-distortion (RD) performance, but suffers from large complexity and separate models …

A review of artificial intelligence in embedded systems

Z Zhang, J Li - Micromachines, 2023 - mdpi.com
Advancements in artificial intelligence algorithms and models, along with embedded device
support, have led to the issues of high energy consumption and poor compatibility when …