NTIRE 2023 challenge on efficient super-resolution: Methods and results
This paper reviews the NTIRE 2023 challenge on efficient single-image super-resolution
with a focus on the proposed solutions and results. The aim of this challenge is to devise a …
Structured pruning for deep convolutional neural networks: A survey
The remarkable performance of deep convolutional neural networks (CNNs) is generally
attributed to their deeper and wider architectures, which can come with significant …
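As a rough illustration of the structured-pruning idea this survey covers, the PyTorch sketch below drops the output channels of a convolution whose filters have the smallest L1 norm and trims the matching input channels of the following layer. The function name, the L1 importance score, and the keep ratio are hypothetical choices for illustration, not a method from the survey.

```python
import torch
import torch.nn as nn

def prune_conv_channels(conv: nn.Conv2d, next_conv: nn.Conv2d, keep_ratio: float = 0.5):
    """Structured (channel) pruning sketch: keep the output channels of `conv`
    with the largest L1-norm filters and remove the corresponding input
    channels of `next_conv`. Assumes groups=1 and that next_conv consumes
    conv's output directly. Illustrative only."""
    # L1 norm of each output filter -> importance vector of shape [out_channels]
    importance = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    n_keep = max(1, int(keep_ratio * conv.out_channels))
    keep_idx = torch.argsort(importance, descending=True)[:n_keep]

    # Thinner replacement for the pruned layer
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()

    # The next layer must drop the corresponding input channels
    next_pruned = nn.Conv2d(n_keep, next_conv.out_channels, next_conv.kernel_size,
                            stride=next_conv.stride, padding=next_conv.padding,
                            bias=next_conv.bias is not None)
    next_pruned.weight.data = next_conv.weight.data[:, keep_idx].clone()
    if next_conv.bias is not None:
        next_pruned.bias.data = next_conv.bias.data.clone()
    return pruned, next_pruned
```

In practice, the surveyed methods differ mainly in how the channel importance score is defined and in whether pruning is interleaved with retraining or fine-tuning.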
Review of lightweight deep convolutional neural networks
F Chen, S Li, J Han, F Ren, Z Yang - Archives of Computational Methods …, 2024 - Springer
Lightweight deep convolutional neural networks (LDCNNs) are vital components of mobile
intelligence, particularly in mobile vision. Although various heavy networks with increasingly …
LoRAPrune: Pruning meets low-rank parameter-efficient fine-tuning
Large pre-trained models (LPMs), such as LLaMA and GLM, have shown exceptional
performance across various tasks through fine-tuning. Although low-rank adaptation (LoRA) …
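For readers unfamiliar with low-rank adaptation, the sketch below shows the generic LoRA idea that such pruning-plus-fine-tuning work builds on: a frozen pre-trained weight is augmented with a trainable low-rank update. The class, rank, and scaling hyperparameters are illustrative assumptions, not the LoRAPrune implementation.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Generic LoRA sketch: y = base(x) + (alpha / r) * x A^T B^T,
    where the base weight is frozen and only the low-rank factors
    A and B are trained. Illustrative only."""
    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        for p in self.base.parameters():          # frozen pre-trained weights
            p.requires_grad_(False)
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))  # zero init -> no change at start
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * (x @ self.lora_A.T @ self.lora_B.T)
```

Because only the small factors A and B receive gradients, fine-tuning touches a tiny fraction of the parameters, which is what makes combining it with pruning of the frozen backbone attractive.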
PYRA: Parallel yielding re-activation for training-inference efficient task adaptation
Recently, the scale of transformers has grown rapidly, which introduces considerable
challenges in terms of training overhead and inference efficiency in the scope of task …
Structural alignment for network pruning through partial regularization
In this paper, we propose a novel channel pruning method to reduce the computational and
storage costs of Convolutional Neural Networks (CNNs). Many existing one-shot pruning …
Automatic network pruning via Hilbert-Schmidt independence criterion Lasso under information bottleneck principle
Most existing neural network pruning methods hand-craft their importance criteria and the
structures to prune. This creates heavy and unintended dependencies on heuristics and …
Differentiable transportation pruning
Deep learning algorithms are increasingly employed at the edge. However, edge devices
are resource-constrained and thus require efficient deployment of deep neural networks …
EVC: Towards real-time neural image compression with mask decay
Neural image compression has surpassed state-of-the-art traditional codecs (H.266/VVC)
for rate-distortion (RD) performance, but suffers from large complexity and separate models …
A review of artificial intelligence in embedded systems
Z Zhang, J Li - Micromachines, 2023 - mdpi.com
Advances in artificial intelligence algorithms and models, along with embedded device
support, have led to issues of high energy consumption and poor compatibility when …