Data augmentation: A comprehensive survey of modern approaches

A Mumuni, F Mumuni - Array, 2022 - Elsevier
To ensure good performance, modern machine learning models typically require large
amounts of quality annotated data. Meanwhile, the data collection and annotation processes …

A survey on deep neural network pruning: Taxonomy, comparison, analysis, and recommendations

H Cheng, M Zhang, JQ Shi - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
Modern deep neural networks, particularly recent large language models, come with
massive model sizes that require significant computational and storage resources. To …

Structured pruning for deep convolutional neural networks: A survey

Y He, L Xiao - IEEE Transactions on Pattern Analysis and …, 2023 - ieeexplore.ieee.org
The remarkable performance of deep convolutional neural networks (CNNs) is generally
attributed to their deeper and wider architectures, which can come with significant …

A review of convolutional neural network architectures and their optimizations

S Cong, Y Zhou - Artificial Intelligence Review, 2023 - Springer
The research advances concerning the typical architectures of convolutional neural
networks (CNNs) as well as their optimizations are analyzed and elaborated in detail in this …

Transformer in transformer

K Han, A Xiao, E Wu, J Guo, C Xu… - Advances in Neural …, 2021 - proceedings.neurips.cc
Transformer is a new kind of neural architecture which encodes the input data as powerful
features via the attention mechanism. Basically, the visual transformers first divide the input …
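
Below is a minimal sketch (PyTorch, with assumed sizes such as 224×224 inputs and 16×16 patches) of the generic vision-transformer step the snippet describes: splitting the image into patch tokens and encoding them with self-attention. It is not the Transformer-in-Transformer architecture itself, which additionally runs an inner transformer inside each patch.

```python
# Sketch only: generic patch tokenisation + self-attention, assumed hyperparameters.
import torch
import torch.nn as nn

class PatchAttentionBlock(nn.Module):
    def __init__(self, img_size=224, patch_size=16, dim=192, heads=3):
        super().__init__()
        num_patches = (img_size // patch_size) ** 2
        # Split the image into non-overlapping patches and embed each one.
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch_size, stride=patch_size)
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches, dim))
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):                           # x: (B, 3, H, W)
        tokens = self.patch_embed(x)                # (B, dim, H/ps, W/ps)
        tokens = tokens.flatten(2).transpose(1, 2)  # (B, num_patches, dim)
        tokens = tokens + self.pos_embed
        n = self.norm(tokens)
        out, _ = self.attn(n, n, n)                 # self-attention over patch tokens
        return tokens + out                         # residual connection

blk = PatchAttentionBlock()
print(blk(torch.randn(2, 3, 224, 224)).shape)      # torch.Size([2, 196, 192])
```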

Distilling object detectors via decoupled features

J Guo, K Han, Y Wang, H Wu… - Proceedings of the …, 2021 - openaccess.thecvf.com
Knowledge distillation is a widely used paradigm for inheriting information from a
complicated teacher network to a compact student network and maintaining the strong …
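
The snippet describes the general teacher-student paradigm; a minimal sketch of plain logit distillation (PyTorch, with assumed temperature T and weighting alpha) is shown below. The paper's decoupled-feature distillation for object detectors is more involved and is not reproduced here.

```python
# Sketch only: vanilla knowledge distillation loss, not the paper's method.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # KL divergence between softened teacher and student distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

s = torch.randn(8, 10)                  # student logits (batch of 8, 10 classes)
t = torch.randn(8, 10).detach()         # teacher logits, no gradient
y = torch.randint(0, 10, (8,))
print(distillation_loss(s, t, y))
```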

Patch slimming for efficient vision transformers

Y Tang, K Han, Y Wang, C Xu, J Guo… - Proceedings of the …, 2022 - openaccess.thecvf.com
This paper studies the efficiency problem for visual transformers by excavating redundant
calculation in given networks. The recent transformer architecture has demonstrated its …
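
As a rough illustration of removing redundant computation from a visual transformer, the sketch below drops low-scoring patch tokens. Scoring tokens by their L2 norm is only a stand-in assumption for this example; it is not the top-down, error-bounded patch selection the paper proposes.

```python
# Sketch only: keep the highest-scoring patch tokens, discard the rest.
import torch

def slim_patches(tokens: torch.Tensor, keep_ratio: float = 0.5) -> torch.Tensor:
    # tokens: (batch, num_patches, dim) patch embeddings from a ViT block.
    scores = tokens.norm(dim=-1)                        # (batch, num_patches)
    n_keep = max(1, int(tokens.shape[1] * keep_ratio))
    idx = torch.topk(scores, n_keep, dim=1).indices     # indices of kept patches
    idx = idx.unsqueeze(-1).expand(-1, -1, tokens.shape[-1])
    return torch.gather(tokens, 1, idx)                 # (batch, n_keep, dim)

x = torch.randn(2, 196, 192)
print(slim_patches(x, keep_ratio=0.25).shape)           # torch.Size([2, 49, 192])
```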

CHIP: Channel independence-based pruning for compact neural networks

Y Sui, M Yin, Y Xie, H Phan… - Advances in Neural …, 2021 - proceedings.neurips.cc
Filter pruning has been widely used for neural network compression because it enables
practical acceleration. To date, most of the existing filter pruning works explore the …
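
A minimal sketch of the filter-pruning idea the snippet refers to is given below, using the common L1-norm magnitude criterion (an assumption for illustration; CHIP itself ranks filters by a channel-independence measure computed on feature maps, which this does not implement).

```python
# Sketch only: prune whole convolution filters by L1-norm magnitude.
import torch
import torch.nn as nn

def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    # Score each output filter by the L1 norm of its weights.
    scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep = torch.topk(scores, n_keep).indices.sort().values
    # Build a smaller conv layer containing only the kept filters.
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()
    return pruned

conv = nn.Conv2d(64, 128, 3, padding=1)
print(prune_conv_filters(conv, keep_ratio=0.25))  # Conv2d(64, 32, ...)
```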

A review of artificial intelligence in embedded systems

Z Zhang, J Li - Micromachines, 2023 - mdpi.com
Advancements in artificial intelligence algorithms and models, along with embedded device
support, have resulted in the issue of high energy consumption and poor compatibility when …

Sparser spiking activity can be better: Feature refine-and-mask spiking neural network for event-based visual recognition

M Yao, H Zhang, G Zhao, X Zhang, D Wang, G Cao… - Neural Networks, 2023 - Elsevier
Event-based vision, a new visual paradigm with bio-inspired dynamic perception and μs-level
temporal resolution, has prominent advantages in many specific visual scenarios and …