An overview of neural network compression

JO Neill - arXiv preprint arXiv:2006.03669, 2020 - arxiv.org
Overparameterized networks trained to convergence have shown impressive performance
in domains such as computer vision and natural language processing. Pushing state of the …

Group sparsity: The hinge between filter pruning and decomposition for network compression

Y Li, S Gu, C Mayer, LV Gool… - Proceedings of the …, 2020 - openaccess.thecvf.com
In this paper, we analyze two popular network compression techniques, i.e., filter pruning and
low-rank decomposition, in a unified sense. By simply changing the way the sparsity …

Deep learning in systems medicine

H Wang, E Pujos-Guillot, B Comte… - Briefings in …, 2021 - academic.oup.com
Systems medicine (SM) has emerged as a powerful tool for studying the human
body at the systems level with the aim of improving our understanding, prevention and …

Structural alignment for network pruning through partial regularization

S Gao, Z Zhang, Y Zhang, F Huang… - Proceedings of the …, 2023 - openaccess.thecvf.com
In this paper, we propose a novel channel pruning method to reduce the computational and
storage costs of Convolutional Neural Networks (CNNs). Many existing one-shot pruning …

Collaborative channel pruning for deep networks

H Peng, J Wu, S Chen, J Huang - … conference on machine …, 2019 - proceedings.mlr.press
Deep networks have achieved impressive performance in various domains, but their
applications are largely limited by the prohibitive computational overhead. In this paper, we …

Layer-wise training convolutional neural networks with smaller filters for human activity recognition using wearable sensors

Y Tang, Q Teng, L Zhang, F Min, J He - IEEE Sensors Journal, 2020 - ieeexplore.ieee.org
Recently, convolutional neural networks (CNNs) have set the latest state of the art on various
human activity recognition (HAR) datasets. However, deep CNNs often require more …

Fast OSCAR and OWL regression via safe screening rules

R Bao, B Gu, H Huang - International conference on …, 2020 - proceedings.mlr.press
Ordered Weighted $L_{1}$ (OWL) regularized regression is a new regression
analysis for high-dimensional sparse learning. Proximal gradient methods are used as …

EDP: An efficient decomposition and pruning scheme for convolutional neural network compression

X Ruan, Y Liu, C Yuan, B Li, W Hu, Y Li… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Model compression methods have become popular in recent years, which aim to alleviate
the heavy load of deep neural networks (DNNs) in real-world applications. However, most of …

OICSR: Out-in-channel sparsity regularization for compact deep neural networks

J Li, Q Qi, J Wang, C Ge, Y Li… - Proceedings of the …, 2019 - openaccess.thecvf.com
Channel pruning can significantly accelerate and compress deep neural networks. Many
channel pruning works utilize structured sparsity regularization to zero out all the weights in …

Spatial-temporal federated learning for lifelong person re-identification on distributed edges

L Zhang, G Gao, H Zhang - … on Circuits and Systems for Video …, 2023 - ieeexplore.ieee.org
Data drift is a thorny challenge when deploying person re-identification (ReID) models into
real-world devices, where the data distribution is significantly different from that of the …