An overview of neural network compression
J O'Neill - arXiv preprint arXiv:2006.03669, 2020 - arxiv.org
Overparameterized networks trained to convergence have shown impressive performance
in domains such as computer vision and natural language processing. Pushing state of the …
Group sparsity: The hinge between filter pruning and decomposition for network compression
In this paper, we analyze two popular network compression techniques, i.e., filter pruning and
low-rank decomposition, in a unified sense. By simply changing the way the sparsity …
Deep learning in systems medicine
Systems medicine (SM) has emerged as a powerful tool for studying the human
body at the systems level with the aim of improving our understanding, prevention and …
Structural alignment for network pruning through partial regularization
In this paper, we propose a novel channel pruning method to reduce the computational and
storage costs of Convolutional Neural Networks (CNNs). Many existing one-shot pruning …
Collaborative channel pruning for deep networks
Deep networks have achieved impressive performance in various domains, but their
applications are largely limited by the prohibitive computational overhead. In this paper, we …
Layer-wise training convolutional neural networks with smaller filters for human activity recognition using wearable sensors
Recently, convolutional neural networks (CNNs) have set the latest state of the art on various
human activity recognition (HAR) datasets. However, deep CNNs often require more …
Fast OSCAR and OWL regression via safe screening rules
Ordered Weighted $L_1$ (OWL) regularized regression is a new regression
analysis for high-dimensional sparse learning. Proximal gradient methods are used as …
EDP: An efficient decomposition and pruning scheme for convolutional neural network compression
Model compression methods have become popular in recent years, which aim to alleviate
the heavy load of deep neural networks (DNNs) in real-world applications. However, most of …
OICSR: Out-in-channel sparsity regularization for compact deep neural networks
Channel pruning can significantly accelerate and compress deep neural networks. Many
channel pruning works utilize structured sparsity regularization to zero out all the weights in …
Spatial-temporal federated learning for lifelong person re-identification on distributed edges
Data drift is a thorny challenge when deploying person re-identification (ReID) models into
real-world devices, where the data distribution is significantly different from that of the …