A survey on deep neural network pruning: Taxonomy, comparison, analysis, and recommendations

H Cheng, M Zhang, JQ Shi - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
Modern deep neural networks, particularly recent large language models, come with
massive model sizes that require significant computational and storage resources. To …

Exploring the landscape of machine unlearning: A comprehensive survey and taxonomy

T Shaik, X Tao, H Xie, L Li, X Zhu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Machine unlearning (MU) is gaining increasing attention due to the need to remove or
modify predictions made by machine learning (ML) models. While training models have …

Model sparsity can simplify machine unlearning

J Jia, J Liu, P Ram, Y Yao, G Liu, Y Liu… - Advances in …, 2023 - proceedings.neurips.cc
In response to recent data regulation requirements, machine unlearning (MU) has emerged
as a critical process to remove the influence of specific examples from a given model …

SPViT: Enabling faster vision transformers via latency-aware soft token pruning

Z Kong, P Dong, X Ma, X Meng, W Niu, M Sun… - European conference on …, 2022 - Springer
Recently, Vision Transformer (ViT) has continuously established new milestones in
the computer vision field, while the high computation and memory cost makes its …

CHEX: Channel exploration for CNN model compression

Z Hou, M Qin, F Sun, X Ma, K Yuan… - Proceedings of the …, 2022 - openaccess.thecvf.com
Channel pruning has been broadly recognized as an effective technique to reduce the
computation and memory cost of deep convolutional neural networks. However …

Federated dynamic sparse training: Computing less, communicating less, yet learning better

S Bibikar, H Vikalo, Z Wang, X Chen - Proceedings of the AAAI …, 2022 - ojs.aaai.org
Federated learning (FL) enables distribution of machine learning workloads from the cloud
to resource-limited edge devices. Unfortunately, current deep networks remain not only too …

Advancing model pruning via bi-level optimization

Y Zhang, Y Yao, P Ram, P Zhao… - Advances in …, 2022 - proceedings.neurips.cc
The deployment constraints in practical applications necessitate the pruning of large-scale
deep learning models, i.e., promoting their weight sparsity. As illustrated by the Lottery Ticket …

An introduction to bilevel optimization: Foundations and applications in signal processing and machine learning

Y Zhang, P Khanduri, I Tsaknakis, Y Yao… - IEEE Signal …, 2024 - ieeexplore.ieee.org
Recently, bilevel optimization (BLO) has taken center stage in some very exciting
developments in the area of signal processing (SP) and machine learning (ML). Roughly …

What makes unlearning hard and what to do about it

K Zhao, M Kurmanji, GO Bărbulescu… - Advances in …, 2025 - proceedings.neurips.cc
Machine unlearning is the problem of removing the effect of a subset of training
data (the "forget set") from a trained model without damaging the model's utility, e.g., to comply …

Recent advances on neural network pruning at initialization

H Wang, C Qin, Y Bai, Y Zhang, Y Fu - arXiv preprint arXiv:2103.06460, 2021 - arxiv.org
Neural network pruning typically removes connections or neurons from a pretrained
converged model, while a new pruning paradigm, pruning at initialization (PaI), attempts to …