Knowledge distillation with the reused teacher classifier

D Chen, JP Mei, H Zhang, C Wang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation aims to compress a powerful yet cumbersome teacher model
into a lightweight student model without much sacrifice of performance. For this purpose …
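
For orientation, the entry above refers to the standard teacher-to-student compression setting. A minimal sketch of the classic softened-logit distillation loss (the generic Hinton-style objective, not the reused-classifier method of this particular paper) is shown below; the temperature `T` and weight `alpha` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # alpha balances imitation of the teacher against fitting the labels.
    return alpha * soft + (1.0 - alpha) * hard
```
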

Data-free knowledge distillation via feature exchange and activation region constraint

S Yu, J Chen, H Han, S Jiang - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Despite the tremendous progress on data-free knowledge distillation (DFKD) based on
synthetic data generation, there are still limitations in diverse and efficient data synthesis. It …
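
Several entries in this list follow the generic DFKD recipe in which the student is trained on synthetic inputs drawn from a generator instead of the original data. A minimal sketch of a single student update under that setup follows; `teacher`, `student`, `generator`, the latent size, and the temperature are assumed placeholders, and concrete methods (including the feature-exchange approach above) add further losses, particularly on the generator side.

```python
import torch
import torch.nn.functional as F

def dfkd_student_step(generator, teacher, student, optimizer,
                      batch_size=64, latent_dim=100, T=4.0):
    # Sample latent codes and synthesize pseudo-inputs; neither the generator
    # nor the teacher is updated in this student step.
    z = torch.randn(batch_size, latent_dim)
    with torch.no_grad():
        x = generator(z)
        t_logits = teacher(x)          # teacher logits on the synthetic batch
    s_logits = student(x)
    # Student mimics the teacher's softened class distribution on synthetic data.
    loss = F.kl_div(
        F.log_softmax(s_logits / T, dim=1),
        F.softmax(t_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```
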

Up to 100x faster data-free knowledge distillation

G Fang, K Mo, X Wang, J Song, S Bei… - Proceedings of the …, 2022 - ojs.aaai.org
Data-free knowledge distillation (DFKD) has recently been attracting increasing attention
from research communities, attributed to its capability to compress a model only using …

VkD: Improving knowledge distillation using orthogonal projections

R Miles, I Elezi, J Deng - … of the IEEE/CVF Conference on …, 2024 - openaccess.thecvf.com
Knowledge distillation is an effective method for training small and efficient deep
learning models. However, the efficacy of a single method can degenerate when transferring …

Learning to retain while acquiring: Combating distribution-shift in adversarial data-free knowledge distillation

G Patel, KR Mopuri, Q Qiu - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Data-free Knowledge Distillation (DFKD) has gained popularity recently, with the
fundamental idea of carrying out knowledge transfer from a Teacher neural network to a …

DFRD: Data-free robustness distillation for heterogeneous federated learning

S Wang, Y Fu, X Li, Y Lan… - Advances in Neural …, 2023 - proceedings.neurips.cc
Federated Learning (FL) is a privacy-constrained decentralized machine learning paradigm
in which clients enable collaborative training without compromising private data. However …

Small scale data-free knowledge distillation

H Liu, Y Wang, H Liu, F Sun… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Data-free knowledge distillation is able to utilize the knowledge learned by a large teacher
network to augment the training of a smaller student network without accessing the original …

Unpacking the gap box against data-free knowledge distillation

Y Wang, B Qian, H Liu, Y Rui… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Data-free knowledge distillation (DFKD) improves the student model (S) by mimicking the
class probability from a pre-trained teacher model (T) without training data. Under such …

Data-free sketch-based image retrieval

A Chaudhuri, AK Bhunia, YZ Song… - Proceedings of the …, 2023 - openaccess.thecvf.com
Rising concerns about privacy and anonymity preservation of deep learning models have
facilitated research in data-free learning. Primarily based on data-free knowledge distillation …

Privacy leakage on dnns: A survey of model inversion attacks and defenses

H Fang, Y Qiu, H Yu, W Yu, J Kong, B Chong… - arXiv preprint arXiv …, 2024 - arxiv.org
Deep Neural Networks (DNNs) have revolutionized various domains with their exceptional
performance across numerous applications. However, Model Inversion (MI) attacks, which …