Knowledge distillation with the reused teacher classifier
Knowledge distillation aims to compress a powerful yet cumbersome teacher model
into a lightweight student model without much sacrifice of performance. For this purpose …
Data-free knowledge distillation via feature exchange and activation region constraint
Despite the tremendous progress on data-free knowledge distillation (DFKD) based on
synthetic data generation, there are still limitations in diverse and efficient data synthesis. It …
Up to 100x faster data-free knowledge distillation
Data-free knowledge distillation (DFKD) has recently been attracting increasing attention
from research communities, attributed to its capability to compress a model only using …
VkD: Improving knowledge distillation using orthogonal projections
Knowledge distillation is an effective method for training small and efficient deep
learning models. However, the efficacy of a single method can degenerate when transferring …
Learning to retain while acquiring: Combating distribution-shift in adversarial data-free knowledge distillation
Data-free Knowledge Distillation (DFKD) has gained popularity recently, with the
fundamental idea of carrying out knowledge transfer from a Teacher neural network to a …
DFRD: Data-free robustness distillation for heterogeneous federated learning
Federated Learning (FL) is a privacy-constrained decentralized machine learning paradigm
in which clients enable collaborative training without compromising private data. However …
Small scale data-free knowledge distillation
Data-free knowledge distillation is able to utilize the knowledge learned by a large teacher
network to augment the training of a smaller student network without accessing the original …
Unpacking the gap box against data-free knowledge distillation
Data-free knowledge distillation (DFKD) improves the student model (S) by mimicking the
class probability from a pre-trained teacher model (T) without training data. Under such …
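(For orientation only: the class-probability mimicking described in the DFKD entries above is the standard distillation objective, in which the student matches the teacher's temperature-softened outputs, here computed on synthetic or otherwise unlabeled inputs. Below is a minimal PyTorch-style sketch; the temperature value and the generate_batch() synthetic-data source are hypothetical placeholders, not taken from any of the listed papers.)

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # Student mimics the teacher's class probabilities, softened by temperature T.
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # KL divergence scaled by T^2 keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def dfkd_step(student, teacher, generate_batch, optimizer, T=4.0):
    # generate_batch() is a hypothetical stand-in for whatever synthetic-data
    # source a given DFKD method uses (model inversion, a trained generator, etc.).
    x = generate_batch()
    with torch.no_grad():
        teacher_logits = teacher(x)  # the teacher is frozen and pre-trained
    student_logits = student(x)
    loss = distillation_loss(student_logits, teacher_logits, T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()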
Data-free sketch-based image retrieval
Rising concerns about privacy and anonymity preservation of deep learning models have
facilitated research in data-free learning. Primarily based on data-free knowledge distillation …
Privacy leakage on DNNs: A survey of model inversion attacks and defenses
Deep Neural Networks (DNNs) have revolutionized various domains with their exceptional
performance across numerous applications. However, Model Inversion (MI) attacks, which …