Self-cooperation knowledge distillation for novel class discovery
Novel Class Discovery (NCD) aims to discover unknown and novel classes in an
unlabeled set by leveraging knowledge already learned about known classes. Existing …
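For concreteness, a minimal sketch of the NCD setting (not this paper's method): embeddings from a backbone trained on the known classes are clustered, and each cluster proposes a candidate novel class. The backbone choice, cluster count, and random stand-in batch are illustrative assumptions.

```python
import torch
import torchvision.models as models
from sklearn.cluster import KMeans

# Backbone standing in for a network pretrained on the known classes
# (weights=None keeps the sketch offline; a real run would load trained weights).
backbone = models.resnet18(weights=None)
backbone.fc = torch.nn.Identity()  # keep penultimate 512-d features
backbone.eval()

unlabeled_images = torch.randn(64, 3, 224, 224)  # stand-in unlabeled set
with torch.no_grad():
    feats = backbone(unlabeled_images)           # (64, 512) embeddings

# Cluster the embeddings; each cluster is a proposed novel class.
pseudo_labels = KMeans(n_clusters=5, n_init=10).fit_predict(feats.numpy())
```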
Sampling to distill: Knowledge transfer from open-world data
Data-Free Knowledge Distillation (DFKD) is a novel task that aims to train high-performance
student models using only a pre-trained teacher network, without access to the original training data …
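As a point of reference, the distillation step itself commonly reduces to matching temperature-softened teacher and student logits with a KL term. The sketch below assumes that standard formulation, with hypothetical stand-in networks and a random batch in place of the sampled open-world data.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Soften both distributions with temperature T, then match them via KL.
    The T*T factor keeps gradient magnitudes comparable across temperatures."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

teacher = torch.nn.Linear(128, 10)   # stand-ins for real networks
student = torch.nn.Linear(128, 10)
x = torch.randn(32, 128)             # stand-in surrogate batch

loss = kd_loss(student(x), teacher(x).detach())  # teacher is frozen
loss.backward()                                  # gradients flow to student only
```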
Unlocking Tuning-Free Few-Shot Adaptability in Visual Foundation Models by Recycling Pre-Tuned LoRAs
Large Language Models (LLMs) such as ChatGPT demonstrate strong few-shot adaptability
without requiring fine-tuning, making them ideal for data-limited and real-time …
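One way "recycling" pre-tuned LoRAs without further tuning could look is summing several low-rank deltas B @ A into the frozen base weight; the sketch below assumes that reading. The rank, uniform averaging, and scaling factor are illustrative assumptions, not the paper's combination rule.

```python
import torch

d, r, n_loras = 512, 8, 3
W = torch.randn(d, d)                             # frozen base weight

# (B, A) pairs standing in for LoRAs pre-tuned on different tasks, each of rank r.
loras = [(torch.randn(d, r), torch.randn(r, d)) for _ in range(n_loras)]

# Recycle: average the low-rank deltas, then fold them into the base weight.
delta = sum(B @ A for B, A in loras) / n_loras    # uniform weights (assumption)
W_adapted = W + 0.5 * delta                       # 0.5 = illustrative scaling
```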
Hybrid Data-Free Knowledge Distillation
J Tang, S Chen, C Gong - arXiv preprint arXiv:2412.13525, 2024 - arxiv.org
Data-free knowledge distillation aims to learn a compact student network from a pre-trained
large teacher network without using the teacher's original training data. Existing …
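On one common reading, a "hybrid" setup distills over a mix of generator-synthesized samples and a small real surrogate batch. The sketch below assumes that reading; the generator, networks, and mixing ratio are stand-ins, and the adversarial training of the generator is omitted.

```python
import torch
import torch.nn.functional as F

G = torch.nn.Linear(64, 128)          # stand-in generator: noise -> "images"
teacher = torch.nn.Linear(128, 10)    # stand-in pre-trained teacher (frozen)
student = torch.nn.Linear(128, 10)

x_syn = G(torch.randn(16, 64)).detach()  # synthesized batch (generator frozen here)
x_real = torch.randn(16, 128)            # small real surrogate batch (assumption)
x = torch.cat([x_syn, x_real])           # hybrid batch for distillation

loss = F.kl_div(F.log_softmax(student(x), dim=1),
                F.softmax(teacher(x), dim=1).detach(),
                reduction="batchmean")
loss.backward()                          # updates the student only
```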