Self-cooperation knowledge distillation for novel class discovery

Y Wang, Z Chen, D Yang, Y Sun, L Qi - European Conference on …, 2024 - Springer
Novel Class Discovery (NCD) aims to discover unknown and novel classes in an
unlabeled set by leveraging knowledge already learned about known classes. Existing …
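A minimal sketch of the generic NCD setup this abstract describes: features from a model pre-trained on known classes are clustered to assign pseudo-labels to an unlabeled set containing novel classes. The names (`known_encoder`, `num_novel_classes`) and the k-means step are illustrative assumptions, not the paper's method.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

# Stand-in for a backbone already trained on the known classes.
known_encoder = nn.Sequential(
    nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU(), nn.Linear(128, 64)
)

unlabeled = torch.randn(256, 3, 32, 32)      # unlabeled set with novel classes
with torch.no_grad():
    feats = known_encoder(unlabeled)         # transfer known-class knowledge

num_novel_classes = 5                        # assumed known, as in standard NCD
pseudo_labels = KMeans(n_clusters=num_novel_classes, n_init=10).fit_predict(
    feats.numpy()
)                                            # cluster IDs act as novel-class labels
```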

Sampling to distill: Knowledge transfer from open-world data

Y Wang, Z Chen, J Zhang, D Yang, Z Ge, Y Liu… - Proceedings of the …, 2024 - dl.acm.org
Data-Free Knowledge Distillation (DFKD) is a novel task that aims to train high-performance
student models using only a pre-trained teacher network, without the original training data …
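A minimal sketch of the idea in the title: rather than synthesizing inputs, rank freely available open-world images by teacher confidence and distill on the selected subset. The confidence-based selection rule and the tiny teacher/student stand-ins are assumptions for illustration, not the paper's actual sampling method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))  # pre-trained, frozen
student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))  # compact model
opt = torch.optim.SGD(student.parameters(), lr=0.1)

open_world = torch.randn(512, 3, 32, 32)   # unlabeled wild data, not the training set

# Select the samples the teacher is most confident about.
with torch.no_grad():
    conf = teacher(open_world).softmax(dim=1).max(dim=1).values
    keep = conf.topk(128).indices

batch = open_world[keep]
T = 4.0                                     # distillation temperature
s_logits = student(batch)
with torch.no_grad():
    t_soft = (teacher(batch) / T).softmax(dim=1)

# Standard KD loss: match student to softened teacher predictions.
loss = F.kl_div((s_logits / T).log_softmax(dim=1), t_soft,
                reduction="batchmean") * T * T
opt.zero_grad(); loss.backward(); opt.step()
```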

Unlocking Tuning-Free Few-Shot Adaptability in Visual Foundation Models by Recycling Pre-Tuned LoRAs

Z Hu, Y Wei, L Shen, C Yuan, D Tao - arXiv preprint arXiv:2412.02220, 2024 - arxiv.org
Large Language Models (LLMs) such as ChatGPT demonstrate strong few-shot adaptability
without requiring fine-tuning, positioning them as ideal for data-limited and real-time …
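A minimal sketch of tuning-free adaptation by recycling pre-tuned LoRAs: weight a bank of existing adapters by how well each fits a few-shot support set, then merge them with no gradient updates. The single linear layer, the adapter bank, and the softmax weighting are all illustrative assumptions; the paper's actual recipe is not reconstructed here.

```python
import torch
import torch.nn.functional as F

d, k, r = 64, 10, 4                           # feature width, classes, LoRA rank
W = torch.randn(k, d)                         # frozen base weight (stand-in)

# Bank of pre-tuned LoRA factors (A, B), e.g. from earlier tasks.
bank = [(torch.randn(r, d) * 0.1, torch.randn(k, r) * 0.1) for _ in range(3)]

x_sup = torch.randn(20, d)                    # few-shot support features
y_sup = torch.randint(0, k, (20,))            # few-shot support labels

with torch.no_grad():                         # tuning-free: forward passes only
    losses = torch.stack([
        F.cross_entropy(x_sup @ (W + B @ A).t(), y_sup) for A, B in bank
    ])
    weights = (-losses).softmax(dim=0)        # better-fitting adapters weigh more
    delta = sum(w * (B @ A) for w, (A, B) in zip(weights, bank))
    W_adapted = W + delta                     # merged weight, ready for inference
```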

Hybrid Data-Free Knowledge Distillation

J Tang, S Chen, C Gong - arXiv preprint arXiv:2412.13525, 2024 - arxiv.org
Data-free knowledge distillation aims to learn a compact student network from a pre-trained
large teacher network without using the original training data of the teacher network. Existing …
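A minimal sketch of the generator-based data-free distillation loop this abstract describes: synthesize inputs from noise, then match the student to the teacher on them. This shows only the generic DFKD objective with a confidence-based generator loss; the paper's specific hybrid strategy is not shown in the snippet and is not reconstructed here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))  # pre-trained
student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
generator = nn.Sequential(nn.Linear(100, 3 * 32 * 32), nn.Tanh())
for p in teacher.parameters():
    p.requires_grad_(False)                     # teacher stays frozen
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
s_opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for step in range(2):                           # tiny demo loop
    z = torch.randn(64, 100)
    fake = generator(z).view(-1, 3, 32, 32)

    # Generator step: produce samples the teacher classifies confidently
    # (a common one-hot surrogate for "teacher-plausible" inputs).
    t_logits = teacher(fake)
    g_loss = F.cross_entropy(t_logits, t_logits.argmax(dim=1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

    # Student step: distill teacher predictions on the synthetic batch.
    s_logits = student(fake.detach())
    with torch.no_grad():
        t_soft = teacher(fake.detach()).softmax(dim=1)
    s_loss = F.kl_div(s_logits.log_softmax(dim=1), t_soft,
                      reduction="batchmean")
    s_opt.zero_grad(); s_loss.backward(); s_opt.step()
```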