Channel Self-Supervision for Online Knowledge Distillation
S Fan, X Cheng, X Wang, C Yang, P Deng… - arxiv preprint arxiv …, 2022 - arxiv.org
Recently, researchers have shown an increased interest in online knowledge distillation.
Adopting a one-stage and end-to-end training fashion, online knowledge distillation uses …
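To make the one-stage, end-to-end setup concrete, below is a minimal sketch of generic online knowledge distillation, in which peer networks are trained jointly and each peer distills from the others' soft predictions. The peer architecture, loss weights, temperature, and data are placeholder assumptions for illustration; this is not the paper's channel self-supervision method.

```python
# Illustrative sketch of online (mutual) knowledge distillation.
# All hyperparameters and the toy peer network are assumptions, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_peer(num_classes: int = 10) -> nn.Module:
    # Placeholder peer classifier; any backbone could be used here.
    return nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(),
                         nn.Linear(128, num_classes))

def online_kd_step(peers, optimizer, x, y, T: float = 3.0, alpha: float = 1.0):
    """One end-to-end step: each peer gets its own cross-entropy loss plus a
    KL term toward every other peer's (detached) soft output at temperature T."""
    logits = [p(x) for p in peers]
    loss = 0.0
    for i, zi in enumerate(logits):
        loss = loss + F.cross_entropy(zi, y)
        for j, zj in enumerate(logits):
            if i != j:
                loss = loss + alpha * (T * T) * F.kl_div(
                    F.log_softmax(zi / T, dim=1),
                    F.softmax(zj.detach() / T, dim=1),
                    reduction="batchmean",
                )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    peers = [make_peer(), make_peer()]
    params = [p for net in peers for p in net.parameters()]
    opt = torch.optim.SGD(params, lr=0.01)
    x, y = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
    print(online_kd_step(peers, opt, x, y))
```

Because all peers are updated in a single optimization loop, no pre-trained teacher or separate distillation stage is needed, which is what the one-stage, end-to-end formulation refers to.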