Deep epidemiological modeling by black-box knowledge distillation: an accurate deep learning model for COVID-19

D Wang, S Zhang, L Wang - Proceedings of the AAAI Conference on …, 2021 - ojs.aaai.org
An accurate and efficient forecasting system is imperative to the prevention of emerging
infectious diseases such as COVID-19 in public health. This system requires accurate …

UDON: Universal Dynamic Online distillatioN for generic image representations

NA Ypsilantis, K Chen, A Araujo, O Chum - arXiv preprint arXiv …, 2024 - arxiv.org
Universal image representations are critical in enabling real-world fine-grained and instance-
level recognition applications, where objects and entities from any domain must be identified …

StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation

S Kao, J Chen, SH Chan - arXiv preprint arXiv:2312.13223, 2023 - arxiv.org
Knowledge distillation (KD) has been recognized as an effective tool to compress and
accelerate models. However, current KD approaches generally suffer from an accuracy drop …

S2SD: simultaneous similarity-based self-distillation for deep metric learning

K Roth, T Milbich, B Ommer, JP Cohen… - arXiv preprint arXiv …, 2020 - arxiv.org
Deep Metric Learning (DML) provides a crucial tool for visual similarity and zero-shot
applications by learning generalizing embedding spaces, although recent work in DML has …
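
This entry names similarity-based self-distillation for embedding learning. As a rough, generic illustration only (not a reproduction of S2SD's actual objective), the following PyTorch sketch distills a batch's similarity distribution from a full-dimensional embedding into a lower-dimensional auxiliary head; the dimensions and temperature are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def similarity_self_distillation_loss(full_emb, aux_emb, temperature=1.0):
    """Match the batch similarity distribution of an auxiliary (smaller)
    embedding head to that of the full embedding head.

    full_emb: (B, D_full) L2-normalized embeddings (teacher side, detached)
    aux_emb:  (B, D_aux)  L2-normalized embeddings (student side)
    """
    # Cosine-similarity matrices over the batch (B x B).
    sim_teacher = full_emb.detach() @ full_emb.detach().t()
    sim_student = aux_emb @ aux_emb.t()

    # Treat each row as a distribution over batch neighbors and match them.
    p_teacher = F.softmax(sim_teacher / temperature, dim=1)
    log_p_student = F.log_softmax(sim_student / temperature, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean")

# Example usage with random embeddings (batch of 8).
full = F.normalize(torch.randn(8, 512), dim=1)
aux = F.normalize(torch.randn(8, 128), dim=1)
print(similarity_self_distillation_loss(full, aux).item())
```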

Improving Deep Neural Network Training with Knowledge Distillation

D Wang - 2023 - stars.library.ucf.edu
Knowledge distillation, as a popular compression technique, has been widely used
to reduce deep neural network (DNN) size for a variety of applications. However, in recent …
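
Several of the entries above refer to standard soft-target knowledge distillation for compressing deep networks. For reference, here is a minimal PyTorch sketch of the classic Hinton-style KD objective (temperature-softened teacher targets combined with hard-label cross-entropy); the temperature and mixing weight `alpha` are illustrative choices, not values taken from any of the cited works.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.5):
    """Soft-target knowledge distillation loss.

    Combines a KL term between temperature-softened teacher and student
    distributions with the usual cross-entropy on ground-truth labels.
    """
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    distill = F.kl_div(log_student, soft_targets, reduction="batchmean") * temperature ** 2
    hard = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * hard

# Example usage with random logits for a 10-class problem (batch of 16).
student = torch.randn(16, 10)
teacher = torch.randn(16, 10)
labels = torch.randint(0, 10, (16,))
print(kd_loss(student, teacher, labels).item())
```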