Deep epidemiological modeling by black-box knowledge distillation: an accurate deep learning model for COVID-19
An accurate and efficient forecasting system is imperative to the prevention of emerging
infectious diseases such as COVID-19 in public health. This system requires accurate …
UDON: Universal Dynamic Online distillatioN for generic image representations
Universal image representations are critical in enabling real-world fine-grained and instance-
level recognition applications, where objects and entities from any domain must be identified …
StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation
Knowledge distillation (KD) has been recognized as an effective tool to compress and
accelerate models. However, current KD approaches generally suffer from an accuracy drop …
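The entries above all build on the standard teacher–student distillation objective: the student is trained to match the teacher's temperature-softened output distribution. A minimal sketch of that objective, assuming the common softened-softmax KL formulation (function names here are illustrative, not from any of the listed papers):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the softened teacher and student
    # distributions, scaled by T^2 so gradients stay comparable
    # across temperatures (as in the classic KD formulation).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# Matching logits give zero loss; divergence from the teacher raises it.
same = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
diff = distillation_loss([0.1, 1.0, 2.0], [2.0, 1.0, 0.1])
```

In practice this KL term is combined with an ordinary cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient; the surveyed works vary where and how the teacher signal is injected (black-box outputs, intermediate blocks, or the model's own embeddings).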
S2SD: simultaneous similarity-based self-distillation for deep metric learning
Deep Metric Learning (DML) provides a crucial tool for visual similarity and zero-shot
applications by learning generalizing embedding spaces, although recent work in DML has …
Improving Deep Neural Network Training with Knowledge Distillation
D Wang - 2023 - stars.library.ucf.edu
Abstract: Knowledge distillation, as a popular compression technique, has been widely used
to reduce deep neural network (DNN) size for a variety of applications. However, in recent …