Knowledge distillation on graphs: A survey
Graph Neural Networks (GNNs) have received significant attention for demonstrating their
capability to handle graph data. However, they are difficult to deploy in resource …
A survey of safety and trustworthiness of large language models through the lens of verification and validation
Large language models (LLMs) have sparked a new wave of AI enthusiasm for their ability to
engage end-users in human-level conversations with detailed and articulate answers across …
Online speculative decoding
Speculative decoding is a pivotal technique to accelerate the inference of large language
models (LLMs) by employing a smaller draft model to predict the target model's outputs …
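For orientation, a minimal sketch of the draft-then-verify loop that speculative decoding builds on, assuming toy stand-in models and a greedy exact-match acceptance rule; production systems verify with a probabilistic accept/reject step, and the paper's online adaptation of the draft model is not shown.

```python
# Toy sketch of speculative decoding with greedy verification.
# `draft_model` and `target_model` are stand-in functions returning the
# next token for a prefix; real systems work on LLM logits and use a
# probabilistic accept/reject rule rather than exact-match checking.

def speculative_decode(prefix, draft_model, target_model, k=4, max_new=16):
    out = list(prefix)
    while len(out) - len(prefix) < max_new:
        # 1. Draft model cheaply proposes k candidate tokens.
        proposal, ctx = [], list(out)
        for _ in range(k):
            t = draft_model(ctx)
            proposal.append(t)
            ctx.append(t)
        # 2. Target model verifies the proposals (one batched pass in
        #    practice); greedy rule: keep the longest agreeing prefix.
        accepted, ctx = [], list(out)
        for t in proposal:
            if target_model(ctx) == t:
                accepted.append(t)
                ctx.append(t)
            else:
                break
        out.extend(accepted)
        # 3. On the first disagreement, fall back to the target's token,
        #    so every emitted token matches what the target would produce.
        if len(accepted) < k:
            out.append(target_model(out))
    return out

# Toy models: the draft agrees with the target most of the time.
target = lambda ctx: (len(ctx) * 7) % 5
draft = lambda ctx: (len(ctx) * 7) % 5 if len(ctx) % 3 else 0
print(speculative_decode([1, 2], draft, target, k=4, max_new=8))
```

The speedup comes from step 2: when the draft is usually right, the expensive target model validates several tokens per call instead of generating one at a time.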
SeD: Semantic-aware discriminator for image super-resolution
Generative Adversarial Networks (GANs) have been widely used to recover vivid
textures in image super-resolution (SR) tasks. In particular, one discriminator is utilized to …
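As background for the snippet above, a minimal PyTorch sketch of the vanilla adversarial objective used in GAN-based SR, with a toy patch discriminator; SeD's semantic-aware conditioning of the discriminator is deliberately omitted, and the layers and shapes here are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Toy patch discriminator; real SR discriminators are much deeper.
disc = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(16, 1, 3, stride=2, padding=1),
)
bce = nn.BCEWithLogitsLoss()

hr = torch.rand(4, 3, 32, 32)  # ground-truth high-resolution patches
sr = torch.rand(4, 3, 32, 32)  # super-resolved generator outputs

# Discriminator: score real HR patches as 1, SR patches as 0.
d_real, d_fake = disc(hr), disc(sr.detach())
d_loss = bce(d_real, torch.ones_like(d_real)) \
       + bce(d_fake, torch.zeros_like(d_fake))

# Generator: fool the discriminator into scoring SR patches as real,
# which is what pushes it toward sharp, vivid textures.
g_adv = bce(disc(sr), torch.ones_like(d_fake))
print(d_loss.item(), g_adv.item())
```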
DaFKD: Domain-aware federated knowledge distillation
Federated Distillation (FD) has recently attracted increasing attention for its efficiency in
aggregating multiple diverse local models trained from statistically heterogeneous data of …
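A minimal PyTorch sketch of the generic federated-distillation aggregation step the snippet alludes to, assuming a shared transfer set and a plain uniform average of client logits; DaFKD's domain-aware weighting of clients is not reproduced, and `fd_server_step` is a hypothetical helper name.

```python
import torch
import torch.nn.functional as F

# Generic federated distillation: clients share soft predictions on a
# public transfer set instead of model weights, and the server distills
# toward their averaged logits. DaFKD replaces this uniform average
# with domain-aware importance weights, omitted here.
def fd_server_step(server, client_models, transfer_x, opt, T=2.0):
    with torch.no_grad():
        client_logits = torch.stack([m(transfer_x) for m in client_models])
        ensemble = client_logits.mean(dim=0)  # aggregate client knowledge
    loss = F.kl_div(                           # distill into the server
        F.log_softmax(server(transfer_x) / T, dim=-1),
        F.softmax(ensemble / T, dim=-1),
        reduction="batchmean",
    ) * T * T
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

clients = [torch.nn.Linear(8, 4) for _ in range(3)]
server = torch.nn.Linear(8, 4)
opt = torch.optim.SGD(server.parameters(), lr=0.1)
print(fd_server_step(server, clients, torch.randn(16, 8), opt))
```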
D3still: Decoupled Differential Distillation for Asymmetric Image Retrieval
Existing methods for asymmetric image retrieval employ a rigid pairwise similarity constraint
between the query network and the larger gallery network. However, these one-to-one …
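A minimal PyTorch sketch of the rigid pairwise similarity constraint the snippet criticizes, assuming stand-in linear encoders: each query-network embedding is pulled onto the gallery network's embedding of the same image so the two feature spaces stay comparable. D3still's decoupled differential loss, which relaxes exactly this constraint, is not shown.

```python
import torch
import torch.nn.functional as F

gallery_net = torch.nn.Linear(32, 16)  # stand-in for the large model
query_net = torch.nn.Linear(32, 16)    # stand-in for the small model

x = torch.randn(8, 32)                 # a batch of images (as features)
with torch.no_grad():
    g = F.normalize(gallery_net(x), dim=-1)  # frozen gallery embeddings
q = F.normalize(query_net(x), dim=-1)

# Rigid one-to-one constraint: each (q_i, g_i) pair must align, i.e.
# minimize the cosine distance between matched embeddings.
loss = (1 - (q * g).sum(dim=-1)).mean()
loss.backward()
```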
Towards a smaller student: Capacity dynamic distillation for efficient image retrieval
Previous Knowledge Distillation-based efficient image retrieval methods employ a
lightweight network as the student model for fast inference. However, the lightweight student …
Skill-transferring knowledge distillation method
Knowledge distillation is a deep learning method that mimics the way that humans teach, i.e.,
a teacher network is used to guide the training of a student network. Knowledge distillation can …
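Since the snippet defines knowledge distillation itself, a minimal PyTorch sketch of the standard temperature-softened distillation objective (Hinton-style) may help; the paper's skill-transferring variant builds on, but is not captured by, this baseline.

```python
import torch
import torch.nn.functional as F

# Standard knowledge-distillation loss: the student matches the
# teacher's temperature-softened output distribution alongside the
# usual hard-label cross-entropy.
def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * T * T                      # T^2 keeps the gradient scale stable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

s = torch.randn(8, 10, requires_grad=True)  # student logits
t = torch.randn(8, 10)                      # frozen teacher logits
y = torch.randint(0, 10, (8,))
kd_loss(s, t, y).backward()
```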
Learning from human educational wisdom: A student-centered knowledge distillation method
Existing studies on knowledge distillation typically focus on teacher-centered methods, in
which the teacher network is trained according to its own standards before transferring the …
Acceleration algorithms in GNNs: A survey
Graph Neural Networks (GNNs) have demonstrated effectiveness in various graph-based
tasks. However, their inefficiency in training and inference presents challenges for scaling …