Knowledge distillation on graphs: A survey

Y Tian, S Pei, X Zhang, C Zhang, N Chawla - ACM Computing Surveys, 2023 - dl.acm.org
Graph Neural Networks (GNNs) have received significant attention for demonstrating their
capability to handle graph data. However, they are difficult to deploy in resource …

A survey of safety and trustworthiness of large language models through the lens of verification and validation

X Huang, W Ruan, W Huang, G Jin, Y Dong… - Artificial Intelligence …, 2024 - Springer
Large language models (LLMs) have ignited a new wave of AI enthusiasm through their ability to
engage end-users in human-level conversations with detailed and articulate answers across …

Online speculative decoding

X Liu, L Hu, P Bailis, A Cheung, Z Deng, I Stoica… - arXiv preprint arXiv …, 2023 - arxiv.org
Speculative decoding is a pivotal technique to accelerate the inference of large language
models (LLMs) by employing a smaller draft model to predict the target model's outputs …
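
The snippet above summarizes the core mechanism: a cheap draft model proposes several tokens and the large target model verifies them. Below is a minimal greedy-decoding sketch of that general idea (not the paper's online-updating variant); `draft_next` and `target_next` are hypothetical toy stand-ins for the two models.

```python
# Minimal sketch of greedy speculative decoding (the general technique, not the
# paper's online variant). `draft_next` / `target_next` are toy placeholder models.

def draft_next(tokens):
    # toy draft model: next token is the last token + 1 (mod 100)
    return (tokens[-1] + 1) % 100

def target_next(tokens):
    # toy target model: agrees with the draft unless the next token is a multiple of 7
    nxt = (tokens[-1] + 1) % 100
    return nxt if nxt % 7 else (nxt + 2) % 100

def speculative_decode(prompt, max_new_tokens=20, k=4):
    """Draft k tokens cheaply, then verify them with the target model.
    The matching prefix is kept; the first mismatch is replaced by the target's token."""
    tokens = list(prompt)
    while len(tokens) - len(prompt) < max_new_tokens:
        # 1) the draft model proposes k tokens autoregressively
        draft, ctx = [], list(tokens)
        for _ in range(k):
            t = draft_next(ctx)
            draft.append(t)
            ctx.append(t)
        # 2) the target model verifies the drafted tokens position by position
        accepted, ctx = [], list(tokens)
        for t in draft:
            t_star = target_next(ctx)
            if t_star == t:
                accepted.append(t)
                ctx.append(t)
            else:
                accepted.append(t_star)  # correct the first mismatch and stop this round
                break
        tokens.extend(accepted)
    return tokens[:len(prompt) + max_new_tokens]

print(speculative_decode([1], max_new_tokens=10, k=4))
```

In practice the target model verifies all k drafted positions in one batched forward pass, which is where the speed-up over token-by-token decoding comes from.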

SeD: Semantic-aware discriminator for image super-resolution

B Li, X Li, H Zhu, Y Jin, R Feng… - Proceedings of the …, 2024 - openaccess.thecvf.com
Generative Adversarial Networks (GANs) have been widely used to recover vivid
textures in image super-resolution (SR) tasks. In particular, one discriminator is utilized to …
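
For context, the sketch below shows the generic GAN objective for super-resolution that such discriminators plug into; it is not SeD's semantic-aware discriminator, and the tiny generator and discriminator are hypothetical placeholders.

```python
# Minimal sketch of the generic GAN-for-SR objective (plain discriminator,
# not SeD's semantic-aware one). The networks below are toy placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

generator = nn.Sequential(          # upsamples a low-res image 2x
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Upsample(scale_factor=2, mode="nearest"),
    nn.Conv2d(16, 3, 3, padding=1),
)
discriminator = nn.Sequential(      # scores how "real" a high-res image looks
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1),
)

lr = torch.rand(4, 3, 32, 32)       # low-res batch
hr = torch.rand(4, 3, 64, 64)       # matching high-res ground truth
sr = generator(lr)

# Discriminator loss: real HR images -> 1, generated SR images -> 0
d_loss = F.binary_cross_entropy_with_logits(discriminator(hr), torch.ones(4, 1)) + \
         F.binary_cross_entropy_with_logits(discriminator(sr.detach()), torch.zeros(4, 1))

# Generator loss: pixel reconstruction plus an adversarial term (fool the discriminator)
g_loss = F.l1_loss(sr, hr) + \
         0.01 * F.binary_cross_entropy_with_logits(discriminator(sr), torch.ones(4, 1))
print(float(d_loss), float(g_loss))
```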

DaFKD: Domain-aware federated knowledge distillation

H Wang, Y Li, W Xu, R Li, Y Zhan… - Proceedings of the …, 2023 - openaccess.thecvf.com
Federated Distillation (FD) has recently attracted increasing attention for its efficiency in
aggregating multiple diverse local models trained from statistically heterogeneous data of …
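
As background, here is a minimal sketch of plain federated distillation, where the server distills the averaged predictions of heterogeneous client models on a shared proxy set into a global model; DaFKD's domain-aware weighting of clients is not modeled, and all networks and data below are placeholders.

```python
# Minimal sketch of vanilla federated distillation (DaFKD's domain-aware
# client weighting is omitted). All models and data are toy placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_model():
    return nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 5))

clients = [make_model() for _ in range(3)]   # locally trained client models
server = make_model()                        # global model updated by distillation
proxy_x = torch.randn(64, 20)                # shared unlabeled proxy data on the server

opt = torch.optim.SGD(server.parameters(), lr=0.1)
T = 2.0                                      # distillation temperature

for _ in range(50):
    with torch.no_grad():
        # ensemble teacher: average the clients' softened predictions
        teacher_probs = torch.stack(
            [F.softmax(c(proxy_x) / T, dim=-1) for c in clients]
        ).mean(dim=0)
    student_logp = F.log_softmax(server(proxy_x) / T, dim=-1)
    loss = F.kl_div(student_logp, teacher_probs, reduction="batchmean") * T * T
    opt.zero_grad()
    loss.backward()
    opt.step()
```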

D3still: Decoupled Differential Distillation for Asymmetric Image Retrieval

Y Xie, Y Lin, W Cai, X Xu, H Zhang… - Proceedings of the …, 2024 - openaccess.thecvf.com
Existing methods for asymmetric image retrieval employ a rigid pairwise similarity constraint
between the query network and the larger gallery network. However, these one-to-one …
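
The "rigid pairwise similarity constraint" mentioned above can be pictured as forcing query-to-gallery similarities to match the gallery network's own similarities; the sketch below illustrates that baseline constraint (not D3still's decoupled relaxation), with placeholder networks.

```python
# Minimal sketch of the rigid pairwise-similarity constraint used by prior
# asymmetric-retrieval methods (D3still's relaxation is not shown).
import torch
import torch.nn as nn
import torch.nn.functional as F

gallery_net = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 64))  # large, frozen
query_net = nn.Sequential(nn.Linear(128, 64))                                    # lightweight student

images = torch.randn(32, 128)   # a batch standing in for image features

with torch.no_grad():
    g = F.normalize(gallery_net(images), dim=-1)   # gallery embeddings (targets)
q = F.normalize(query_net(images), dim=-1)         # query embeddings (trainable)

# pairwise similarity matrices: teacher (gallery-gallery) vs. cross (query-gallery)
teacher_sim = g @ g.t()
cross_sim = q @ g.t()

# the "rigid" constraint: force every cross similarity to match the teacher's
loss = F.mse_loss(cross_sim, teacher_sim)
loss.backward()
```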

Towards a smaller student: Capacity dynamic distillation for efficient image retrieval

Y Xie, H Zhang, X Xu, J Zhu… - 2023 IEEE/CVF …, 2023 - ieeexplore.ieee.org
Previous Knowledge Distillation-based efficient image retrieval methods employ a
lightweight network as the student model for fast inference. However, the lightweight student …

Skill-transferring knowledge distillation method

S Yang, L Xu, M Zhou, X Yang, J Yang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Knowledge distillation is a deep learning method that mimics the way humans teach, i.e.,
a teacher network is used to guide the training of a student network. Knowledge distillation can …
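
For reference, the standard distillation objective implied by this description combines a softened teacher-student KL term with the usual cross-entropy loss; the sketch below shows that generic objective (it is not the paper's skill-transferring method), with placeholder networks.

```python
# Minimal sketch of the standard knowledge-distillation loss (Hinton-style
# soft targets plus hard labels), with toy placeholder networks.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 5))  # pretrained, frozen
student = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 5))  # smaller network

x = torch.randn(8, 10)
y = torch.randint(0, 5, (8,))
T, alpha = 4.0, 0.7

with torch.no_grad():
    t_logits = teacher(x)
s_logits = student(x)

# soft targets: KL between temperature-softened teacher and student distributions
kd = F.kl_div(F.log_softmax(s_logits / T, dim=-1),
              F.softmax(t_logits / T, dim=-1),
              reduction="batchmean") * T * T
# hard targets: ordinary cross-entropy on the ground-truth labels
ce = F.cross_entropy(s_logits, y)

loss = alpha * kd + (1 - alpha) * ce
loss.backward()
```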

Learning from human educational wisdom: A student-centered knowledge distillation method

S Yang, J Yang, MC Zhou, Z Huang… - … on Pattern Analysis …, 2024 - ieeexplore.ieee.org
Existing studies on knowledge distillation typically focus on teacher-centered methods, in
which the teacher network is trained according to its own standards before transferring the …

Acceleration algorithms in GNNs: A survey

L Ma, Z Sheng, X Li, X Gao, Z Hao, L Yang… - arXiv preprint arXiv …, 2024 - arxiv.org
Graph Neural Networks (GNNs) have demonstrated effectiveness in various graph-based
tasks. However, their inefficiency in training and inference presents challenges for scaling …