Teacher-student architecture for knowledge distillation: A survey

C Hu, X Li, D Liu, H Wu, X Chen, J Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Although deep neural networks (DNNs) have shown a strong capacity to solve large-scale
problems in many areas, such DNNs are hard to deploy in real-world systems due to …

Contrastive self-supervised learning in recommender systems: A survey

M Jing, Y Zhu, T Zang, K Wang - ACM Transactions on Information …, 2023 - dl.acm.org
Deep learning-based recommender systems have achieved remarkable success in recent
years. However, these methods typically rely heavily on labeled data (i.e., user-item …

Dynamic sparse learning: A novel paradigm for efficient recommendation

S Wang, Y Sui, J Wu, Z Zheng, H Xiong - Proceedings of the 17th ACM …, 2024 - dl.acm.org
In the realm of deep learning-based recommendation systems, the increasing computational
demands, driven by the growing number of users and items, pose a significant challenge to …

Causal recommendation: Progresses and future directions

W Wang, Y Zhang, H Li, P Wu, F Feng… - Proceedings of the 46th …, 2023 - dl.acm.org
Data-driven recommender systems have demonstrated great success in various Web
applications owing to the extraordinary ability of machine learning models to recognize …

Distillation matters: empowering sequential recommenders to match the performance of large language models

Y Cui, F Liu, P Wang, B Wang, H Tang, Y Wan… - Proceedings of the 18th …, 2024 - dl.acm.org
Owing to their powerful semantic reasoning capabilities, Large Language Models (LLMs)
have been effectively utilized as recommenders, achieving impressive performance …

Invariant debiasing learning for recommendation via biased imputation

T Bai, W Chen, C Yang, C Shi - Information Processing & Management, 2025 - Elsevier
Previous debiasing studies utilize unbiased data to supervise model training, but
obtaining such data entails high trial risks and experimental costs. Recent …

Multi-Modal Knowledge Distillation for Recommendation with Prompt-Tuning

W Wei, J Tang, L Xia, Y Jiang… - The Web Conference 2024, 2024 - openreview.net
Multimedia online platforms, such as Amazon and TikTok, have greatly benefited from the
incorporation of multimedia content (e.g., visual, textual, and acoustic modalities) into their …

Toward Cross-Lingual Social Event Detection with Hybrid Knowledge Distillation

J Ren, H Peng, L Jiang, Z Hao, J Wu, S Gao… - ACM Transactions on …, 2024 - dl.acm.org
Recently published graph neural networks (GNNs) show promising performance on social
event detection tasks. However, most studies are oriented toward monolingual data in …

RD-Suite: A benchmark for ranking distillation

Z Qin, R Jagerman, RK Pasumarthi… - Advances in …, 2023 - proceedings.neurips.cc
The distillation of ranking models has become an important topic in both academia and
industry. In recent years, several advanced methods have been proposed to tackle this …

Unbiased, Effective, and Efficient Distillation from Heterogeneous Models for Recommender Systems

SK Kang, W Kweon, D Lee, J Lian, X Xie… - ACM Transactions on …, 2024 - dl.acm.org
In recent years, recommender systems have achieved remarkable performance by using
ensembles of heterogeneous models. However, this approach is costly due to the resources …