Teacher-student architecture for knowledge distillation: A survey
Although deep neural networks (DNNs) have shown a strong capacity to solve large-scale
problems in many areas, such DNNs are hard to deploy in real-world systems due to …
Contrastive self-supervised learning in recommender systems: A survey
Deep learning-based recommender systems have achieved remarkable success in recent
years. However, these methods usually rely heavily on labeled data (i.e., user-item …
Dynamic sparse learning: A novel paradigm for efficient recommendation
In the realm of deep learning-based recommendation systems, the increasing computational
demands, driven by the growing number of users and items, pose a significant challenge to …
Causal recommendation: Progresses and future directions
Data-driven recommender systems have demonstrated great success in various Web
applications owing to the extraordinary ability of machine learning models to recognize …
Distillation matters: empowering sequential recommenders to match the performance of large language models
Owing to their powerful semantic reasoning capabilities, Large Language Models (LLMs)
have been effectively utilized as recommenders, achieving impressive performance …
Invariant debiasing learning for recommendation via biased imputation
Previous debiasing studies utilize unbiased data to supervise model training.
They suffer from the high trial risks and experimental costs of obtaining unbiased data. Recent …
Multi-Modal Knowledge Distillation for Recommendation with Prompt-Tuning
Multimedia online platforms, such as Amazon and TikTok, have greatly benefited from the
incorporation of multimedia content (e.g., visual, textual, and acoustic modalities) into their …
Toward Cross-Lingual Social Event Detection with Hybrid Knowledge Distillation
Recently published graph neural networks (GNNs) show promising performance on social
event detection tasks. However, most studies are oriented toward monolingual data in …
RD-Suite: A benchmark for ranking distillation
The distillation of ranking models has become an important topic in both academia and
industry. In recent years, several advanced methods have been proposed to tackle this …
Unbiased, Effective, and Efficient Distillation from Heterogeneous Models for Recommender Systems
In recent years, recommender systems have achieved remarkable performance by using
ensembles of heterogeneous models. However, this approach is costly due to the resources …