AI alignment: A comprehensive survey

J Ji, T Qiu, B Chen, B Zhang, H Lou, K Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
AI alignment aims to make AI systems behave in line with human intentions and values. As
AI systems grow more capable, so do risks from misalignment. To provide a comprehensive …

Explainable recommendation: A survey and new perspectives

Y Zhang, X Chen - Foundations and Trends® in Information …, 2020 - nowpublishers.com
Explainable recommendation attempts to develop models that generate not only high-quality
recommendations but also intuitive explanations. The explanations may either be post-hoc …

Aligning distillation for cold-start item recommendation

F Huang, Z Wang, X Huang, Y Qian, Z Li… - Proceedings of the 46th …, 2023 - dl.acm.org
Recommending cold items in recommendation systems is a longstanding challenge due to
the inherent differences between warm items, which are recommended based on user …

A general knowledge distillation framework for counterfactual recommendation via uniform data

D Liu, P Cheng, Z Dong, X He, W Pan… - Proceedings of the 43rd …, 2020 - dl.acm.org
Recommender systems are feedback loop systems, which often face bias problems such as
popularity bias, previous model bias and position bias. In this paper, we focus on solving the …
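
For readers less familiar with this line of debiasing work, the sketch below illustrates inverse propensity scoring (IPS), a standard counterfactual correction for exposure bias in implicit feedback; the propensity estimates, tensor shapes, and function names are illustrative assumptions and are not drawn from the cited paper.

```python
# Generic sketch of inverse-propensity-scored (IPS) training for debiasing
# implicit-feedback recommendation. Illustrative assumption only; this is not
# the distillation framework proposed in the cited paper.
import torch

def ips_weighted_bce(scores, labels, propensities, clip=0.05):
    """Binary cross-entropy where each observed interaction is reweighted by
    1 / propensity to counteract exposure bias (e.g., popularity bias)."""
    weights = 1.0 / propensities.clamp(min=clip)  # clip to bound variance
    return torch.nn.functional.binary_cross_entropy_with_logits(
        scores, labels, weight=weights, reduction="mean")

# Example: propensities estimated from item popularity on the logged data.
scores = torch.randn(4)                      # model logits for 4 interactions
labels = torch.tensor([1.0, 0.0, 1.0, 1.0])  # observed clicks
props = torch.tensor([0.8, 0.3, 0.05, 0.4])  # estimated exposure probabilities
print(ips_weighted_bce(scores, labels, props))
```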

An explainable recommendation framework based on an improved knowledge graph attention network with massive volumes of side information

R Shimizu, M Matsutani, M Goto - Knowledge-Based Systems, 2022 - Elsevier
In recent years, explainable recommendation has been a topic of active study. This is
because the branch of the machine learning field concerned with such methodologies is enabling …

Comprehensible artificial intelligence on knowledge graphs: A survey

S Schramm, C Wehner, U Schmid - Journal of Web Semantics, 2023 - Elsevier
Artificial Intelligence applications are gradually moving beyond the safe walls of research labs and
into our daily lives. This is also true for Machine Learning methods on Knowledge …

Unbiased knowledge distillation for recommendation

G Chen, J Chen, F Feng, S Zhou, X He - … on web search and data mining, 2023 - dl.acm.org
As a promising solution for model compression, knowledge distillation (KD) has been
applied in recommender systems (RS) to reduce inference latency. Traditional solutions first …
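
As background on how knowledge distillation is typically instantiated in recommendation, the following is a minimal sketch of soft-label distillation from a large teacher ranker to a compact student; the loss weighting, temperature, and variable names are assumptions for illustration and do not reproduce the unbiased distillation scheme of the cited paper.

```python
# Minimal sketch of knowledge distillation for a recommender: a small student
# is trained to match a large teacher's temperature-softened item scores in
# addition to the ground-truth labels. Generic illustration only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend the hard-label loss with a KL term on softened score distributions."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1 - alpha) * soft

# Example with a batch of 2 users scoring 5 candidate items each.
student = torch.randn(2, 5)
teacher = torch.randn(2, 5)
labels = torch.tensor([3, 0])  # index of the interacted item per user
print(distillation_loss(student, teacher, labels))
```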

A Revisiting Study of Appropriate Offline Evaluation for Top-N Recommendation Algorithms

WX Zhao, Z Lin, Z Feng, P Wang, JR Wen - ACM Transactions on …, 2022 - dl.acm.org
In recommender systems, top-N recommendation is an important task with implicit feedback
data. Although the recent success of deep learning has largely pushed forward research on …

The datasets dilemma: How much do we really know about recommendation datasets?

JY Chin, Y Chen, G Cong - … Conference on Web Search and Data …, 2022 - dl.acm.org
There has been sustained interest from both academia and industry over the years due to
the importance and practicality of recommendation systems. However, several …

Ensembled CTR prediction via knowledge distillation

J Zhu, J Liu, W Li, J Lai, X He, L Chen… - Proceedings of the 29th …, 2020 - dl.acm.org
Recently, deep learning-based models have been widely studied for click-through rate
(CTR) prediction and have led to improved prediction accuracy in many industrial applications …
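
To make the ensembling-plus-distillation idea concrete, here is a hedged sketch in which the averaged click probabilities of several teacher CTR models supervise a single student alongside the logged labels; the loss blend and all names are illustrative assumptions rather than the cited system's design.

```python
# Hedged sketch of ensemble distillation for CTR prediction: several teachers'
# click probabilities are averaged into a soft target that supervises a single
# student, alongside the logged click labels.
import torch
import torch.nn.functional as F

def ensemble_ctr_distill_loss(student_logit, teacher_probs, clicks, alpha=0.5):
    """teacher_probs: (num_teachers, batch) predicted click probabilities."""
    soft_target = teacher_probs.mean(dim=0)            # ensemble soft label
    student_prob = torch.sigmoid(student_logit)
    distill = F.binary_cross_entropy(student_prob, soft_target)
    supervised = F.binary_cross_entropy(student_prob, clicks)
    return alpha * supervised + (1 - alpha) * distill

# Example: 3 teachers, batch of 4 impressions.
teacher_probs = torch.rand(3, 4)
student_logit = torch.randn(4)
clicks = torch.tensor([1.0, 0.0, 0.0, 1.0])
print(ensemble_ctr_distill_loss(student_logit, teacher_probs, clicks))
```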