Large language models on graphs: A comprehensive survey

B Jin, G Liu, C Han, M Jiang, H Ji… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Large language models (LLMs), such as GPT-4 and LLaMA, are creating significant
advancements in natural language processing, due to their strong text encoding/decoding …

A comprehensive survey on trustworthy recommender systems

W Fan, X Zhao, X Chen, J Su, J Gao, L Wang… - arXiv preprint arXiv …, 2022 - arxiv.org
As one of the most successful AI-powered applications, recommender systems aim to help
people make appropriate decisions in an effective and efficient way, by providing …

Graphtext: Graph reasoning in text space

J Zhao, L Zhuo, Y Shen, M Qu, K Liu… - arXiv preprint arXiv …, 2023 - arxiv.org
Large Language Models (LLMs) have gained the ability to assimilate human knowledge and
facilitate natural language interactions with both humans and other LLMs. However, despite …

Graphformers: GNN-nested transformers for representation learning on textual graph

J Yang, Z Liu, S Xiao, C Li, D Lian… - Advances in …, 2021 - proceedings.neurips.cc
Representation learning on textual graphs aims to generate low-dimensional embeddings for
the nodes based on the individual textual features and the neighbourhood information …

Efficiently leveraging multi-level user intent for session-based recommendation via atten-mixer network

P Zhang, J Guo, C Li, Y Xie, JB Kim, Y Zhang… - Proceedings of the …, 2023 - dl.acm.org
Session-based recommendation (SBR) aims to predict the user's next action based on short
and dynamic sessions. Recently, there has been an increasing interest in utilizing various …

Can GNN be Good Adapter for LLMs?

X Huang, K Han, Y Yang, D Bao, Q Tao… - Proceedings of the …, 2024 - dl.acm.org
Recently, large language models (LLMs) have demonstrated superior capabilities in
understanding and zero-shot learning on textual data, promising significant advances for …

A comprehensive study on text-attributed graphs: Benchmarking and rethinking

H Yan, C Li, R Long, C Yan, J Zhao… - Advances in …, 2023 - proceedings.neurips.cc
Text-attributed graphs (TAGs) are prevalent in various real-world scenarios, where each
node is associated with a text description. The cornerstone of representation learning on …

AdaMCT: adaptive mixture of CNN-transformer for sequential recommendation

J Jiang, P Zhang, Y Luo, C Li, JB Kim, K Zhang… - Proceedings of the …, 2023 - dl.acm.org
Sequential recommendation (SR) aims to model users' dynamic preferences from a series of
interactions. A pivotal challenge in user modeling for SR lies in the inherent variability of …

Edgeformers: Graph-empowered transformers for representation learning on textual-edge networks

B Jin, Y Zhang, Y Meng, J Han - arXiv preprint arXiv:2302.11050, 2023 - arxiv.org
Edges in many real-world social/information networks are associated with rich text
information (e.g., user-user communications or user-product reviews). However, mainstream …

Heterformer: Transformer-based deep node representation learning on heterogeneous text-rich networks

B Jin, Y Zhang, Q Zhu, J Han - Proceedings of the 29th ACM SIGKDD …, 2023 - dl.acm.org
Representation learning on networks aims to derive a meaningful vector representation for
each node, thereby facilitating downstream tasks such as link prediction, node classification …