Large language models on graphs: A comprehensive survey
Large language models (LLMs), such as GPT-4 and LLaMA, are creating significant
advancements in natural language processing, due to their strong text encoding/decoding …
A comprehensive survey on trustworthy recommender systems
As one of the most successful AI-powered applications, recommender systems aim to help
people make appropriate decisions in an effective and efficient way, by providing …
GraphText: Graph reasoning in text space
Large Language Models (LLMs) have gained the ability to assimilate human knowledge and
facilitate natural language interactions with both humans and other LLMs. However, despite …
GraphFormers: GNN-nested transformers for representation learning on textual graph
The representation learning on textual graph is to generate low-dimensional embeddings for
the nodes based on the individual textual features and the neighbourhood information …
Efficiently leveraging multi-level user intent for session-based recommendation via atten-mixer network
Session-based recommendation (SBR) aims to predict the user's next action based on short
and dynamic sessions. Recently, there has been an increasing interest in utilizing various …
Can GNN be Good Adapter for LLMs?
Recently, large language models (LLMs) have demonstrated superior capabilities in
understanding and zero-shot learning on textual data, promising significant advances for …
A comprehensive study on text-attributed graphs: Benchmarking and rethinking
Text-attributed graphs (TAGs) are prevalent in various real-world scenarios, where each
node is associated with a text description. The cornerstone of representation learning on …
AdaMCT: adaptive mixture of CNN-transformer for sequential recommendation
Sequential recommendation (SR) aims to model users' dynamic preferences from a series of
interactions. A pivotal challenge in user modeling for SR lies in the inherent variability of …
Edgeformers: Graph-empowered transformers for representation learning on textual-edge networks
Edges in many real-world social/information networks are associated with rich text
information (e.g., user-user communications or user-product reviews). However, mainstream …
Heterformer: Transformer-based deep node representation learning on heterogeneous text-rich networks
Representation learning on networks aims to derive a meaningful vector representation for
each node, thereby facilitating downstream tasks such as link prediction, node classification …