Retrieval-Augmented Generation with Graphs (GraphRAG)

H Han, Y Wang, H Shomer, K Guo, J Ding, Y Lei… - arXiv preprint arXiv …, 2024 - arxiv.org
Retrieval-augmented generation (RAG) is a powerful technique that enhances downstream
task execution by retrieving additional information, such as knowledge, skills, and tools from …
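
The snippet describes the generic RAG loop: embed the query, retrieve the closest supporting passages, and prepend them to the prompt handed to the generator. Below is a minimal Python sketch of that plain-text loop; the hashing-based embed function and the tiny in-memory corpus are placeholders for illustration only, and the sketch does not reflect the graph-based retrieval this survey actually covers.

import numpy as np

def embed(text: str) -> np.ndarray:
    # Stand-in embedding: hashed character trigram counts, L2-normalized.
    # A real system would use a learned text encoder here.
    vec = np.zeros(256)
    for i in range(len(text) - 2):
        vec[hash(text[i:i + 3]) % 256] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    # Rank corpus passages by cosine similarity to the query embedding.
    q = embed(query)
    scores = [float(q @ embed(doc)) for doc in corpus]
    top = np.argsort(scores)[::-1][:k]
    return [corpus[i] for i in top]

def rag_prompt(query: str, corpus: list[str]) -> str:
    # Retrieval-augmented prompt: retrieved passages become the context
    # that a generator LLM would then condition on.
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

corpus = [
    "GraphRAG retrieves subgraphs instead of isolated text chunks.",
    "Two-tower models score user-item pairs with an inner product.",
    "Positional encodings inject structure into transformers.",
]
print(rag_prompt("How does GraphRAG differ from plain RAG?", corpus))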

Learning Efficient Positional Encodings with Graph Neural Networks

CI Kanatsoulis, E Choi, S Jegelka, J Leskovec… - arXiv preprint arXiv …, 2025 - arxiv.org
Positional encodings (PEs) are essential for effective graph representation learning because
they provide position awareness in inherently position-agnostic transformer architectures …
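
The snippet states the standard motivation: transformers are position-agnostic, so graph nodes need explicit positional signals. A common baseline, assumed here and not necessarily the encoding this paper proposes, is to use eigenvectors of the normalized graph Laplacian as per-node positional vectors.

import numpy as np

def laplacian_pe(adj: np.ndarray, k: int) -> np.ndarray:
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}.
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    lap = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    # Eigenvectors of the k smallest non-trivial eigenvalues give one
    # k-dimensional positional vector per node.
    _, eigvecs = np.linalg.eigh(lap)   # eigenvalues come back in ascending order
    return eigvecs[:, 1:k + 1]

# Toy 4-node path graph 0-1-2-3; the (4, 2) PE matrix would be concatenated
# to the node features fed into the transformer.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(laplacian_pe(adj, k=2))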

ContextGNN: Beyond Two-Tower Recommendation Systems

Y Yuan, Z Zhang, X He, A Nitta, W Hu, D Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
Recommendation systems predominantly utilize two-tower architectures, which evaluate
user-item rankings through the inner product of their respective embeddings. However, one …
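
For context, the two-tower scoring the snippet contrasts against fits in a few lines: each tower embeds its own side independently, and the ranking score is the inner product of the two embeddings. The PyTorch module below is a generic baseline sketch with arbitrary dimensions, not the ContextGNN architecture itself.

import torch
import torch.nn as nn

class TwoTower(nn.Module):
    # Generic two-tower recommender: independent user and item encoders,
    # scored by the inner product of their output embeddings.
    def __init__(self, n_users: int, n_items: int, dim: int = 32):
        super().__init__()
        self.user_tower = nn.Embedding(n_users, dim)
        self.item_tower = nn.Embedding(n_items, dim)

    def forward(self, user_ids: torch.Tensor, item_ids: torch.Tensor) -> torch.Tensor:
        u = self.user_tower(user_ids)   # (batch, dim)
        v = self.item_tower(item_ids)   # (batch, dim)
        return (u * v).sum(dim=-1)      # one inner-product score per pair

model = TwoTower(n_users=100, n_items=500)
scores = model(torch.tensor([0, 1]), torch.tensor([42, 7]))
print(scores)   # one ranking score per (user, item) pair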

Transformers Meet Relational Databases

J Peleška, G Šír - arXiv preprint arXiv:2412.05218, 2024 - arxiv.org
Transformer models have continuously expanded into all machine learning domains
convertible to the underlying sequence-to-sequence representation, including tabular data …
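
One standard way tabular or relational records become convertible to a sequence representation is to linearize each row into text before tokenization. The helper below is an illustrative serialization template, not the encoding scheme used in this paper.

def serialize_row(row: dict, table: str) -> str:
    # Generic linearization of one relational-table row into a text
    # sequence a subword-level transformer can consume.
    cells = ", ".join(f"{col} is {val}" for col, val in row.items())
    return f"[{table}] {cells}"

row = {"customer_id": 17, "country": "DE", "orders_last_30d": 4}
print(serialize_row(row, table="customers"))
# -> "[customers] customer_id is 17, country is DE, orders_last_30d is 4"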

Hypergraph Neural Networks with Logic Clauses

JPG de Souza, G Zaverucha… - 2024 International Joint …, 2024 - ieeexplore.ieee.org
The analysis of structure in complex datasets has become essential to solving difficult
Machine Learning problems. Relational aspects of data, capturing relationships between …

Tackling prediction tasks in relational databases with LLMs

M Wydmuch, Ł Borchmann, F Graliński - arXiv preprint arXiv:2411.11829, 2024 - arxiv.org
Though large language models (LLMs) have demonstrated exceptional performance across
numerous problems, their application to predictive tasks in relational databases remains …