The evolution of distributed systems for graph neural networks and their origin in graph processing and deep learning: A survey

J Vatter, R Mayer, HA Jacobsen - ACM Computing Surveys, 2023 - dl.acm.org
Graph neural networks (GNNs) are an emerging research field. This specialized deep
neural network architecture is capable of processing graph-structured data and bridges the …

Scientific large language models: A survey on biological & chemical domains

Q Zhang, K Ding, T Lv, X Wang, Q Yin, Y Zhang… - ACM Computing …, 2024 - dl.acm.org
Large Language Models (LLMs) have emerged as a transformative power in enhancing
natural language comprehension, representing a significant stride toward artificial general …

Recipe for a general, powerful, scalable graph transformer

L Rampášek, M Galkin, VP Dwivedi… - Advances in …, 2022 - proceedings.neurips.cc
We propose a recipe for building a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …
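
The abstract's core claim is an architecture recipe: each layer combines a local message-passing module with a global attention module. Below is a minimal numpy sketch of that layer pattern; all names and shapes are our own illustrative assumptions, and the paper's linear-complexity attention is replaced here by plain softmax attention for brevity.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def gps_like_layer(X, A):
    """X: (n, d) node features; A: (n, n) 0/1 adjacency matrix."""
    # Local branch: mean over 1-hop neighbors (a stand-in for any MPNN).
    deg = A.sum(axis=1, keepdims=True).clip(min=1.0)
    local = (A @ X) / deg
    # Global branch: full self-attention over all node pairs. The GPS
    # recipe attains linear complexity by using a linear attention here.
    attn = softmax((X @ X.T) / np.sqrt(X.shape[1]))
    global_branch = attn @ X
    # Combine the two branches (residuals and norms omitted for brevity).
    return local + global_branch

X = np.random.randn(5, 8)
A = (np.random.rand(5, 5) > 0.5).astype(float)
print(gps_like_layer(X, A).shape)  # (5, 8)
```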

Temporal graph benchmark for machine learning on temporal graphs

S Huang, F Poursafaei, J Danovitch… - Advances in …, 2024 - proceedings.neurips.cc
We present the Temporal Graph Benchmark (TGB), a collection of challenging and
diverse benchmark datasets for realistic, reproducible, and robust evaluation of machine …

Long range graph benchmark

VP Dwivedi, L Rampášek, M Galkin… - Advances in …, 2022 - proceedings.neurips.cc
Graph Neural Networks (GNNs) that are based on the message passing (MP)
paradigm generally exchange information between 1-hop neighbors to build node …
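
The snippet names the mechanism exactly: in one message-passing layer, each node aggregates features only from its 1-hop neighbors. A minimal numpy sketch of such a step follows; the sum-then-ReLU update is an illustrative assumption, not any particular paper's layer.

```python
import numpy as np

def message_passing_step(X, A, W_self, W_neigh):
    """One MP layer. X: (n, d) node features; A: (n, n) adjacency."""
    neigh_sum = A @ X                      # sum of 1-hop neighbor features
    H = X @ W_self + neigh_sum @ W_neigh   # combine self and neighbors
    return np.maximum(H, 0.0)              # ReLU update

rng = np.random.default_rng(0)
n, d = 6, 4
X = rng.normal(size=(n, d))
A = np.eye(n, k=1) + np.eye(n, k=-1)       # a path graph
H = message_passing_step(X, A, rng.normal(size=(d, d)), rng.normal(size=(d, d)))
print(H.shape)  # (6, 4)
```

Stacking L such layers propagates information at most L hops, which is precisely the limitation a long-range benchmark is designed to expose.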

Do transformers really perform badly for graph representation?

C Ying, T Cai, S Luo, S Zheng, G Ke… - Advances in neural …, 2021 - proceedings.neurips.cc
The Transformer architecture has become a dominant choice in many domains, such as
natural language processing and computer vision. Yet, it has not achieved competitive …

Graph inductive biases in transformers without message passing

L Ma, C Lin, D Lim, A Romero-Soriano… - International …, 2023 - proceedings.mlr.press
Transformers for graph data are increasingly studied and successful in numerous
learning tasks. Graph inductive biases are crucial for Graph Transformers, and previous …

DRew: Dynamically rewired message passing with delay

B Gutteridge, X Dong, MM Bronstein… - International …, 2023 - proceedings.mlr.press
Message passing neural networks (MPNNs) have been shown to suffer from
over-squashing, a phenomenon that causes poor performance on tasks relying on long …
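
One way to read "dynamically rewired message passing with delay": at layer t, a node also receives messages from nodes up to t hops away, and farther senders contribute older (delayed) representations. The numpy sketch below implements that reading; it is our illustrative paraphrase of the abstract, not the paper's algorithm.

```python
import numpy as np

def hop_masks(A, max_hops):
    """masks[k-1][i, j] == 1 iff shortest-path distance(i, j) == k."""
    n = A.shape[0]
    dist = np.full((n, n), np.inf)
    np.fill_diagonal(dist, 0.0)
    reach = np.eye(n)
    for k in range(1, max_hops + 1):
        reach = np.clip(reach @ A, 0.0, 1.0)   # support of walks of length k
        newly = (reach > 0) & np.isinf(dist)
        dist[newly] = k
    return [(dist == k).astype(float) for k in range(1, max_hops + 1)]

def drew_like_forward(X, A, num_layers):
    masks = hop_masks(A, num_layers)
    history = [X]                          # node states after each layer
    for t in range(1, num_layers + 1):
        h = history[-1].copy()
        for k in range(1, t + 1):          # rewiring: k-hop senders join once t >= k
            delayed = history[-k]          # delay: farther senders use older states
            deg = masks[k - 1].sum(axis=1, keepdims=True).clip(min=1.0)
            h = h + (masks[k - 1] @ delayed) / deg
        history.append(np.maximum(h, 0.0))
    return history[-1]

A = np.eye(6, k=1) + np.eye(6, k=-1)       # path graph: prone to over-squashing
X = np.random.randn(6, 4)
print(drew_like_forward(X, A, num_layers=3).shape)  # (6, 4)
```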

Equiformer: Equivariant graph attention transformer for 3d atomistic graphs

YL Liao, T Smidt - arXiv preprint arXiv:2206.11990, 2022 - arxiv.org
Despite their widespread success in various domains, Transformer networks have yet to
perform well across datasets of 3D atomistic graphs, such as molecules, even …

Neural Bellman-Ford networks: A general graph neural network framework for link prediction

Z Zhu, Z Zhang, LP Xhonneux… - Advances in Neural …, 2021 - proceedings.neurips.cc
Link prediction is a fundamental task on graphs. Inspired by traditional path-based
methods, in this paper we propose a general and flexible representation learning framework …
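
For context on the "traditional path-based methods" the abstract invokes: the classic Bellman-Ford recursion computes shortest-path distances by repeated relaxation over the (min, +) semiring. A minimal sketch follows; per the title, NBFNet generalizes this recursion, roughly by learning the message and aggregation operators, and that characterization is our paraphrase rather than the paper's exact formulation.

```python
import math

def bellman_ford(num_nodes, edges, source):
    """edges: list of (u, v, weight). Returns distances from source."""
    dist = [math.inf] * num_nodes
    dist[source] = 0.0
    for _ in range(num_nodes - 1):        # at most n-1 relaxation rounds
        for u, v, w in edges:
            # (min, +) semiring: aggregate = min, message = dist[u] + w.
            # A neural generalization would learn both operators.
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist

edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0)]
print(bellman_ford(3, edges, source=0))   # [0.0, 1.0, 3.0]
```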