Recipe for a general, powerful, scalable graph transformer

L Rampášek, M Galkin, VP Dwivedi… - Advances in …, 2022 - proceedings.neurips.cc
We propose a recipe for building a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …

Graph inductive biases in transformers without message passing

L Ma, C Lin, D Lim, A Romero-Soriano… - International …, 2023 - proceedings.mlr.press
Transformers for graph data are increasingly studied and have proven successful in numerous
learning tasks. Graph inductive biases are crucial for Graph Transformers, and previous …

Understanding and extending subgraph GNNs by rethinking their symmetries

F Frasca, B Bevilacqua… - Advances in Neural …, 2022 - proceedings.neurips.cc
Subgraph GNNs are a recent class of expressive Graph Neural Networks (GNNs) which
model graphs as collections of subgraphs. So far, the design space of possible Subgraph …

Exphormer: Sparse transformers for graphs

H Shirzad, A Velingker… - International …, 2023 - proceedings.mlr.press
Graph transformers have emerged as a promising architecture for a variety of graph learning
and representation tasks. Despite their successes, it remains challenging to scale …

Universal prompt tuning for graph neural networks

T Fang, Y Zhang, Y Yang, C Wang… - Advances in Neural …, 2023 - proceedings.neurips.cc
In recent years, prompt tuning has sparked a surge of research on adapting pre-trained models.
Unlike the unified pre-training strategy employed in the language field, the graph field …

A generalization of ViT/MLP-Mixer to graphs

X He, B Hooi, T Laurent, A Perold… - International …, 2023 - proceedings.mlr.press
Graph Neural Networks (GNNs) have shown great potential in the field of graph
representation learning. Standard GNNs define a local message-passing mechanism which …

How powerful are k-hop message passing graph neural networks

J Feng, Y Chen, F Li, A Sarkar… - Advances in Neural …, 2022 - proceedings.neurips.cc
The most popular design paradigm for Graph Neural Networks (GNNs) is 1-hop message
passing: aggregating information from 1-hop neighbors repeatedly. However, the …

Rethinking the expressive power of GNNs via graph biconnectivity

B Zhang, S Luo, L Wang, D He - arXiv preprint arXiv:2301.09505, 2023 - arxiv.org
Designing expressive Graph Neural Networks (GNNs) is a central topic in learning graph-
structured data. While numerous approaches have been proposed to improve GNNs in …

Weisfeiler and Leman go machine learning: The story so far

C Morris, Y Lipman, H Maron, B Rieck… - The Journal of Machine …, 2023 - dl.acm.org
In recent years, algorithms and neural architectures based on the Weisfeiler-Leman
algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a …

Substructure aware graph neural networks

D Zeng, W Liu, W Chen, L Zhou, M Zhang… - Proceedings of the AAAI …, 2023 - ojs.aaai.org
Despite the great achievements of Graph Neural Networks (GNNs) in graph learning,
conventional GNNs struggle to break through the upper limit of the expressiveness of first …