Recipe for a general, powerful, scalable graph transformer
We propose a recipe on how to build a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …
Graph inductive biases in transformers without message passing
Transformers for graph data are increasingly widely studied and successful in numerous
learning tasks. Graph inductive biases are crucial for Graph Transformers, and previous …
Understanding and extending subgraph GNNs by rethinking their symmetries
Subgraph GNNs are a recent class of expressive Graph Neural Networks (GNNs) which
model graphs as collections of subgraphs. So far, the design space of possible Subgraph …
Exphormer: Sparse transformers for graphs
Graph transformers have emerged as a promising architecture for a variety of graph learning
and representation tasks. Despite their successes, though, it remains challenging to scale …
Universal prompt tuning for graph neural networks
In recent years, prompt tuning has sparked a research surge in adapting pre-trained models.
Unlike the unified pre-training strategy employed in the language field, the graph field …
A generalization of ViT/MLP-Mixer to graphs
Graph Neural Networks (GNNs) have shown great potential in the field of graph
representation learning. Standard GNNs define a local message-passing mechanism which …
How powerful are k-hop message passing graph neural networks
The most popular design paradigm for Graph Neural Networks (GNNs) is 1-hop message
passing---aggregating information from 1-hop neighbors repeatedly. However, the …
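The 1-hop aggregation mentioned in the excerpt above is easy to make concrete. Below is a minimal, hypothetical Python sketch (not code from the cited paper) of a single round of mean aggregation over 1-hop neighbors; the function name, toy graph, and features are illustrative assumptions. Stacking such rounds is what reaches k-hop neighbors.

```python
# Minimal sketch of one round of 1-hop message passing with mean aggregation.
# The toy graph and feature values are made up for illustration only.
import numpy as np

def one_hop_mean_aggregate(features: np.ndarray, edges: list[tuple[int, int]]) -> np.ndarray:
    """Return new node features where each node averages its 1-hop neighbors."""
    n = features.shape[0]
    out = features.copy()                # isolated nodes keep their own features
    neighbors = {i: [] for i in range(n)}
    for u, v in edges:                   # treat edges as undirected
        neighbors[u].append(v)
        neighbors[v].append(u)
    for i in range(n):
        if neighbors[i]:
            out[i] = features[neighbors[i]].mean(axis=0)
    return out

# Toy 4-node path graph 0-1-2-3 with 2-d node features.
x = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.0, 0.0]])
edges = [(0, 1), (1, 2), (2, 3)]
print(one_hop_mean_aggregate(x, edges))  # repeat the call to propagate information further
```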
Rethinking the expressive power of GNNs via graph biconnectivity
Designing expressive Graph Neural Networks (GNNs) is a central topic in learning graph-
structured data. While numerous approaches have been proposed to improve GNNs in …
Weisfeiler and Leman go machine learning: The story so far
In recent years, algorithms and neural architectures based on the Weisfeiler-Leman
algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a …
Substructure aware graph neural networks
Despite the great achievements of Graph Neural Networks (GNNs) in graph learning,
conventional GNNs struggle to break through the upper limit of the expressiveness of first …