A comprehensive survey on deep graph representation learning
Graph representation learning aims to effectively encode high-dimensional sparse graph-
structured data into low-dimensional dense vectors, which is a fundamental task that has …
Multimodal learning with graphs
Artificial intelligence for graphs has achieved remarkable success in modelling complex
systems, ranging from dynamic networks in biology to interacting particle systems in physics …
Recipe for a general, powerful, scalable graph transformer
We propose a recipe on how to build a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …
Do transformers really perform badly for graph representation?
The Transformer architecture has become a dominant choice in many domains, such as
natural language processing and computer vision. Yet, it has not achieved competitive …
Structure-aware transformer for graph representation learning
The Transformer architecture has gained growing attention in graph representation learning
recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …
Benchmarking graph neural networks
In the last few years, graph neural networks (GNNs) have become the standard toolkit for
analyzing and learning from data on graphs. This emerging field has witnessed an extensive …
Long range graph benchmark
Graph Neural Networks (GNNs) that are based on the message passing (MP)
paradigm generally exchange information between 1-hop neighbors to build node …
Pure transformers are powerful graph learners
We show that standard Transformers without graph-specific modifications can lead to
promising results in graph learning both in theory and practice. Given a graph, we simply …
Temporal graph benchmark for machine learning on temporal graphs
We present the Temporal Graph Benchmark (TGB), a collection of challenging and
diverse benchmark datasets for realistic, reproducible, and robust evaluation of machine …
On over-squashing in message passing neural networks: The impact of width, depth, and topology
Message Passing Neural Networks (MPNNs) are instances of Graph Neural
Networks that leverage the graph to send messages over the edges. This inductive bias …
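Several of the abstracts above refer to the 1-hop message passing paradigm: each node aggregates the states of its direct neighbors and combines them with its own state. The sketch below is a generic mean-aggregation update illustrating that idea only; it is not the scheme of any particular paper listed here, and the function name and the simple self/neighbor combination rule are illustrative assumptions.

```python
# Minimal sketch of one round of 1-hop message passing (mean aggregation).
# Hypothetical helper; not taken from any of the cited works.
import numpy as np

def message_passing_step(node_feats, edges):
    """node_feats: (num_nodes, dim) array; edges: list of (u, v) pairs, undirected."""
    num_nodes = node_feats.shape[0]
    agg = np.zeros_like(node_feats)
    degree = np.zeros(num_nodes)
    for u, v in edges:
        agg[u] += node_feats[v]   # message from v to u
        agg[v] += node_feats[u]   # message from u to v
        degree[u] += 1
        degree[v] += 1
    degree = np.maximum(degree, 1)             # guard isolated nodes
    neighbor_mean = agg / degree[:, None]      # mean over 1-hop neighbors
    return 0.5 * (node_feats + neighbor_mean)  # combine self state with messages

# Tiny usage example: a path graph 0-1-2 with 2-dimensional node features.
x = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(message_passing_step(x, [(0, 1), (1, 2)]))
```

Because each round only reaches 1-hop neighbors, propagating information across distant nodes requires stacking many such steps, which is the setting studied by the long-range and over-squashing papers above.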