Everything is connected: Graph neural networks
P Veličković - Current Opinion in Structural Biology, 2023 - Elsevier
In many ways, graphs are the main modality of data we receive from nature, because most of
the patterns we see, both in natural and artificial systems, are elegantly …
Graph neural networks for materials science and chemistry
Machine learning plays an increasingly important role in many areas of chemistry
and materials science, being used to predict materials properties, accelerate simulations …
Recipe for a general, powerful, scalable graph transformer
We propose a recipe for building a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …
NodeFormer: A scalable graph structure learning transformer for node classification
Graph neural networks have been extensively studied for learning with inter-connected data.
Despite this, recent evidence has revealed GNNs' deficiencies related to over-squashing …
On over-squashing in message passing neural networks: The impact of width, depth, and topology
Message Passing Neural Networks (MPNNs) are instances of Graph Neural
Networks that leverage the graph to send messages over the edges. This inductive bias …
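The mechanism this entry describes, nodes exchanging messages over edges, can be sketched in a few lines. This is a generic illustration, not code from the cited paper; the function name `mpnn_layer` and the sum-aggregation/ReLU choices are assumptions for clarity.

```python
import numpy as np

def mpnn_layer(A, X, W):
    """One message-passing step (illustrative sketch).

    A: (n, n) adjacency matrix; X: (n, d) node features; W: (d, d_out) weights.
    Each node sums its neighbors' features (the "messages"), then a shared
    linear transform and ReLU are applied.
    """
    messages = A @ X                       # row i = sum of node i's neighbor features
    return np.maximum(messages @ W, 0.0)   # ReLU(A X W)

# Tiny 3-node path graph 0 - 1 - 2, one-hot features, identity weights.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3)
W = np.eye(3)
H = mpnn_layer(A, X, W)
# The middle node receives messages from both of its 1-hop neighbors.
```

Note that information travels only one hop per layer, which is the inductive bias the over-squashing and long-range-benchmark entries below are probing.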
Long range graph benchmark
Graph Neural Networks (GNNs) that are based on the message passing (MP)
paradigm generally exchange information between 1-hop neighbors to build node …
Structure-aware transformer for graph representation learning
D Chen, L O'Bray, K Borgwardt - … Conference on Machine …, 2022 - proceedings.mlr.press
The Transformer architecture has gained growing attention in graph representation learning
recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …
A survey on oversmoothing in graph neural networks
Node features of graph neural networks (GNNs) tend to become more similar as the
network depth increases. This effect is known as over-smoothing, which we …
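The effect this survey describes is easy to reproduce: repeatedly applying symmetric-normalized neighborhood averaging (the propagation rule used by many GNN layers, here stripped of weights and nonlinearities) collapses distinct node features toward a common value. A minimal sketch, with the graph and feature values chosen only for illustration:

```python
import numpy as np

def normalized_propagate(A, X):
    """One GCN-style propagation step: D^{-1/2} (A + I) D^{-1/2} X."""
    A_hat = A + np.eye(len(A))          # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)
    return D_inv_sqrt @ A_hat @ D_inv_sqrt @ X

# Path graph 0 - 1 - 2 with clearly distinct scalar features per node.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[1.], [0.], [-1.]])

for _ in range(50):                     # 50 "layers" of pure propagation
    X = normalized_propagate(A, X)

spread = float(X.max() - X.min())       # shrinks geometrically toward 0
```

After 50 propagation steps the features are numerically indistinguishable, which is the depth-induced similarity the snippet refers to.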
Graph inductive biases in transformers without message passing
Transformers for graph data are increasingly widely studied and successful in numerous
learning tasks. Graph inductive biases are crucial for Graph Transformers, and previous …
How attentive are graph attention networks?
Graph Attention Networks (GATs) are one of the most popular GNN architectures and are
considered the state-of-the-art architecture for representation learning with graphs. In …