Transformer for graphs: An overview from architecture perspective

E Min, R Chen, Y Bian, T Xu, K Zhao, W Huang… - arXiv preprint arXiv …, 2022 - arxiv.org
Recently, the Transformer model, which has achieved great success in many artificial
intelligence fields, has demonstrated its great potential in modeling graph-structured data …

Accelerating the integration of ChatGPT and other large-scale AI models into biomedical research and healthcare

DQ Wang, LY Feng, JG Ye, JG Zou… - MedComm–Future …, 2023 - Wiley Online Library
Large-scale artificial intelligence (AI) models such as ChatGPT have the potential to
improve performance on many benchmarks and real-world tasks. However, it is difficult to …

Recipe for a general, powerful, scalable graph transformer

L Rampášek, M Galkin, VP Dwivedi… - Advances in …, 2022 - proceedings.neurips.cc
We propose a recipe for building a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …

On over-squashing in message passing neural networks: The impact of width, depth, and topology

F Di Giovanni, L Giusti, F Barbero… - International …, 2023 - proceedings.mlr.press
Message Passing Neural Networks (MPNNs) are instances of Graph Neural
Networks that leverage the graph to send messages over the edges. This inductive bias …

Long range graph benchmark

VP Dwivedi, L Rampášek, M Galkin… - Advances in …, 2022 - proceedings.neurips.cc
Graph Neural Networks (GNNs) that are based on the message passing (MP)
paradigm generally exchange information between 1-hop neighbors to build node …

Structure-aware transformer for graph representation learning

D Chen, L O'Bray, K Borgwardt - … Conference on Machine …, 2022 - proceedings.mlr.press
The Transformer architecture has gained growing attention in graph representation learning
recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …

Graph inductive biases in transformers without message passing

L Ma, C Lin, D Lim, A Romero-Soriano… - International …, 2023 - proceedings.mlr.press
Transformers for graph data are increasingly studied and successful in numerous
learning tasks. Graph inductive biases are crucial for Graph Transformers, and previous …

Exphormer: Sparse transformers for graphs

H Shirzad, A Velingker… - International …, 2023 - proceedings.mlr.press
Graph transformers have emerged as a promising architecture for a variety of graph learning
and representation tasks. Despite their successes, though, it remains challenging to scale …

Benchmarking graph neural networks

VP Dwivedi, CK Joshi, AT Luu, T Laurent… - Journal of Machine …, 2023 - jmlr.org
In the last few years, graph neural networks (GNNs) have become the standard toolkit for
analyzing and learning from data on graphs. This emerging field has witnessed an extensive …

A generalization of ViT/MLP-Mixer to graphs

X He, B Hooi, T Laurent, A Perold… - International …, 2023 - proceedings.mlr.press
Graph Neural Networks (GNNs) have shown great potential in the field of graph
representation learning. Standard GNNs define a local message-passing mechanism which …