Transformer for graphs: An overview from architecture perspective
Recently, Transformer model, which has achieved great success in many artificial
intelligence fields, has demonstrated its great potential in modeling graph-structured data …
Accelerating the integration of ChatGPT and other large‐scale AI models into biomedical research and healthcare
DQ Wang, LY Feng, JG Ye, JG Zou… - MedComm–Future …, 2023 - Wiley Online Library
Large‐scale artificial intelligence (AI) models such as ChatGPT have the potential to
improve performance on many benchmarks and real‐world tasks. However, it is difficult to …
Recipe for a general, powerful, scalable graph transformer
We propose a recipe on how to build a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …
On over-squashing in message passing neural networks: The impact of width, depth, and topology
Message Passing Neural Networks (MPNNs) are instances of Graph Neural
Networks that leverage the graph to send messages over the edges. This inductive bias …
Long range graph benchmark
Graph Neural Networks (GNNs) that are based on the message passing (MP)
paradigm generally exchange information between 1-hop neighbors to build node …
Structure-aware transformer for graph representation learning
The Transformer architecture has gained growing attention in graph representation learning
recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …
Graph inductive biases in transformers without message passing
Transformers for graph data are increasingly widely studied and successful in numerous
learning tasks. Graph inductive biases are crucial for Graph Transformers, and previous …
Exphormer: Sparse transformers for graphs
Graph transformers have emerged as a promising architecture for a variety of graph learning
and representation tasks. Despite their successes, though, it remains challenging to scale …
Benchmarking graph neural networks
In the last few years, graph neural networks (GNNs) have become the standard toolkit for
analyzing and learning from data on graphs. This emerging field has witnessed an extensive …
A generalization of ViT/MLP-Mixer to graphs
Graph Neural Networks (GNNs) have shown great potential in the field of graph
representation learning. Standard GNNs define a local message-passing mechanism which …