The evolution of distributed systems for graph neural networks and their origin in graph processing and deep learning: A survey
Graph neural networks (GNNs) are an emerging research field. This specialized deep
neural network architecture is capable of processing graph structured data and bridges the …
Scientific large language models: A survey on biological & chemical domains
Large Language Models (LLMs) have emerged as a transformative power in enhancing
natural language comprehension, representing a significant stride toward artificial general …
Recipe for a general, powerful, scalable graph transformer
We propose a recipe on how to build a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …
Temporal graph benchmark for machine learning on temporal graphs
Abstract We present the Temporal Graph Benchmark (TGB), a collection of challenging and
diverse benchmark datasets for realistic, reproducible, and robust evaluation of machine …
Long range graph benchmark
Abstract Graph Neural Networks (GNNs) that are based on the message passing (MP)
paradigm generally exchange information between 1-hop neighbors to build node …
Do transformers really perform badly for graph representation?
The Transformer architecture has become a dominant choice in many domains, such as
natural language processing and computer vision. Yet, it has not achieved competitive …
Graph inductive biases in transformers without message passing
Transformers for graph data are increasingly widely studied and successful in numerous
learning tasks. Graph inductive biases are crucial for Graph Transformers, and previous …
DRew: Dynamically rewired message passing with delay
Message passing neural networks (MPNNs) have been shown to suffer from the
phenomenon of over-squashing that causes poor performance for tasks relying on long …
Equiformer: Equivariant graph attention transformer for 3d atomistic graphs
YL Liao, T Smidt - arXiv preprint arXiv:2206.11990, 2022 - arxiv.org
Despite their widespread success in various domains, Transformer networks have yet to
perform well across datasets in the domain of 3D atomistic graphs such as molecules even …
Neural Bellman-Ford networks: A general graph neural network framework for link prediction
Link prediction is a fundamental task on graphs. Inspired by traditional path-based
methods, in this paper we propose a general and flexible representation learning framework …