Transformer for graphs: An overview from architecture perspective

E Min, R Chen, Y Bian, T Xu, K Zhao, W Huang… - arXiv preprint arXiv …, 2022 - arxiv.org
Recently, the Transformer model, which has achieved great success in many artificial
intelligence fields, has demonstrated its strong potential in modeling graph-structured data …

Predicting protein–ligand docking structure with graph neural network

H Jiang, J Wang, W Cong, Y Huang… - Journal of Chemical …, 2022 - ACS Publications
Modern-day drug discovery is extremely expensive and time-consuming. Although
computational approaches help accelerate drug discovery and decrease its cost, existing …

Structure-aware transformer for graph representation learning

D Chen, L O'Bray, K Borgwardt - … Conference on Machine …, 2022 - proceedings.mlr.press
The Transformer architecture has recently gained growing attention in graph representation
learning, as it naturally overcomes several limitations of graph neural networks (GNNs) by …

Accurate prediction of protein structures and interactions using a three-track neural network

M Baek, F DiMaio, I Anishchenko, J Dauparas… - Science, 2021 - science.org
DeepMind presented notably accurate predictions at the recent 14th Critical Assessment of
Structure Prediction (CASP14) conference. We explored network architectures that …

Do transformers really perform badly for graph representation?

C Ying, T Cai, S Luo, S Zheng, G Ke… - Advances in Neural …, 2021 - proceedings.neurips.cc
The Transformer architecture has become a dominant choice in many domains, such as
natural language processing and computer vision. Yet, it has not achieved competitive …

How attentive are graph attention networks?

S Brody, U Alon, E Yahav - arXiv preprint arXiv:2105.14491, 2021 - arxiv.org
Graph Attention Networks (GATs) are one of the most popular GNN architectures and are
considered the state-of-the-art architecture for representation learning with graphs. In …

Equiformer: Equivariant graph attention transformer for 3D atomistic graphs

YL Liao, T Smidt - arXiv preprint arXiv:2206.11990, 2022 - arxiv.org
Despite their widespread success in various domains, Transformer networks have yet to
perform well across datasets in the domain of 3D atomistic graphs such as molecules even …

OGB-LSC: A large-scale challenge for machine learning on graphs

W Hu, M Fey, H Ren, M Nakata, Y Dong… - arXiv preprint arXiv …, 2021 - arxiv.org
Enabling effective and efficient machine learning (ML) over large-scale graph data (e.g.,
graphs with billions of edges) can have a great impact on both industrial and scientific …

Training graph neural networks with 1000 layers

G Li, M Müller, B Ghanem… - … Conference on Machine …, 2021 - proceedings.mlr.press
Deep graph neural networks (GNNs) have achieved excellent results on various tasks on
increasingly large graph datasets with millions of nodes and edges. However, memory …

Natural language is all a graph needs

R Ye, C Zhang, R Wang, S Xu, Y Zhang - arXiv preprint arXiv …, 2023 - yongfeng.me
The emergence of large-scale pre-trained language models, such as ChatGPT, has
revolutionized various research fields in artificial intelligence. Transformer-based large …