Transformer for graphs: An overview from architecture perspective
Recently, Transformer model, which has achieved great success in many artificial
intelligence fields, has demonstrated its great potential in modeling graph-structured data …
Predicting protein–ligand docking structure with graph neural network
Modern day drug discovery is extremely expensive and time consuming. Although
computational approaches help accelerate and decrease the cost of drug discovery, existing …
Structure-aware transformer for graph representation learning
The Transformer architecture has gained growing attention in graph representation learning
recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …
Accurate prediction of protein structures and interactions using a three-track neural network
DeepMind presented notably accurate predictions at the recent 14th Critical Assessment of
Structure Prediction (CASP14) conference. We explored network architectures that …
Do transformers really perform badly for graph representation?
The Transformer architecture has become a dominant choice in many domains, such as
natural language processing and computer vision. Yet, it has not achieved competitive …
How attentive are graph attention networks?
Graph Attention Networks (GATs) are one of the most popular GNN architectures and are
considered as the state-of-the-art architecture for representation learning with graphs. In …
Equiformer: Equivariant graph attention transformer for 3d atomistic graphs
YL Liao, T Smidt - arXiv preprint arXiv:2206.11990, 2022 - arxiv.org
Despite their widespread success in various domains, Transformer networks have yet to
perform well across datasets in the domain of 3D atomistic graphs such as molecules even …
Ogb-lsc: A large-scale challenge for machine learning on graphs
Enabling effective and efficient machine learning (ML) over large-scale graph data (e.g.,
graphs with billions of edges) can have a great impact on both industrial and scientific …
Training graph neural networks with 1000 layers
Deep graph neural networks (GNNs) have achieved excellent results on various tasks on
increasingly large graph datasets with millions of nodes and edges. However, memory …
Natural language is all a graph needs
The emergence of large-scale pre-trained language models, such as ChatGPT, has
revolutionized various research fields in artificial intelligence. Transformer-based large …