Language is all a graph needs

R Ye, C Zhang, R Wang, S Xu, Y Zhang - arXiv preprint arXiv:2308.07134, 2023 - arxiv.org
The emergence of large-scale pre-trained language models has revolutionized various AI
research domains. Transformer-based Large Language Models (LLMs) have gradually …

NAGphormer: A tokenized graph transformer for node classification in large graphs

J Chen, K Gao, G Li, K He - arXiv preprint arXiv:2206.04910, 2022 - arxiv.org
The graph Transformer emerges as a new architecture and has shown superior
performance on various graph mining tasks. In this work, we observe that existing graph …

Pre-training via denoising for molecular property prediction

S Zaidi, M Schaarschmidt, J Martens, H Kim… - arXiv preprint arXiv …, 2022 - arxiv.org
Many important problems involving molecular property prediction from 3D structures have
limited data, posing a generalization challenge for neural networks. In this paper, we …

On the connection between MPNN and graph transformer

C Cai, TS Hy, R Yu, Y Wang - International conference on …, 2023 - proceedings.mlr.press
Graph Transformer (GT) recently has emerged as a new paradigm of graph learning
algorithms, outperforming the previously popular Message Passing Neural Network (MPNN) …

Act as you wish: Fine-grained control of motion diffusion model with hierarchical semantic graphs

P **, Y Wu, Y Fan, Z Sun, W Yang… - Advances in Neural …, 2023 - proceedings.neurips.cc
Most text-driven human motion generation methods employ sequential modeling
approaches, e.g., transformers, to extract sentence-level text representations automatically and …

GraphNorm: A principled approach to accelerating graph neural network training

T Cai, S Luo, K Xu, D He, T Liu… - … Conference on Machine …, 2021 - proceedings.mlr.press
Normalization is known to help the optimization of deep neural networks. Curiously, different
architectures require specialized normalization methods. In this paper, we study what …

Simple GNN regularisation for 3D molecular property prediction & beyond

J Godwin, M Schaarschmidt, A Gaunt… - arXiv preprint arXiv …, 2021 - arxiv.org
In this paper we show that simple noise regularisation can be an effective way to address
GNN oversmoothing. First we argue that regularisers addressing oversmoothing should both …

On provable benefits of depth in training graph convolutional networks

W Cong, M Ramezani… - Advances in Neural …, 2021 - proceedings.neurips.cc
Graph Convolutional Networks (GCNs) are known to suffer from performance
degradation as the number of layers increases, which is usually attributed to over …

Quantifying the knowledge in GNNs for reliable distillation into MLPs

L Wu, H Lin, Y Huang, SZ Li - International Conference on …, 2023 - proceedings.mlr.press
To bridge the gaps between topology-aware Graph Neural Networks (GNNs) and inference-efficient Multi-Layer Perceptrons (MLPs), GLNN proposes to distill knowledge from a well …

Multi-label text classification based on semantic-sensitive graph convolutional network

D Zeng, E Zha, J Kuang, Y Shen - Knowledge-Based Systems, 2024 - Elsevier
Multi-Label Text Classification (MLTC) is an important but challenging task in the
field of natural language processing. In this paper, we propose a novel method, Semantic …