A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT

C Zhou, Q Li, C Li, J Yu, Y Liu, G Wang… - International Journal of …, 2024 - Springer
Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …

A comprehensive survey on deep graph representation learning

W Ju, Z Fang, Y Gu, Z Liu, Q Long, Z Qiao, Y Qin… - Neural Networks, 2024 - Elsevier
Graph representation learning aims to effectively encode high-dimensional sparse graph-
structured data into low-dimensional dense vectors, which is a fundamental task that has …
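
As a rough illustration of the task this abstract describes (encoding sparse graph-structured data into low-dimensional dense vectors), a single mean-aggregation message-passing step suffices; the function name, weights, and toy graph below are invented for the example and stand in for no particular method from the survey:

```python
import numpy as np

def mean_aggregate(adj, feats, weight):
    """One message-passing step: every node averages its neighbors'
    features (plus its own via a self-loop), then a linear map projects
    the result down to a small dense embedding."""
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                       # add self-loops
    h = (a_hat / a_hat.sum(axis=1, keepdims=True)) @ feats
    return np.tanh(h @ weight)

rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 1],
                [0, 1, 0, 1],
                [0, 1, 1, 0]], dtype=float)       # toy 4-node graph
feats = rng.random((4, 16))                       # "high-dimensional" inputs
print(mean_aggregate(adj, feats, rng.normal(size=(16, 2))))  # 4 dense 2-d vectors
```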

Impact of code language models on automated program repair

N Jiang, K Liu, T Lutellier, L Tan - 2023 IEEE/ACM 45th …, 2023 - ieeexplore.ieee.org
Automated program repair (APR) aims to help developers improve software reliability by
generating patches for buggy programs. Although many code language models (CLM) are …

Talk like a graph: Encoding graphs for large language models

B Fatemi, J Halcrow, B Perozzi - arXiv preprint arXiv:2310.04560, 2023 - arxiv.org
Graphs are a powerful tool for representing and analyzing complex relationships in real-
world applications such as social networks, recommender systems, and computational …
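
The paper studies textual encodings of graphs for LLM prompts; as one plausible template (a generic illustration, not the paper's own encoding functions), an edge list can be serialized into plain sentences:

```python
def graph_to_text(edges, directed=False):
    """Serialize an edge list into plain sentences suitable for placing
    in an LLM prompt. One of many possible encodings; the phrasing here
    is a hypothetical template."""
    nodes = sorted({v for e in edges for v in e})
    verb = "points to" if directed else "is connected to"
    lines = [f"The graph has {len(nodes)} nodes: {', '.join(map(str, nodes))}."]
    lines += [f"Node {u} {verb} node {v}." for u, v in edges]
    return " ".join(lines)

print(graph_to_text([(0, 1), (1, 2), (2, 0)]))
# The graph has 3 nodes: 0, 1, 2. Node 0 is connected to node 1. ...
```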

SGFormer: Simplifying and empowering transformers for large-graph representations

Q Wu, W Zhao, C Yang, H Zhang… - Advances in …, 2023 - proceedings.neurips.cc
Learning representations on large-sized graphs is a long-standing challenge due to the
interdependent nature of massive data points. Transformers, as an emerging class of …
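
The obstacle the snippet names is that full attention costs O(N^2) over N nodes. One standard way around this is kernelized linear attention, sketched below; this illustrates only the general linear-complexity idea and is not SGFormer's actual layer (the feature map and names are assumptions for the example):

```python
import numpy as np

def linear_attention(q, k, v, eps=1e-6):
    """Kernelized attention in O(N d^2): the (d, d) summary k.T @ v is
    computed once, so the N x N score matrix is never materialized."""
    q = np.maximum(q, 0) + eps        # simple positive feature map
    k = np.maximum(k, 0) + eps
    kv = k.T @ v                      # (d, d) summary over all nodes
    z = q @ k.sum(axis=0)             # per-node normalizer, shape (N,)
    return (q @ kv) / z[:, None]

rng = np.random.default_rng(0)
n, d = 10_000, 32                     # 10k nodes, no 10k x 10k matrix needed
q, k, v = rng.normal(size=(3, n, d))
print(linear_attention(q, k, v).shape)  # (10000, 32)
```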

Exphormer: Sparse transformers for graphs

H Shirzad, A Velingker… - International …, 2023 - proceedings.mlr.press
Graph transformers have emerged as a promising architecture for a variety of graph learning
and representation tasks. Despite their successes, though, it remains challenging to scale …
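
Sparse graph transformers attack the same scaling problem by restricting attention to a sparse interaction pattern rather than all node pairs. A minimal, deliberately naive illustration is attention masked to graph edges; note this is not Exphormer's expander-based pattern, and for readability the sketch materializes a dense mask that a genuinely sparse implementation would avoid:

```python
import numpy as np

def edge_masked_attention(x, edges, wq, wk, wv):
    """Softmax attention where each node attends only to itself and its
    graph neighbors. NOTE: builds a dense n x n mask for clarity; a real
    sparse implementation would score edge pairs only."""
    n = x.shape[0]
    mask = np.full((n, n), -np.inf)
    np.fill_diagonal(mask, 0.0)                   # allow self-attention
    for u, v in edges:                            # allow neighbors (undirected)
        mask[u, v] = mask[v, u] = 0.0
    q, k, v_ = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(q.shape[1]) + mask
    scores -= scores.max(axis=1, keepdims=True)   # numerically stable softmax
    att = np.exp(scores)
    att /= att.sum(axis=1, keepdims=True)
    return att @ v_

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
wq, wk, wv = rng.normal(size=(3, 8, 8))
print(edge_masked_attention(x, [(0, 1), (1, 2), (3, 4)], wq, wk, wv).shape)
```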

Structure-aware transformer for graph representation learning

D Chen, L O'Bray, K Borgwardt - … conference on machine …, 2022 - proceedings.mlr.press
The Transformer architecture has gained growing attention in graph representation learning
recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …

Do transformers really perform badly for graph representation?

C Ying, T Cai, S Luo, S Zheng, G Ke… - Advances in neural …, 2021 - proceedings.neurips.cc
The Transformer architecture has become a dominant choice in many domains, such as
natural language processing and computer vision. Yet, it has not achieved competitive …

Data augmentation for deep graph learning: A survey

K Ding, Z Xu, H Tong, H Liu - ACM SIGKDD Explorations Newsletter, 2022 - dl.acm.org
Graph neural networks, a powerful deep learning tool to model graph-structured data, have
demonstrated remarkable performance on numerous graph learning tasks. To address the …
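
Two augmentations that recur in this area are structural perturbation (edge dropping) and attribute perturbation (feature masking); the helpers below are generic illustrations of those two families, not a specific method from the survey:

```python
import numpy as np

def drop_edges(edges, p, rng):
    """Randomly remove a fraction p of edges: a common structural
    augmentation for contrastive or robustness-oriented graph training."""
    keep = rng.random(len(edges)) >= p
    return [e for e, k in zip(edges, keep) if k]

def mask_features(feats, p, rng):
    """Zero out individual feature entries with probability p
    (attribute-level augmentation)."""
    return feats * (rng.random(feats.shape) >= p)

rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
feats = rng.random((4, 8))
print(drop_edges(edges, 0.4, rng))          # structurally perturbed edge list
print(mask_features(feats, 0.3, rng).round(2))
```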

Representing long-range context for graph neural networks with global attention

Z Wu, P Jain, M Wright, A Mirhoseini… - Advances in neural …, 2021 - proceedings.neurips.cc
Graph neural networks are powerful architectures for structured datasets. However, current
methods struggle to represent long-range dependencies. Scaling the depth or width of …
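
The hybrid this snippet hints at can be sketched as a local aggregation step followed by full softmax attention over all node embeddings, so distant nodes exchange information in a single hop; this composition is illustrative only, not the paper's exact architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_then_global(adj, feats, w_local, wq, wk, wv):
    """Local mean aggregation captures neighborhood structure; global
    attention over the resulting node embeddings then models long-range
    dependencies without stacking many message-passing layers."""
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                       # self-loops
    h = np.tanh((a_hat / a_hat.sum(axis=1, keepdims=True)) @ feats @ w_local)
    q, k, v = h @ wq, h @ wk, h @ wv              # global attention step
    return softmax(q @ k.T / np.sqrt(q.shape[1])) @ v

rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)       # 4-node path graph
feats = rng.random((4, 8))
out = local_then_global(adj, feats, rng.normal(size=(8, 8)),
                        *rng.normal(size=(3, 8, 8)))
print(out.shape)  # (4, 8): node 0 can see node 3 in one global step
```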