A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT

C Zhou, Q Li, C Li, J Yu, Y Liu, G Wang… - International Journal of …, 2024 - Springer
Abstract Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …

Graph prompt learning: A comprehensive survey and beyond

X Sun, J Zhang, X Wu, H Cheng, Y Xiong… - arXiv preprint arXiv …, 2023 - arxiv.org
Artificial General Intelligence (AGI) has revolutionized numerous fields, yet its integration
with graph data, a cornerstone in our interconnected world, remains nascent. This paper …

Scaling language-image pre-training via masking

Y Li, H Fan, R Hu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Abstract We present Fast Language-Image Pre-training (FLIP), a simple and more efficient
method for training CLIP. Our method randomly masks out and removes a large portion of …
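
The masking FLIP describes amounts to dropping a random subset of image patch tokens before the vision encoder, so each step processes only a fraction of every image. A minimal PyTorch sketch of that step, assuming ViT-style (batch, patches, dim) inputs; the function name and default ratio are illustrative, not taken from the paper:

    import torch

    def random_patch_mask(patch_tokens: torch.Tensor, mask_ratio: float = 0.5) -> torch.Tensor:
        """Randomly drop a fraction of image patch tokens, FLIP-style.

        patch_tokens: (batch, num_patches, dim) patch embeddings.
        Returns only the kept tokens, shape (batch, num_kept, dim).
        """
        b, n, d = patch_tokens.shape
        num_keep = max(1, int(n * (1.0 - mask_ratio)))
        # Independent random permutation per example; keep the first num_keep.
        noise = torch.rand(b, n, device=patch_tokens.device)
        keep_idx = noise.argsort(dim=1)[:, :num_keep]
        return patch_tokens.gather(1, keep_idx.unsqueeze(-1).expand(-1, -1, d))

Because masked tokens are removed rather than replaced by a mask token, the encoder's sequence length, and hence its compute, shrinks roughly in proportion to the mask ratio, which is where the claimed efficiency comes from.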

GraphGPT: Graph instruction tuning for large language models

J Tang, Y Yang, W Wei, L Shi, L Su, S Cheng… - Proceedings of the 47th …, 2024 - dl.acm.org
Graph Neural Networks (GNNs) have evolved to understand graph structures through
recursive exchanges and aggregations among nodes. To enhance robustness, self …
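
The "recursive exchanges and aggregations" mentioned here is standard message passing: each layer replaces a node's features with an aggregate over its neighbors. A minimal sketch of one mean-aggregation step, assuming an edge list in (2, num_edges) source-to-destination form; this illustrates the mechanism only and is not GraphGPT's code:

    import torch

    def mean_aggregate(x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        """x: (num_nodes, dim) node features; edge_index: (2, num_edges).
        Each node's new feature is the mean of its in-neighbors' features."""
        src, dst = edge_index
        out = torch.zeros_like(x)
        out.index_add_(0, dst, x[src])  # sum incoming messages per node
        deg = torch.zeros(x.size(0), dtype=x.dtype, device=x.device)
        deg.index_add_(0, dst, torch.ones_like(dst, dtype=x.dtype))
        return out / deg.clamp(min=1).unsqueeze(-1)  # isolated nodes stay zero

Stacking such layers makes the exchange recursive: after k layers, each node's representation reflects its k-hop neighborhood.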

Data augmentation for deep graph learning: A survey

K Ding, Z Xu, H Tong, H Liu - ACM SIGKDD Explorations Newsletter, 2022 - dl.acm.org
Graph neural networks, a powerful deep learning tool to model graph-structured data, have
demonstrated remarkable performance on numerous graph learning tasks. To address the …

S2GAE: Self-supervised graph autoencoders are generalizable learners with graph masking

Q Tan, N Liu, X Huang, SH Choi, L Li, R Chen… - Proceedings of the …, 2023 - dl.acm.org
Self-supervised learning (SSL) has been demonstrated to be effective in pre-training models
that can be generalized to various downstream tasks. Graph Autoencoder (GAE), an …
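
As the title suggests, the self-supervised signal comes from graph masking: hold out a fraction of edges, encode the remaining graph, and train the model to reconstruct the held-out edges. A hedged sketch of that objective, assuming node embeddings z from any GNN encoder and a dot-product decoder with uniformly sampled negatives (illustrative assumptions, not the paper's exact design):

    import torch
    import torch.nn.functional as F

    def mask_edges(edge_index: torch.Tensor, mask_ratio: float = 0.3):
        """Split (2, num_edges) into visible edges (encoder input) and
        masked edges (reconstruction targets)."""
        perm = torch.randperm(edge_index.size(1), device=edge_index.device)
        cut = int(edge_index.size(1) * mask_ratio)
        return edge_index[:, perm[cut:]], edge_index[:, perm[:cut]]

    def edge_recon_loss(z: torch.Tensor, masked_edges: torch.Tensor) -> torch.Tensor:
        """Binary cross-entropy on dot-product scores for held-out edges."""
        src, dst = masked_edges
        pos = (z[src] * z[dst]).sum(-1)
        neg_pair = torch.randint(0, z.size(0), masked_edges.shape,
                                 device=z.device)  # random negative pairs
        neg = (z[neg_pair[0]] * z[neg_pair[1]]).sum(-1)
        logits = torch.cat([pos, neg])
        labels = torch.cat([torch.ones_like(pos), torch.zeros_like(neg)])
        return F.binary_cross_entropy_with_logits(logits, labels)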

SkeletonMAE: Graph-based masked autoencoder for skeleton sequence pre-training

H Yan, Y Liu, Y Wei, Z Li, G Li… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Skeleton sequence representation learning has shown great advantages for action
recognition due to its promising ability to model human joints and topology. However, the …

All in one: Multi-task prompting for graph neural networks

X Sun, H Cheng, J Li, B Liu, J Guan - Proceedings of the 29th ACM …, 2023 - dl.acm.org
Recently," pre-training and fine-tuning''has been adopted as a standard workflow for many
graph tasks since it can take general graph knowledge to relieve the lack of graph …

GraphMAE2: A decoding-enhanced masked self-supervised graph learner

Z Hou, Y He, Y Cen, X Liu, Y Dong… - Proceedings of the …, 2023 - dl.acm.org
Graph self-supervised learning (SSL), including contrastive and generative approaches,
offers great potential to address the fundamental challenge of label scarcity in real-world …

Representation learning with large language models for recommendation

X Ren, W Wei, L Xia, L Su, S Cheng, J Wang… - Proceedings of the …, 2024 - dl.acm.org
Recommender systems have seen significant advancements with the influence of deep
learning and graph neural networks, particularly in capturing complex user-item …