A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT
Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …
Graph prompt learning: A comprehensive survey and beyond
Artificial General Intelligence (AGI) has revolutionized numerous fields, yet its integration
with graph data, a cornerstone in our interconnected world, remains nascent. This paper …
Scaling language-image pre-training via masking
We present Fast Language-Image Pre-training (FLIP), a simple and more efficient
method for training CLIP. Our method randomly masks out and removes a large portion of …
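The core mechanism is easy to sketch: drop a random subset of image patches before encoding, so the image encoder only spends compute on the visible ones. A minimal sketch, assuming a ViT-style patch tensor; all names and shapes here are illustrative, not the paper's code.

```python
import torch

def mask_patches(patches: torch.Tensor, mask_ratio: float = 0.5) -> torch.Tensor:
    """Keep a random subset of patches per sample; the rest are removed
    entirely, so encoder cost scales with the visible fraction.
    patches: (batch, num_patches, dim)
    """
    b, n, d = patches.shape
    n_keep = max(1, int(n * (1.0 - mask_ratio)))
    noise = torch.rand(b, n, device=patches.device)       # random per-sample order
    keep_idx = noise.argsort(dim=1)[:, :n_keep]           # (b, n_keep)
    keep_idx = keep_idx.unsqueeze(-1).expand(-1, -1, d)   # (b, n_keep, d)
    return patches.gather(dim=1, index=keep_idx)

patches = torch.randn(8, 196, 768)     # e.g. 14x14 patches from a 224px image
visible = mask_patches(patches, 0.5)   # (8, 98, 768): roughly half the FLOPs
```

The visible patches are then encoded and trained with the usual CLIP contrastive objective, trading a small accuracy hit per step for many more steps per unit of compute.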
GraphGPT: Graph instruction tuning for large language models
Graph Neural Networks (GNNs) have evolved to understand graph structures through
recursive exchanges and aggregations among nodes. To enhance robustness, self …
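To make the input side of graph instruction tuning concrete, here is a simplified stand-in that flattens a node's neighborhood into a text instruction for an LLM. The template is hypothetical; GraphGPT itself injects learned graph tokens through a projector rather than plain text.

```python
def graph_instruction(node: str, neighbors: dict[str, list[str]], task: str) -> str:
    """Serialize a small subgraph plus a task description into one prompt."""
    edges = "; ".join(f"{u} -> {', '.join(vs)}" for u, vs in neighbors.items())
    return (
        "You are given a citation graph.\n"
        f"Edges: {edges}\n"
        f"Question: {task} Focus on node {node}."
    )

prompt = graph_instruction(
    node="paper_42",
    neighbors={"paper_42": ["paper_7", "paper_19"], "paper_7": ["paper_3"]},
    task="Predict the research category of the focus node.",
)
print(prompt)
```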
Data augmentation for deep graph learning: A survey
Graph neural networks, a powerful deep learning tool to model graph-structured data, have
demonstrated remarkable performance on numerous graph learning tasks. To address the …
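Two augmentations that surveys in this area routinely cover are random edge dropping and node-feature masking; a minimal sketch of both, with illustrative names, is below.

```python
import torch

def drop_edges(edge_index: torch.Tensor, p: float = 0.2) -> torch.Tensor:
    """edge_index: (2, num_edges). Remove each edge independently with prob p."""
    keep = torch.rand(edge_index.size(1)) >= p
    return edge_index[:, keep]

def mask_features(x: torch.Tensor, p: float = 0.2) -> torch.Tensor:
    """x: (num_nodes, dim). Zero out each feature column independently with prob p."""
    mask = (torch.rand(x.size(1), device=x.device) >= p).float()
    return x * mask  # broadcasts across all nodes

edge_index = torch.tensor([[0, 1, 2, 2], [1, 2, 0, 3]])
x = torch.randn(4, 16)
aug_edges, aug_x = drop_edges(edge_index), mask_features(x)
```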
S2GAE: Self-supervised graph autoencoders are generalizable learners with graph masking
Self-supervised learning (SSL) has been demonstrated to be effective in pre-training models
that can be generalized to various downstream tasks. Graph Autoencoder (GAE), an …
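The masked-edge pretext task behind this family of GAEs can be sketched as: hide a fraction of edges, encode the remaining graph with a GNN, and score the held-out edges against random negatives. The dot-product decoder below is a simplification of S2GAE's actual decoder.

```python
import torch
import torch.nn.functional as F

def split_edges(edge_index: torch.Tensor, mask_ratio: float = 0.7):
    """Partition edges into a visible set (fed to the encoder) and a masked set."""
    perm = torch.randperm(edge_index.size(1))
    n_mask = int(edge_index.size(1) * mask_ratio)
    return edge_index[:, perm[n_mask:]], edge_index[:, perm[:n_mask]]

def edge_recon_loss(z: torch.Tensor, masked: torch.Tensor) -> torch.Tensor:
    """z: (num_nodes, dim) embeddings from a GNN run on the visible edges only."""
    src, dst = masked
    pos = (z[src] * z[dst]).sum(-1)                    # scores for masked edges
    neg_dst = torch.randint(0, z.size(0), dst.shape)   # random negative endpoints
    neg = (z[src] * z[neg_dst]).sum(-1)
    labels = torch.cat([torch.ones_like(pos), torch.zeros_like(neg)])
    return F.binary_cross_entropy_with_logits(torch.cat([pos, neg]), labels)
```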
SkeletonMAE: Graph-based masked autoencoder for skeleton sequence pre-training
Skeleton sequence representation learning has shown great advantages for action
recognition due to its promising ability to model human joints and topology. However, the …
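The pretext task can be sketched in a few lines: hide a subset of (frame, joint) slots in a skeleton sequence and regress their coordinates. Shapes, the masking scheme, and the plain MSE loss below are illustrative assumptions, not SkeletonMAE's exact design.

```python
import torch

seq = torch.randn(2, 64, 25, 3)            # (batch, frames, joints, xyz)
mask = torch.rand(2, 64, 25, 1) < 0.4      # hide ~40% of (frame, joint) slots
visible = seq.masked_fill(mask, 0.0)       # masked coordinates zeroed out

# pred = model(visible)  # any encoder-decoder over the skeleton graph
pred = torch.randn_like(seq)               # placeholder for the model output
loss = ((pred - seq) ** 2)[mask.expand_as(seq)].mean()  # MSE on masked joints only
```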
All in one: Multi-task prompting for graph neural networks
Recently," pre-training and fine-tuning''has been adopted as a standard workflow for many
graph tasks since it can take general graph knowledge to relieve the lack of graph …
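The prompting alternative to fine-tuning can be sketched as follows: keep the pre-trained GNN frozen and learn only a small set of prompt tokens that are mixed into the input node features, steering one backbone toward different downstream tasks. The additive form below is a simplification of the paper's prompt-graph design.

```python
import torch
import torch.nn as nn

class GraphPrompt(nn.Module):
    def __init__(self, feat_dim: int, num_tokens: int = 4):
        super().__init__()
        self.tokens = nn.Parameter(torch.randn(num_tokens, feat_dim) * 0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each node attends softly over the prompt tokens and adds the result.
        attn = torch.softmax(x @ self.tokens.T, dim=-1)  # (num_nodes, num_tokens)
        return x + attn @ self.tokens

x = torch.randn(100, 32)       # node features
prompted = GraphPrompt(32)(x)  # same shape; fed to the frozen pre-trained GNN
```

Because only the prompt parameters are trained, a single pre-trained backbone can serve many tasks at a fraction of the tuning cost.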
GraphMAE2: A decoding-enhanced masked self-supervised graph learner
Graph self-supervised learning (SSL), including contrastive and generative approaches,
offers great potential to address the fundamental challenge of label scarcity in real-world …
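The generative SSL recipe behind GraphMAE-style models, in sketch form: replace masked node features with a learned [MASK] vector, encode the graph, and reconstruct the original features. GraphMAE2's re-masking and multi-view decoding are omitted here for brevity; names are illustrative.

```python
import torch
import torch.nn as nn

num_nodes, dim = 100, 64
x = torch.randn(num_nodes, dim)
mask_token = nn.Parameter(torch.zeros(dim))

mask = torch.rand(num_nodes) < 0.5   # mask half the nodes
x_in = x.clone()
x_in[mask] = mask_token              # hide their features behind [MASK]

# h = encoder(x_in, edge_index); x_rec = decoder(h)   # any GNN encoder-decoder
x_rec = torch.randn_like(x)          # placeholder for the decoder output

# Scaled cosine error on the masked nodes, as in the GraphMAE line of work.
cos = torch.cosine_similarity(x_rec[mask], x[mask], dim=-1)
loss = ((1 - cos) ** 2).mean()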
Representation learning with large language models for recommendation
Recommender systems have seen significant advancements with the influence of deep
learning and graph neural networks, particularly in capturing complex user-item …
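One way this line of work combines the two signals can be sketched as: embed item descriptions with a language model, project them into the recommender's embedding space, and align the text and collaborative views with a contrastive objective. The encoder stub and the InfoNCE-style alignment below are illustrative assumptions; `llm_embed` is a hypothetical stand-in for a real text-embedding model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def llm_embed(texts: list[str]) -> torch.Tensor:
    # Hypothetical stand-in for a real LLM embedding call.
    return torch.randn(len(texts), 1536)

texts = ["A noise-cancelling headphone ...", "A 55-inch OLED TV ..."]
text_emb = llm_embed(texts)                   # (num_items, 1536) semantic view
cf_emb = nn.Embedding(len(texts), 64).weight  # collaborative view (GNN/MF IDs)

proj = nn.Linear(1536, 64)                    # map text space -> CF space
t = F.normalize(proj(text_emb), dim=-1)
c = F.normalize(cf_emb, dim=-1)
logits = t @ c.T / 0.1                        # InfoNCE-style alignment
loss = F.cross_entropy(logits, torch.arange(len(texts)))
```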