A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT

C Zhou, Q Li, C Li, J Yu, Y Liu, G Wang… - International Journal of …, 2024 - Springer
Abstract Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …

A comprehensive survey on deep graph representation learning

W Ju, Z Fang, Y Gu, Z Liu, Q Long, Z Qiao, Y Qin… - Neural Networks, 2024 - Elsevier
Graph representation learning aims to effectively encode high-dimensional sparse graph-
structured data into low-dimensional dense vectors, which is a fundamental task that has …

GraphPrompt: Unifying pre-training and downstream tasks for graph neural networks

Z Liu, X Yu, Y Fang, X Zhang - Proceedings of the ACM web conference …, 2023 - dl.acm.org
Graphs can model complex relationships between objects, enabling a myriad of Web
applications such as online page/article classification and social recommendation. While …

GraphMAE: Self-supervised masked graph autoencoders

Z Hou, X Liu, Y Cen, Y Dong, H Yang, C Wang… - Proceedings of the 28th …, 2022 - dl.acm.org
Self-supervised learning (SSL) has been extensively explored in recent years. Particularly,
generative SSL has seen emerging success in natural language processing and other …

Hypergraph contrastive collaborative filtering

L Xia, C Huang, Y Xu, J Zhao, D Yin… - Proceedings of the 45th …, 2022 - dl.acm.org
Collaborative Filtering (CF) has emerged as a fundamental paradigm for parameterizing
users and items into a latent representation space, with their correlative patterns from …

Mind the gap: Understanding the modality gap in multi-modal contrastive representation learning

VW Liang, Y Zhang, Y Kwon… - Advances in Neural …, 2022 - proceedings.neurips.cc
We present the modality gap, an intriguing geometric phenomenon of the representation
space of multi-modal models. Specifically, we show that different data modalities (e.g., images and …

Self-supervised learning for recommender systems: A survey

J Yu, H Yin, X Xia, T Chen, J Li… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
In recent years, neural architecture-based recommender systems have achieved
tremendous success, but they still fall short of expectation when dealing with highly sparse …

Graph neural networks: foundation, frontiers and applications

L Wu, P Cui, J Pei, L Zhao, X Guo - … of the 28th ACM SIGKDD conference …, 2022 - dl.acm.org
The field of graph neural networks (GNNs) has seen rapid and incredible strides over the
recent years. Graph neural networks, also known as deep learning on graphs, graph …

GPPT: Graph pre-training and prompt tuning to generalize graph neural networks

M Sun, K Zhou, X He, Y Wang, X Wang - Proceedings of the 28th ACM …, 2022 - dl.acm.org
Despite the promising representation learning of graph neural networks (GNNs), the
supervised training of GNNs notoriously requires large amounts of labeled data from each …

Data augmentation for deep graph learning: A survey

K Ding, Z Xu, H Tong, H Liu - ACM SIGKDD Explorations Newsletter, 2022 - dl.acm.org
Graph neural networks, a powerful deep learning tool to model graph-structured data, have
demonstrated remarkable performance on numerous graph learning tasks. To address the …