Self-supervised learning of graph neural networks: A unified review

Y Xie, Z Xu, J Zhang, Z Wang… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
Deep models trained in supervised mode have achieved remarkable success on a variety of
tasks. When labeled samples are limited, self-supervised learning (SSL) is emerging as a …

A survey of graph neural networks in various learning paradigms: methods, applications, and challenges

L Waikhom, R Patgiri - Artificial Intelligence Review, 2023 - Springer
In the last decade, deep learning has reinvigorated the machine learning field. It has solved
many problems in computer vision, speech recognition, natural language processing, and …

GraphMAE: Self-supervised masked graph autoencoders

Z Hou, X Liu, Y Cen, Y Dong, H Yang, C Wang… - Proceedings of the 28th …, 2022 - dl.acm.org
Self-supervised learning (SSL) has been extensively explored in recent years. Particularly,
generative SSL has seen emerging success in natural language processing and other …
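
The masked-autoencoding idea named in this title can be summarized in a few lines: hide the features of a random subset of nodes, encode the partially masked graph with a GNN, and train by reconstructing the hidden features. The sketch below is a minimal illustration of that idea only, assuming a dense two-layer GCN encoder and an MSE objective; GraphMAE's actual loss and decoding scheme differ and should be taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def gcn_norm(adj):
    """Symmetrically normalized dense adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    adj = adj + torch.eye(adj.size(0))
    d_inv_sqrt = adj.sum(dim=1).pow(-0.5)
    return d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

class MaskedFeatureAE(nn.Module):
    """Toy masked graph autoencoder: hide some node features, reconstruct them from the rest."""
    def __init__(self, in_dim, hid_dim, mask_rate=0.5):
        super().__init__()
        self.mask_rate = mask_rate
        self.mask_token = nn.Parameter(torch.zeros(1, in_dim))   # learnable [MASK] vector
        self.enc1 = nn.Linear(in_dim, hid_dim)
        self.enc2 = nn.Linear(hid_dim, hid_dim)
        self.dec = nn.Linear(hid_dim, in_dim)

    def forward(self, x, adj_norm):
        mask = torch.rand(x.size(0)) < self.mask_rate             # nodes whose features are hidden
        x_in = x.clone()
        x_in[mask] = self.mask_token                              # broadcast [MASK] onto masked rows
        h = F.relu(adj_norm @ self.enc1(x_in))                    # GCN-style propagation, layer 1
        h = adj_norm @ self.enc2(h)                               # GCN-style propagation, layer 2
        x_rec = self.dec(h)
        return F.mse_loss(x_rec[mask], x[mask])                   # score only the hidden rows

# toy usage on a random undirected graph
n, d = 100, 16
x = torch.randn(n, d)
adj = (torch.rand(n, n) < 0.05).float()
adj = ((adj + adj.T) > 0).float()
loss = MaskedFeatureAE(d, 32)(x, gcn_norm(adj))
loss.backward()
```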

Graph self-supervised learning: A survey

Y Liu, M Jin, S Pan, C Zhou, Y Zheng… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
Deep learning on graphs has attracted significant interest recently. However, most of the
works have focused on (semi-)supervised learning, resulting in shortcomings including …

GraphMAE2: A decoding-enhanced masked self-supervised graph learner

Z Hou, Y He, Y Cen, X Liu, Y Dong… - Proceedings of the …, 2023 - dl.acm.org
Graph self-supervised learning (SSL), including contrastive and generative approaches,
offers great potential to address the fundamental challenge of label scarcity in real-world …

MOGONET integrates multi-omics data using graph convolutional networks allowing patient classification and biomarker identification

T Wang, W Shao, Z Huang, H Tang, J Zhang… - Nature …, 2021 - nature.com
To fully utilize the advances in omics technologies and achieve a more comprehensive
understanding of human diseases, novel computational methods are required for integrative …

S2GAE: Self-supervised graph autoencoders are generalizable learners with graph masking

Q Tan, N Liu, X Huang, SH Choi, L Li, R Chen… - Proceedings of the …, 2023 - dl.acm.org
Self-supervised learning (SSL) has been demonstrated to be effective in pre-training models
that can be generalized to various downstream tasks. Graph Autoencoder (GAE), an …

Contrastive multi-view representation learning on graphs

K Hassani, AH Khasahmadi - International conference on …, 2020 - proceedings.mlr.press
We introduce a self-supervised approach for learning node and graph level representations
by contrasting structural views of graphs. We show that unlike visual representation learning …
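
The snippet describes learning by contrasting structural views of the same graph. As a rough, hypothetical sketch of that idea, the code below builds two views of one graph, an adjacency-propagation view and a diffusion-style (personalized PageRank) view, encodes both with a shared linear encoder, and applies an InfoNCE loss that treats the same node across views as a positive pair; the paper's actual objective and architecture differ.

```python
import torch
import torch.nn.functional as F

def ppr_diffusion(adj, alpha=0.2):
    """Personalized-PageRank-style diffusion matrix as a second structural view (dense, small graphs)."""
    n = adj.size(0)
    trans = adj / adj.sum(dim=1, keepdim=True).clamp(min=1)       # row-normalized transition matrix
    return alpha * torch.linalg.inv(torch.eye(n) - (1 - alpha) * trans)

def info_nce(z1, z2, tau=0.5):
    """Same node across the two views is the positive pair; every other node is a negative."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.T / tau
    return F.cross_entropy(logits, torch.arange(z1.size(0)))

# one shared encoder applied to two structural views of the same graph
n, d, h = 50, 8, 16
x = torch.randn(n, d)
adj = ((torch.rand(n, n) < 0.1).float() + torch.eye(n)).clamp(max=1)
enc = torch.nn.Linear(d, h)
z_adj = adj @ enc(x)                      # view 1: raw adjacency propagation
z_ppr = ppr_diffusion(adj) @ enc(x)       # view 2: diffusion-based propagation
loss = info_nce(z_adj, z_ppr)
loss.backward()
```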

Multi-view contrastive graph clustering

E Pan, Z Kang - Advances in neural information processing …, 2021 - proceedings.neurips.cc
With the explosive growth of information technology, multi-view graph data have become
increasingly prevalent and valuable. Most existing multi-view clustering techniques either …

Deep graph clustering via dual correlation reduction

Y Liu, W Tu, S Zhou, X Liu, L Song, X Yang… - Proceedings of the AAAI …, 2022 - ojs.aaai.org
Deep graph clustering, which aims to reveal the underlying graph structure and divide the
nodes into different groups, has attracted intensive attention in recent years. However, we …
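
The title's "dual correlation reduction" can be illustrated, loosely, as pushing two cross-view correlation matrices, one over samples and one over features, toward the identity so that embeddings of different nodes (and of different feature dimensions) stay decorrelated and do not collapse to near-identical vectors. The sketch below shows only that loss term under assumed embedding shapes; the paper's full graph-clustering pipeline is not reproduced.

```python
import torch
import torch.nn.functional as F

def cross_correlation(a, b):
    """Cosine-similarity matrix between the rows of a and the rows of b."""
    return F.normalize(a, dim=1) @ F.normalize(b, dim=1).T

def correlation_reduction_loss(z1, z2):
    """Push sample-level and feature-level cross-view correlations toward the identity."""
    n, d = z1.shape
    s_sample = cross_correlation(z1, z2)        # (n, n): node i in view 1 vs node j in view 2
    s_feature = cross_correlation(z1.T, z2.T)   # (d, d): dim p in view 1 vs dim q in view 2
    return F.mse_loss(s_sample, torch.eye(n)) + F.mse_loss(s_feature, torch.eye(d))

# toy usage with two augmented-view embeddings of the same nodes
z_view1 = torch.randn(64, 32, requires_grad=True)
z_view2 = z_view1 + 0.1 * torch.randn(64, 32)
loss = correlation_reduction_loss(z_view1, z_view2)
loss.backward()
```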