Cluster-guided contrastive graph clustering network

X Yang, Y Liu, S Zhou, S Wang, W Tu… - Proceedings of the …, 2023 - ojs.aaai.org
Benefiting from its ability to exploit intrinsic supervisory signals, contrastive learning has recently achieved promising performance in deep graph clustering …

SimTeG: A frustratingly simple approach improves textual graph learning

K Duan, Q Liu, TS Chua, S Yan, WT Ooi, Q Xie… - arXiv preprint arXiv …, 2023 - arxiv.org
Textual graphs (TGs), which are widely prevalent, are graphs whose nodes correspond to text (sentences or documents).
The representation learning of TGs involves two stages: (i) …

EXGC: Bridging efficiency and explainability in graph condensation

J Fang, X Li, Y Sui, Y Gao, G Zhang, K Wang… - Proceedings of the …, 2024 - dl.acm.org
Graph representation learning on vast datasets, like web data, has made significant strides.
However, the associated computational and storage overheads raise concerns. In sight of …

A comprehensive survey on graph summarization with graph neural networks

N Shabani, J Wu, A Beheshti, QZ Sheng… - IEEE Transactions …, 2024 - ieeexplore.ieee.org
As large-scale graphs become more widespread, an increasing number of computational challenges in extracting, processing, and interpreting large graph data are being exposed. It is …

IDEA: A flexible framework of certified unlearning for graph neural networks

Y Dong, B Zhang, Z Lei, N Zou, J Li - Proceedings of the 30th ACM …, 2024 - dl.acm.org
Graph Neural Networks (GNNs) have been increasingly deployed in a plethora of
applications. However, the graph data used for training may contain sensitive personal …

RSC: Accelerate graph neural networks training via randomized sparse computations

Z Liu, C Shengyuan, K Zhou, D Zha… - International …, 2023 - proceedings.mlr.press
Training graph neural networks (GNNs) is extremely time-consuming because sparse graph-based operations are hard to accelerate on commodity hardware. Prior art successfully …

SubMix: Learning to mix graph sampling heuristics

S Abu-El-Haija, JV Dillon, B Fatemi… - Uncertainty in …, 2023 - proceedings.mlr.press
Sampling subgraphs for training Graph Neural Networks (GNNs) is receiving much attention
from the GNN community. While a variety of methods have been proposed, each method …

Old can be gold: Better gradient flow can make vanilla-GCNs great again

A Jaiswal, P Wang, T Chen… - Advances in …, 2022 - proceedings.neurips.cc
Despite the enormous success of Graph Convolutional Networks (GCNs) in modeling graph-
structured data, most of the current GCNs are shallow due to the notoriously challenging …

Linear-Time Graph Neural Networks for Scalable Recommendations

J Zhang, R Xue, W Fan, X Xu, Q Li, J Pei… - Proceedings of the ACM …, 2024 - dl.acm.org
In an era of information explosion, recommender systems are vital tools for delivering personalized recommendations to users. The key to recommender systems is to forecast …

Foundation models for the electric power grid

HF Hamann, B Gjorgiev, T Brunschwiler, LSA Martins… - Joule, 2024 - cell.com
Foundation models (FMs) currently dominate news headlines. They employ advanced deep
learning architectures to extract structural information autonomously from vast datasets …