Data augmentation: A comprehensive survey of modern approaches

A Mumuni, F Mumuni - Array, 2022 - Elsevier
To ensure good performance, modern machine learning models typically require large
amounts of quality annotated data. Meanwhile, the data collection and annotation processes …

Knowledge distillation on graphs: A survey

Y Tian, S Pei, X Zhang, C Zhang, N Chawla - ACM Computing Surveys, 2023 - dl.acm.org
Graph Neural Networks (GNNs) have received significant attention for demonstrating their
capability to handle graph data. However, they are difficult to deploy in resource …

A comprehensive study on large-scale graph training: Benchmarking and rethinking

K Duan, Z Liu, P Wang, W Zheng… - Advances in …, 2022 - proceedings.neurips.cc
Large-scale graph training is a notoriously challenging problem for graph neural networks
(GNNs). Due to the evolving nature of graph structures during the training process, vanilla …

GRAPHPATCHER: mitigating degree bias for graph neural networks via test-time augmentation

M Ju, T Zhao, W Yu, N Shah… - Advances in Neural …, 2024 - proceedings.neurips.cc
Recent studies have shown that graph neural networks (GNNs) exhibit strong biases
towards the node degree: they usually perform satisfactorily on high-degree nodes with rich …

Linkless link prediction via relational distillation

Z Guo, W Shiao, S Zhang, Y Liu… - International …, 2023 - proceedings.mlr.press
Graph Neural Networks (GNNs) have shown exceptional performance in the task of
link prediction. Despite their effectiveness, the high latency brought by non-trivial …

Learning mlps on graphs: A unified view of effectiveness, robustness, and efficiency

Y Tian, C Zhang, Z Guo, X Zhang… - … Conference on Learning …, 2022 - openreview.net
While Graph Neural Networks (GNNs) have demonstrated their efficacy in dealing with non-
Euclidean structural data, they are difficult to deploy in real applications due to the …

Bag of tricks for training deeper graph neural networks: A comprehensive benchmark study

T Chen, K Zhou, K Duan, W Zheng… - … on Pattern Analysis …, 2022 - ieeexplore.ieee.org
Training deep graph neural networks (GNNs) is notoriously hard. Besides the standard
pitfalls of training deep architectures, such as vanishing gradients and overfitting, it also …

Graph collaborative signals denoising and augmentation for recommendation

Z Fan, K Xu, Z Dong, H Peng, J Zhang… - Proceedings of the 46th …, 2023 - dl.acm.org
Graph collaborative filtering (GCF) is a popular technique for capturing high-order
collaborative signals in recommendation systems. However, GCF's bipartite adjacency …

Multi-task item-attribute graph pre-training for strict cold-start item recommendation

Y Cao, L Yang, C Wang, Z Liu, H Peng, C You… - Proceedings of the 17th …, 2023 - dl.acm.org
Recommendation systems suffer in the strict cold-start (SCS) scenario, where the user-item
interactions are entirely unavailable. The well-established, dominating identity (ID)-based …

From trainable negative depth to edge heterophily in graphs

Y Yan, Y Chen, H Chen, M Xu, M Das… - Advances in …, 2024 - proceedings.neurips.cc
Finding the proper depth $d$ of a graph convolutional network (GCN) that provides strong
representation ability has drawn significant attention, yet nonetheless largely remains an …