Data augmentation: A comprehensive survey of modern approaches
A Mumuni, F Mumuni - Array, 2022 - Elsevier
To ensure good performance, modern machine learning models typically require large
amounts of quality annotated data. Meanwhile, the data collection and annotation processes …
Knowledge distillation on graphs: A survey
Graph Neural Networks (GNNs) have received significant attention for demonstrating their
capability to handle graph data. However, they are difficult to deploy in resource …
A comprehensive study on large-scale graph training: Benchmarking and rethinking
Large-scale graph training is a notoriously challenging problem for graph neural networks
(GNNs). Due to the nature of incorporating evolving graph structures into the training process, vanilla …
Linkless link prediction via relational distillation
Abstract Graph Neural Networks (GNNs) have shown exceptional performance in the task of
link prediction. Despite their effectiveness, the high latency brought by non-trivial …
GRAPHPATCHER: mitigating degree bias for graph neural networks via test-time augmentation
Recent studies have shown that graph neural networks (GNNs) exhibit strong biases
towards the node degree: they usually perform satisfactorily on high-degree nodes with rich …
Learning MLPs on graphs: A unified view of effectiveness, robustness, and efficiency
While Graph Neural Networks (GNNs) have demonstrated their efficacy in dealing with non-
Euclidean structural data, they are difficult to deploy in real applications due to the …
Bag of tricks for training deeper graph neural networks: A comprehensive benchmark study
Training deep graph neural networks (GNNs) is notoriously hard. Besides the standard
plights in training deep architectures such as vanishing gradients and overfitting, it also …
Graph collaborative signals denoising and augmentation for recommendation
Graph collaborative filtering (GCF) is a popular technique for capturing high-order
collaborative signals in recommendation systems. However, GCF's bipartite adjacency …
Multi-task item-attribute graph pre-training for strict cold-start item recommendation
Recommendation systems suffer in the strict cold-start (SCS) scenario, where the user-item
interactions are entirely unavailable. The well-established, dominating identity (ID)-based …
From trainable negative depth to edge heterophily in graphs
Finding the proper depth $d$ of a graph convolutional network (GCN) that provides strong
representation ability has drawn significant attention, yet largely remains an …