Distributed graph neural network training: A survey

Y Shao, H Li, X Gu, H Yin, Y Li, X Miao… - ACM Computing …, 2024 - dl.acm.org
Graph neural networks (GNNs) are a class of deep learning models trained on
graphs and have been successfully applied in various domains. Despite the effectiveness of …

Knowledge distillation on graphs: A survey

Y Tian, S Pei, X Zhang, C Zhang, N Chawla - ACM Computing Surveys, 2023 - dl.acm.org
Graph Neural Networks (GNNs) have received significant attention for demonstrating their
capability to handle graph data. However, they are difficult to deploy in resource …

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …

Linkless link prediction via relational distillation

Z Guo, W Shiao, S Zhang, Y Liu… - International …, 2023 - proceedings.mlr.press
Graph Neural Networks (GNNs) have shown exceptional performance in the task of
link prediction. Despite their effectiveness, the high latency brought by non-trivial …

Extract the knowledge of graph neural networks and go beyond it: An effective knowledge distillation framework

C Yang, J Liu, C Shi - Proceedings of the web conference 2021, 2021 - dl.acm.org
Semi-supervised learning on graphs is an important problem in the machine learning area.
In recent years, state-of-the-art classification methods based on graph neural networks …

Graph attention multi-layer perceptron

W Zhang, Z Yin, Z Sheng, Y Li, W Ouyang, X Li… - Proceedings of the 28th …, 2022 - dl.acm.org
Graph neural networks (GNNs) have achieved great success in many graph-based
applications. However, the enormous size and high sparsity level of graphs hinder their …

Knowledge distillation improves graph structure augmentation for graph neural networks

L Wu, H Lin, Y Huang, SZ Li - Advances in Neural …, 2022 - proceedings.neurips.cc
Graph (structure) augmentation aims to perturb the graph structure through heuristic or
probabilistic rules, enabling the nodes to capture richer contextual information and thus …

Sancus: staleness-aware communication-avoiding full-graph decentralized training in large-scale graph neural networks

J Peng, Z Chen, Y Shao, Y Shen, L Chen… - Proceedings of the VLDB …, 2022 - dl.acm.org
Graph neural networks (GNNs) have emerged due to their success at modeling graph data.
Yet, it is challenging for GNNs to efficiently scale to large graphs. Thus, distributed GNNs …

Quantifying the knowledge in GNNs for reliable distillation into MLPs

L Wu, H Lin, Y Huang, SZ Li - International Conference on …, 2023 - proceedings.mlr.press
To bridge the gap between topology-aware Graph Neural Networks (GNNs) and inference-
efficient Multi-Layer Perceptrons (MLPs), GLNN proposes to distill knowledge from a well …

Extracting low-/high-frequency knowledge from graph neural networks and injecting it into MLPs: An effective GNN-to-MLP distillation framework

L Wu, H Lin, Y Huang, T Fan, SZ Li - … of the AAAI Conference on Artificial …, 2023 - ojs.aaai.org
Recent years have witnessed the great success of Graph Neural Networks (GNNs) in
handling graph-related tasks. However, MLPs remain the primary workhorse for practical …