Distributed graph neural network training: A survey

Y Shao, H Li, X Gu, H Yin, Y Li, X Miao… - ACM Computing …, 2024 - dl.acm.org
Graph neural networks (GNNs) are a type of deep learning model that is trained on
graphs and has been successfully applied in various domains. Despite the effectiveness of …

A comprehensive survey of dynamic graph neural networks: Models, frameworks, benchmarks, experiments and challenges

ZZ Feng, R Wang, TX Wang, M Song, S Wu… - arXiv preprint arXiv …, 2024 - arxiv.org
Dynamic Graph Neural Networks (GNNs) combine temporal information with GNNs to
capture structural, temporal, and contextual relationships in dynamic graphs simultaneously …

BNS-GCN: Efficient full-graph training of graph convolutional networks with partition-parallelism and random boundary node sampling

C Wan, Y Li, A Li, NS Kim, Y Lin - Proceedings of Machine …, 2022 - proceedings.mlsys.org
Graph Convolutional Networks (GCNs) have emerged as the state-of-the-art
method for graph-based learning tasks. However, training GCNs at scale is still challenging …

Scalable and efficient full-graph GNN training for large graphs

X Wan, K Xu, X Liao, Y Jin, K Chen, X Jin - Proceedings of the ACM on …, 2023 - dl.acm.org
Graph Neural Networks (GNNs) have emerged as powerful tools to capture structural
information from graph-structured data, achieving state-of-the-art performance on …

Parallel and distributed graph neural networks: An in-depth concurrency analysis

M Besta, T Hoefler - IEEE Transactions on Pattern Analysis and …, 2024 - ieeexplore.ieee.org
Graph neural networks (GNNs) are among the most powerful tools in deep learning. They
routinely solve complex problems on unstructured networks, such as node classification …

GraphFM: Improving large-scale GNN training via feature momentum

H Yu, L Wang, B Wang, M Liu… - … on machine learning, 2022 - proceedings.mlr.press
Training of graph neural networks (GNNs) for large-scale node classification is challenging.
A key difficulty lies in obtaining accurate hidden node representations while avoiding the …

EXACT: Scalable graph neural networks training via extreme activation compression

Z Liu, K Zhou, F Yang, L Li, R Chen… - … Conference on Learning …, 2021 - openreview.net
Training Graph Neural Networks (GNNs) on large graphs is a fundamental challenge due to
the high memory usage, which is mainly occupied by activations (e.g., node embeddings) …

A survey on graph neural network acceleration: Algorithms, systems, and customized hardware

S Zhang, A Sohrabizadeh, C Wan, Z Huang… - arXiv preprint arXiv …, 2023 - arxiv.org
Graph neural networks (GNNs) are emerging for machine learning research on graph-
structured data. GNNs achieve state-of-the-art performance on many tasks, but they face …

Adaptive message quantization and parallelization for distributed full-graph GNN training

B Wan, J Zhao, C Wu - Proceedings of Machine Learning …, 2023 - proceedings.mlsys.org
Distributed full-graph training of Graph Neural Networks (GNNs) over large graphs is
bandwidth-demanding and time-consuming. Frequent exchanges of node features …

Optimus-CC: Efficient large NLP model training with 3D-parallelism-aware communication compression

J Song, J Yim, J Jung, H Jang, HJ Kim, Y Kim… - Proceedings of the 28th …, 2023 - dl.acm.org
In training of modern large natural language processing (NLP) models, it has become
common practice to split models across multiple GPUs using 3D parallelism. This technique …