Computing graph neural networks: A survey from algorithms to accelerators

S Abadal, A Jain, R Guirado, J López-Alonso… - ACM Computing …, 2021 - dl.acm.org
Graph Neural Networks (GNNs) have exploded onto the machine learning scene in recent
years owing to their capability to model and learn from graph-structured data. Such an ability …

Demystifying graph databases: Analysis and taxonomy of data organization, system designs, and graph queries

M Besta, R Gerstenberger, E Peter, M Fischer… - ACM Computing …, 2023 - dl.acm.org
Numerous irregular graph datasets, for example, social networks or web graphs, may contain
as many as trillions of edges. Often, their structure changes over time and they have domain …

GNNAutoScale: Scalable and expressive graph neural networks via historical embeddings

M Fey, JE Lenssen, F Weichert… - … on machine learning, 2021 - proceedings.mlr.press
We present GNNAutoScale (GAS), a framework for scaling arbitrary message-passing GNNs
to large graphs. GAS prunes entire sub-trees of the computation graph by utilizing historical …
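
The snippet names the core mechanism, historical embeddings. Below is a minimal sketch of that idea, assuming a simple mean aggregator; the HistoryBuffer class, the layer_forward function, and all parameter names are illustrative assumptions, not taken from the GNNAutoScale code. Out-of-mini-batch neighbors are read from a cache of their most recently computed embeddings instead of being recomputed, so each mini-batch only touches its in-batch one-hop neighborhood.

```python
# Illustrative sketch of the historical-embedding idea (not the GNNAutoScale code).
import numpy as np

class HistoryBuffer:
    """Caches the most recently computed embedding of every node for one layer."""
    def __init__(self, num_nodes, dim):
        self.emb = np.zeros((num_nodes, dim), dtype=np.float32)

    def pull(self, node_ids):
        return self.emb[node_ids]

    def push(self, node_ids, values):
        self.emb[node_ids] = values

def layer_forward(x, adj, batch, history, weight):
    """One mean-aggregation layer, evaluated only for the in-batch nodes.

    x       : (N, F) input features of all nodes
    adj     : dict mapping node id -> list of neighbor ids
    batch   : list of node ids in the current mini-batch
    history : HistoryBuffer holding stale outputs of this layer
    weight  : (F, D) dense weight matrix
    """
    in_batch = set(batch)
    # Fresh transforms are computed only for in-batch nodes; the rest of the
    # computation sub-tree is pruned and served from the history instead.
    fresh = {v: x[v] @ weight for v in batch}
    out = np.zeros((len(batch), weight.shape[1]), dtype=np.float32)
    for i, v in enumerate(batch):
        msgs = [fresh[u] if u in in_batch else history.pull([u])[0]  # stale value for out-of-batch neighbors
                for u in adj.get(v, [])]
        out[i] = np.mean(msgs, axis=0) if msgs else fresh[v]
    history.push(batch, out)  # refresh the cache with the newly computed embeddings
    return out

# Tiny usage example on a 4-node path graph.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8)).astype(np.float32)
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
W = rng.standard_normal((8, 8)).astype(np.float32)
hist = HistoryBuffer(num_nodes=4, dim=8)
h_batch = layer_forward(x, adj, batch=[0, 1], history=hist, weight=W)
```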

PaGraph: Scaling GNN training on large graphs via computation-aware caching

Z Lin, C Li, Y Miao, Y Liu, Y Xu - … of the 11th ACM Symposium on Cloud …, 2020 - dl.acm.org
Emerging graph neural networks (GNNs) have extended the successes of deep learning
techniques on datasets such as images and text to more complex graph-structured data …

DistGNN: Scalable distributed training for large-scale graph neural networks

V Md, S Misra, G Ma, R Mohanty, E Georganas… - Proceedings of the …, 2021 - dl.acm.org
Full-batch training of Graph Neural Networks (GNNs) to learn the structure of large graphs is
a critical problem that needs to scale to hundreds of compute nodes to be feasible. It is …

Sancus: Staleness-aware communication-avoiding full-graph decentralized training in large-scale graph neural networks

J Peng, Z Chen, Y Shao, Y Shen, L Chen… - Proceedings of the VLDB …, 2022 - dl.acm.org
Graph neural networks (GNNs) have emerged due to their success at modeling graph data.
Yet, it is challenging for GNNs to efficiently scale to large graphs. Thus, distributed GNNs …

BNS-GCN: Efficient full-graph training of graph convolutional networks with partition-parallelism and random boundary node sampling

C Wan, Y Li, A Li, NS Kim, Y Lin - Proceedings of Machine …, 2022 - proceedings.mlsys.org
Graph Convolutional Networks (GCNs) have emerged as the state-of-the-art method for
graph-based learning tasks. However, training GCNs at scale is still challenging …
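
The title names the mechanism, random boundary-node sampling under partition-parallel training. The sketch below is a simplified, hypothetical illustration of that idea (the function names and the two-partition example are assumptions, not the released BNS-GCN code): every partition keeps all of its inner nodes, but each epoch only a random fraction p of the boundary nodes (nodes with a neighbor in another partition) is exchanged, which caps cross-partition communication volume.

```python
# Simplified illustration of random boundary-node sampling (not the BNS-GCN code).
import random

def boundary_nodes(edges, part):
    """Nodes with at least one neighbor assigned to a different partition."""
    boundary = set()
    for u, v in edges:
        if part[u] != part[v]:
            boundary.update((u, v))
    return boundary

def sample_boundary(edges, part, p, rng=random):
    """Keep every inner node, but only a p-fraction of boundary nodes per epoch."""
    boundary = boundary_nodes(edges, part)
    kept = {v for v in boundary if rng.random() < p}
    # Drop edges that touch a boundary node which was not sampled this epoch.
    kept_edges = [(u, v) for u, v in edges
                  if (u not in boundary or u in kept)
                  and (v not in boundary or v in kept)]
    return kept, kept_edges

# Example: a 6-node ring split into two partitions of three nodes each.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
part = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
kept, kept_edges = sample_boundary(edges, part, p=0.5)
print("sampled boundary nodes:", kept)
print("edges used this epoch:", kept_edges)
```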

DistDGL: Distributed graph neural network training for billion-scale graphs

D Zheng, C Ma, M Wang, J Zhou, Q Su… - 2020 IEEE/ACM 10th …, 2020 - ieeexplore.ieee.org
Graph neural networks (GNNs) have shown great success in learning from graph-structured
data. They are widely used in various applications, such as recommendation, fraud …

Distributed graph neural network training: A survey

Y Shao, H Li, X Gu, H Yin, Y Li, X Miao… - ACM Computing …, 2024 - dl.acm.org
Graph neural networks (GNNs) are a class of deep learning models trained on
graphs and have been successfully applied in various domains. Despite the effectiveness of …

PipeGCN: Efficient full-graph training of graph convolutional networks with pipelined feature communication

C Wan, Y Li, CR Wolfe, A Kyrillidis, NS Kim… - arXiv preprint arXiv …, 2022 - arxiv.org
Graph Convolutional Networks (GCNs) are the state-of-the-art method for learning graph-
structured data, and training large-scale GCNs requires distributed training across multiple …