A comprehensive survey on graph neural network accelerators

J Liu, S Chen, L Shen - Frontiers of Computer Science, 2025 - Springer
Deep learning has achieved superior accuracy on Euclidean-structured data with neural networks. As a result, non-Euclidean-structured data, such as graph data, has more sophisticated …

GraphA: An efficient ReRAM-based architecture to accelerate large scale graph processing

SA Ghasemi, B Jahannia, H Farbeh - Journal of Systems Architecture, 2022 - Elsevier
Graph analytics is the basis for many modern applications, e.g., machine learning and
streaming data problems. With an unprecedented increase in the data size of many emerging …

RelHD: A graph-based learning on FeFET with hyperdimensional computing

J Kang, M Zhou, A Bhansali, W Xu… - 2022 IEEE 40th …, 2022 - ieeexplore.ieee.org
Advances in graph neural network (GNN)-based algorithms enable machine learning on
relational data. GNNs are computationally demanding since they rely upon backpropagation …

Sparse attention acceleration with synergistic in-memory pruning and on-chip recomputation

A Yazdanbakhsh, A Moradifirouzabadi… - 2022 55th IEEE/ACM …, 2022 - ieeexplore.ieee.org
As its core computation, a self-attention mechanism gauges pairwise correlations across the
entire input sequence. Despite favorable performance, calculating pairwise correlations is …
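
The quadratic cost hinted at above comes from scaled dot-product attention, where every token is scored against every other token. A minimal NumPy sketch of that scoring step (single head, no masking; the projection matrices Wq and Wk are illustrative placeholders, not taken from the paper):

    import numpy as np

    def attention_weights(X, Wq, Wk):
        # Project the input sequence into queries and keys.
        Q = X @ Wq                                   # (seq_len, d_k)
        K = X @ Wk                                   # (seq_len, d_k)
        # Pairwise correlations: an O(seq_len^2) score matrix.
        scores = (Q @ K.T) / np.sqrt(Q.shape[-1])
        # Row-wise softmax turns scores into attention weights.
        e = np.exp(scores - scores.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

Sparse-attention accelerators aim to skip most entries of this score matrix rather than compute them all.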

IMGA: Efficient in-memory graph convolution network aggregation with data flow optimizations

Y Wei, X Wang, S Zhang, J Yang, X Jia… - … on Computer-Aided …, 2023 - ieeexplore.ieee.org
Aggregating features from neighbor vertices is a fundamental operation in a graph convolution
network (GCN). However, the sparsity in graph data creates poor spatial and temporal …
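
Concretely, this aggregation step is a sparse-matrix / dense-matrix product between the (normalized) adjacency matrix and the vertex feature matrix. A minimal SciPy sketch with symmetric normalization (a plain software baseline, not the paper's in-memory data-flow scheme):

    import numpy as np
    import scipy.sparse as sp

    def gcn_aggregate(adj, features):
        # adj: sparse (N, N) adjacency with self-loops; features: dense (N, F).
        # Symmetric normalization D^{-1/2} A D^{-1/2}.
        deg = np.asarray(adj.sum(axis=1)).ravel()
        d_inv_sqrt = sp.diags(1.0 / np.sqrt(np.maximum(deg, 1.0)))
        a_norm = d_inv_sqrt @ adj @ d_inv_sqrt
        # Each vertex sums the scaled features of its neighbors; the irregular
        # sparsity of a_norm is what causes the poor locality noted above.
        return a_norm @ features

    # Example: a 3-vertex path graph (self-loops included) with 4-dim features.
    adj = sp.csr_matrix(np.array([[1., 1., 0.], [1., 1., 1.], [0., 1., 1.]]))
    h = gcn_aggregate(adj, np.random.rand(3, 4))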

ReAIM: A ReRAM-based Adaptive Ising Machine for Solving Combinatorial Optimization Problems

HW Chiang, CF Nien, HY Cheng… - 2024 ACM/IEEE 51st …, 2024 - ieeexplore.ieee.org
Recently, in light of the success of quantum computers, research teams have actively
developed quantum-inspired computers using classical computing technology. One notable …

GCIM: Towards Efficient Processing of Graph Convolutional Networks in 3D-Stacked Memory

J Chen, Y Lin, K Sun, J Chen, C Ma… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Graph convolutional networks (GCNs) have become a powerful deep learning approach for
graph-structured data. Different from traditional neural networks such as convolutional …

PASGCN: An ReRAM-based PIM design for GCN with adaptively sparsified graphs

T Yang, D Li, F Ma, Z Song, Y Zhao… - … on Computer-Aided …, 2022 - ieeexplore.ieee.org
A graph convolutional network (GCN) is a promising but computing- and memory-intensive
learning model. Processing-in-memory (PIM) architecture based on the resistive random …

GAS: General-Purpose In-Memory-Computing Accelerator for Sparse Matrix Multiplication

X Zhang, Z Li, R Liu, X Chen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Sparse matrix multiplication is widely used in various practical applications. Different
accelerators have been proposed to speed up sparse matrix-dense vector multiplication …
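
For context, the sparse matrix-dense vector case these accelerators target reduces to iterating over stored nonzeros only. A minimal CSR-format multiply in plain Python (a software reference point, not the GAS design):

    def spmv_csr(values, col_idx, row_ptr, x):
        # values/col_idx/row_ptr: CSR arrays of a sparse matrix; x: dense vector.
        # Work scales with the number of nonzeros rather than with n^2.
        y = [0.0] * (len(row_ptr) - 1)
        for row in range(len(row_ptr) - 1):
            for k in range(row_ptr[row], row_ptr[row + 1]):
                y[row] += values[k] * x[col_idx[k]]
        return y

    # Example: [[4, 0], [0, 3]] times [1, 2] gives [4, 6].
    print(spmv_csr([4.0, 3.0], [0, 1], [0, 1, 2], [1.0, 2.0]))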

DCIM-GCN: Digital computing-in-memory to efficiently accelerate graph convolutional networks

Y Qiu, Y Ma, W Zhao, M Wu, L Ye… - Proceedings of the 41st …, 2022 - dl.acm.org
Computing-in-memory (CIM) is emerging as a promising architecture to accelerate graph
convolutional networks (GCNs) normally bounded by redundant and irregular memory …