The evolution of distributed systems for graph neural networks and their origin in graph processing and deep learning: A survey

J Vatter, R Mayer, HA Jacobsen - ACM Computing Surveys, 2023 - dl.acm.org
Graph neural networks (GNNs) are an emerging research field. This specialized deep
neural network architecture is capable of processing graph structured data and bridges the …

Large language models on graphs: A comprehensive survey

B Jin, G Liu, C Han, M Jiang, H Ji… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Large language models (LLMs), such as GPT4 and LLaMA, are creating significant
advancements in natural language processing, due to their strong text encoding/decoding …

Exploring the potential of large language models (LLMs) in learning on graphs

Z Chen, H Mao, H Li, W Jin, H Wen, X Wei… - ACM SIGKDD …, 2024 - dl.acm.org
Learning on Graphs has attracted immense attention due to its wide real-world applications.
The most popular pipeline for learning on graphs with textual node attributes primarily relies …

A survey of machine unlearning

TT Nguyen, TT Huynh, Z Ren, PL Nguyen… - arXiv preprint arXiv …, 2022 - arxiv.org
Today, computer systems hold large amounts of personal data. Yet while such an
abundance of data allows breakthroughs in artificial intelligence, and especially machine …

Vision GNN: An image is worth graph of nodes

K Han, Y Wang, J Guo, Y Tang… - Advances in neural …, 2022 - proceedings.neurips.cc
Network architecture plays a key role in the deep learning-based computer vision system.
The widely-used convolutional neural network and transformer treat the image as a grid or …
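
The entry above hinges on one idea: instead of a pixel grid or a patch sequence, the image is treated as a graph whose nodes are patches. A minimal sketch of that construction follows, assuming raw-pixel patch features and a k-nearest-neighbour graph; the patch size, k, and the function name are illustrative, and the paper builds the graph over learned patch embeddings inside the network rather than over raw pixels.

```python
# Minimal sketch: view an image as a graph of patch nodes connected by
# feature-space k-nearest neighbours, in the spirit of Vision GNN.
# Patch size, k, and raw-pixel features are illustrative assumptions.
import numpy as np

def image_to_patch_graph(image: np.ndarray, patch: int = 16, k: int = 8):
    """image: (H, W, C) array with H and W divisible by `patch`."""
    H, W, C = image.shape
    # Cut the image into non-overlapping patches; each patch becomes one node.
    patches = image.reshape(H // patch, patch, W // patch, patch, C)
    nodes = patches.transpose(0, 2, 1, 3, 4).reshape(-1, patch * patch * C)

    # Pairwise squared Euclidean distances between patch features.
    sq = (nodes ** 2).sum(axis=1)
    dist = sq[:, None] + sq[None, :] - 2.0 * nodes @ nodes.T
    np.fill_diagonal(dist, np.inf)  # exclude self-matches from the k-NN step

    # Each node keeps edges to its k closest patches.
    neighbours = np.argsort(dist, axis=1)[:, :k]
    src = np.repeat(np.arange(nodes.shape[0]), k)
    dst = neighbours.reshape(-1)
    return nodes, np.stack([src, dst])  # node features, (2, N*k) edge index

nodes, edge_index = image_to_patch_graph(np.random.rand(224, 224, 3))
```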

SGFormer: Simplifying and empowering transformers for large-graph representations

Q Wu, W Zhao, C Yang, H Zhang… - Advances in …, 2023 - proceedings.neurips.cc
Learning representations on large-sized graphs is a long-standing challenge due to the inter-
dependence among massive data points. Transformers, as an emerging class of …

NodeFormer: A scalable graph structure learning transformer for node classification

Q Wu, W Zhao, Z Li, DP Wipf… - Advances in Neural …, 2022 - proceedings.neurips.cc
Graph neural networks have been extensively studied for learning with inter-connected data.
Despite this, recent evidence has revealed GNNs' deficiencies related to over-squashing …
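
The snippet above points at GNN weaknesses such as over-squashing, which NodeFormer addresses by letting messages pass between all node pairs rather than only along graph edges. The sketch below shows the plain all-pair attention that idea starts from; it is quadratic in the number of nodes, whereas NodeFormer itself uses a kernelized approximation (not reproduced here) to reach linear complexity. All names and shapes are illustrative.

```python
# Minimal sketch of all-pair attention as message passing: every node attends to
# every other node, instead of only its graph neighbours. This is the quadratic
# baseline that NodeFormer approximates with a kernelized operator (not shown).
import numpy as np

def all_pair_attention(X, Wq, Wk, Wv):
    """X: (N, d) node features; Wq/Wk/Wv: (d, d_h) projection matrices."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])        # (N, N) pairwise scores
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # row-wise softmax
    return attn @ V                               # (N, d_h) updated node features

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 32))
Wq, Wk, Wv = (rng.normal(size=(32, 16)) for _ in range(3))
H = all_pair_attention(X, Wq, Wk, Wv)
```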

One embedder, any task: Instruction-finetuned text embeddings

H Su, W Shi, J Kasai, Y Wang, Y Hu… - arXiv preprint arXiv …, 2022 - arxiv.org
We introduce INSTRUCTOR, a new method for computing text embeddings given task
instructions: every text input is embedded together with instructions explaining the use case …
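
The snippet above already states the mechanism: each input is embedded together with an instruction describing its use case, so the same text can map to different vectors for different tasks. Below is a minimal sketch of that idea, assuming a generic Hugging Face encoder checkpoint and mean pooling; the model name, prompt wording, and pooling choice are illustrative assumptions, not the INSTRUCTOR release itself.

```python
# Minimal sketch of instruction-conditioned embedding: prepend a task instruction
# to the input text and encode the pair with a generic encoder. The checkpoint and
# mean pooling are illustrative assumptions, not INSTRUCTOR's own model.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
model = AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")

def embed(instruction: str, text: str) -> torch.Tensor:
    # Instruction and text are embedded together, so the same sentence can
    # receive different vectors under different use-case instructions.
    batch = tokenizer(instruction + " " + text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state        # (1, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1)          # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)   # mean-pooled vector

v_retrieval = embed("Represent the scientific passage for retrieval:", "Graph neural networks ...")
v_cluster = embed("Represent the sentence for clustering:", "Graph neural networks ...")
```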

How attentive are graph attention networks?

S Brody, U Alon, E Yahav - arXiv preprint arXiv:2105.14491, 2021 - arxiv.org
Graph Attention Networks (GATs) are one of the most popular GNN architectures and are
considered the state-of-the-art architecture for representation learning with graphs. In …
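
To make the attention mechanism itself concrete: the paper's change to GAT is to apply the LeakyReLU before the learned attention vector, scoring pairs as a^T LeakyReLU(W [h_i || h_j]) rather than after it. A minimal single-head NumPy sketch of that scoring rule follows; it is an illustration, not the authors' implementation, and the shapes and self-loop convention are assumptions.

```python
# Minimal single-head graph attention sketch in the GATv2 style:
# e_ij = a^T LeakyReLU(W [h_i || h_j]), masked to the graph's edges.
# Plain NumPy illustration; not the authors' implementation.
import numpy as np

def gatv2_layer(H, A, W, a, slope: float = 0.2):
    """H: (N, d) node features; A: (N, N) adjacency with self-loops;
    W: (2d, d_out) weight; a: (d_out,) attention vector."""
    N, d = H.shape
    # Score every ordered pair, then mask out non-edges before the softmax.
    pairs = np.concatenate(
        [np.repeat(H, N, axis=0), np.tile(H, (N, 1))], axis=1
    )                                          # (N*N, 2d): rows are [h_i || h_j]
    z = pairs @ W                              # (N*N, d_out)
    z = np.where(z > 0, z, slope * z)          # LeakyReLU
    e = (z @ a).reshape(N, N)                  # raw attention scores e_ij
    e = np.where(A > 0, e, -np.inf)            # attend only along edges
    e -= e.max(axis=1, keepdims=True)
    alpha = np.exp(e)
    alpha /= alpha.sum(axis=1, keepdims=True)  # softmax over each neighbourhood
    # Aggregate neighbour features transformed by the h_j half of W.
    Wh = H @ W[d:, :]                          # (N, d_out)
    return alpha @ Wh
```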

A comprehensive survey on deep graph representation learning

W Ju, Z Fang, Y Gu, Z Liu, Q Long, Z Qiao, Y Qin… - Neural Networks, 2024 - Elsevier
Graph representation learning aims to effectively encode high-dimensional sparse graph-
structured data into low-dimensional dense vectors, which is a fundamental task that has …
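
The snippet above frames the task as mapping high-dimensional sparse graph data to low-dimensional dense vectors. One of the most common building blocks such surveys cover is a GCN-style propagation step; the sketch below shows a single such step, with the output dimension and random weight chosen purely for illustration, not as the survey's own contribution.

```python
# Minimal sketch of one widely used embedding step: a single GCN-style propagation,
# Z = ReLU(D^{-1/2} (A + I) D^{-1/2} X W), which maps sparse node features X to
# dense low-dimensional vectors Z. Shapes and the random weight are illustrative.
import numpy as np

def gcn_embed(A: np.ndarray, X: np.ndarray, out_dim: int = 64, seed: int = 0):
    """A: (N, N) adjacency; X: (N, d) node features; returns (N, out_dim) embeddings."""
    N = A.shape[0]
    A_hat = A + np.eye(N)                          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # D^{-1/2} as a vector
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    W = np.random.default_rng(seed).normal(scale=0.1, size=(X.shape[1], out_dim))
    return np.maximum(A_norm @ X @ W, 0.0)         # ReLU(Â X W)
```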