Knowledge distillation on graphs: A survey

Y Tian, S Pei, X Zhang, C Zhang, N Chawla - ACM Computing Surveys, 2023 - dl.acm.org
Graph Neural Networks (GNNs) have received significant attention for demonstrating their
capability to handle graph data. However, they are difficult to deploy in resource …

Demystifying structural disparity in graph neural networks: Can one size fit all?

H Mao, Z Chen, W Jin, H Han, Y Ma… - Advances in neural …, 2024 - proceedings.neurips.cc
Recent studies on Graph Neural Networks (GNNs) provide both empirical and
theoretical evidence supporting their effectiveness in capturing structural patterns on both …

Trustworthy graph neural networks: Aspects, methods and trends

H Zhang, B Wu, X Yuan, S Pan, H Tong… - arXiv preprint arXiv …, 2022 - arxiv.org
Graph neural networks (GNNs) have emerged as a series of competent graph learning
methods for diverse real-world scenarios, ranging from daily applications like …

Knowledge distillation improves graph structure augmentation for graph neural networks

L Wu, H Lin, Y Huang, SZ Li - Advances in Neural …, 2022 - proceedings.neurips.cc
Graph (structure) augmentation aims to perturb the graph structure through heuristic or
probabilistic rules, enabling the nodes to capture richer contextual information and thus …

Towards graph foundation models: A survey and beyond

J Liu, C Yang, Z Lu, J Chen, Y Li, M Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
Emerging as fundamental building blocks for diverse artificial intelligence applications,
foundation models have achieved notable success across natural language processing and …

Graph attention multi-layer perceptron

W Zhang, Z Yin, Z Sheng, Y Li, W Ouyang, X Li… - Proceedings of the 28th …, 2022 - dl.acm.org
Graph neural networks (GNNs) have achieved great success in many graph-based
applications. However, the enormous size and high sparsity level of graphs hinder their …

Quantifying the knowledge in GNNs for reliable distillation into MLPs

L Wu, H Lin, Y Huang, SZ Li - International Conference on …, 2023 - proceedings.mlr.press
To bridge the gap between topology-aware Graph Neural Networks (GNNs) and inference-
efficient Multi-Layer Perceptrons (MLPs), GLNN proposes to distill knowledge from a well …
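The GNN-to-MLP distillation that this entry (and GLNN, which it builds on) revolves around amounts to training an MLP student to match a GNN teacher's temperature-softened predictions. A minimal sketch in plain NumPy; the function names, temperature, and epsilon are illustrative assumptions, not the papers' actual code:

```python
import numpy as np

def softmax(logits, T=1.0):
    # temperature-scaled softmax, numerically stabilized
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(teacher_logits, student_logits, T=2.0, eps=1e-12):
    # KL(teacher || student) on softened distributions, averaged
    # over nodes; the usual T^2 factor rescales the gradients
    p = softmax(teacher_logits, T)  # GNN teacher soft labels
    q = softmax(student_logits, T)  # MLP student predictions
    kl = (p * (np.log(p + eps) - np.log(q + eps))).sum(axis=-1)
    return float(kl.mean() * T * T)
```

Trained on this loss (typically combined with cross-entropy on labeled nodes), the MLP needs no graph neighborhood at inference time, which is the efficiency gain these entries target.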

Symbolic knowledge extraction and injection with sub-symbolic predictors: A systematic literature review

G Ciatto, F Sabbatini, A Agiollo, M Magnini… - ACM Computing …, 2024 - dl.acm.org
In this article, we focus on the opacity issue of sub-symbolic machine learning predictors by
promoting two complementary activities—symbolic knowledge extraction (SKE) and …

Linkless link prediction via relational distillation

Z Guo, W Shiao, S Zhang, Y Liu… - International …, 2023 - proceedings.mlr.press
Graph Neural Networks (GNNs) have shown exceptional performance in the task of
link prediction. Despite their effectiveness, the high latency brought by non-trivial …
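The relational distillation idea in the entry above transfers link-prediction ability by matching pairwise node scores rather than per-node logits. A rough NumPy sketch; the dot-product scoring and MSE objective are simplifying assumptions, not the paper's exact formulation:

```python
import numpy as np

def link_scores(emb, pairs):
    # dot-product link score for each candidate node pair (i, j)
    return np.array([emb[i] @ emb[j] for i, j in pairs])

def relational_kd_loss(teacher_emb, student_emb, pairs):
    # match the student's pairwise link scores to the teacher's
    # over a set of sampled node pairs (MSE objective)
    t = link_scores(teacher_emb, pairs)
    s = link_scores(student_emb, pairs)
    return float(np.mean((t - s) ** 2))
```

Supervising pair scores instead of node labels is what makes the student usable for link prediction without GNN message passing at serving time.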

Learning mlps on graphs: A unified view of effectiveness, robustness, and efficiency

Y Tian, C Zhang, Z Guo, X Zhang… - … Conference on Learning …, 2022 - openreview.net
While Graph Neural Networks (GNNs) have demonstrated their efficacy in dealing with
non-Euclidean structural data, they are difficult to deploy in real applications due to the …