Data-centric artificial intelligence: A survey

D Zha, ZP Bhat, KH Lai, F Yang, Z Jiang… - ACM Computing …, 2025 - dl.acm.org
Artificial Intelligence (AI) is making a profound impact in almost every domain. A vital enabler
of its great success is the availability of abundant and high-quality data for building machine …

Data-centric AI: Perspectives and challenges

D Zha, ZP Bhat, KH Lai, F Yang, X Hu - Proceedings of the 2023 SIAM …, 2023 - SIAM
The role of data in building AI systems has recently been significantly magnified by the
emerging concept of data-centric AI (DCAI), which advocates a fundamental shift from model …

A survey on oversmoothing in graph neural networks

TK Rusch, MM Bronstein, S Mishra - arXiv preprint arXiv:2303.10993, 2023 - arxiv.org
Node features of graph neural networks (GNNs) tend to become more similar as the network depth increases. This effect is known as over-smoothing, which we …
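For readers unfamiliar with the effect, here is a minimal NumPy sketch (not taken from the paper; the toy graph, random features, and depth schedule are illustrative assumptions) showing how repeated random-walk-normalized propagation drives node features toward a common value:

```python
import numpy as np

# Small undirected path graph 0-1-2-3 with self-loops (illustrative choice).
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
P = A / A.sum(axis=1, keepdims=True)       # row-stochastic propagation matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))                # random 2-d node features

for depth in (1, 5, 10, 50):
    Xk = np.linalg.matrix_power(P, depth) @ X
    spread = np.var(Xk, axis=0).sum()      # how spread out the node features are
    print(f"depth {depth:2d}: feature variance across nodes = {spread:.2e}")

# The printed variance decays toward zero with depth: all node representations
# collapse to (nearly) the same point, i.e., they over-smooth.
```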

GPPT: Graph pre-training and prompt tuning to generalize graph neural networks

M Sun, K Zhou, X He, Y Wang, X Wang - Proceedings of the 28th ACM …, 2022 - dl.acm.org
Despite the promising representation learning of graph neural networks (GNNs), the
supervised training of GNNs notoriously requires large amounts of labeled data from each …

Convolutional neural networks on graphs with Chebyshev approximation, revisited

M He, Z Wei, JR Wen - Advances in neural information …, 2022 - proceedings.neurips.cc
Designing spectral convolutional networks is a challenging problem in graph learning.
ChebNet, one of the early attempts, approximates the spectral graph convolutions using …
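As background for this entry, the sketch below shows the Chebyshev-polynomial graph filter that ChebNet builds on: the filter output is a sum of Chebyshev polynomials of a rescaled normalized Laplacian applied to the node features. The toy graph, features, and fixed coefficients `theta` are illustrative assumptions; in ChebNet the coefficients are learned.

```python
import numpy as np

def cheb_filter(A, X, theta):
    """Apply sum_k theta[k] * T_k(L_tilde) @ X, with T_k the Chebyshev
    polynomials and L_tilde the normalized Laplacian rescaled to [-1, 1].
    Assumes at least two coefficients."""
    n = A.shape[0]
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt        # normalized Laplacian
    lam_max = np.linalg.eigvalsh(L).max()
    L_tilde = (2.0 / lam_max) * L - np.eye(n)          # spectrum rescaled to [-1, 1]

    T_prev, T_curr = X, L_tilde @ X                    # T_0(L)X and T_1(L)X
    out = theta[0] * T_prev + theta[1] * T_curr
    for k in range(2, len(theta)):
        T_next = 2.0 * (L_tilde @ T_curr) - T_prev     # Chebyshev recurrence
        out += theta[k] * T_next
        T_prev, T_curr = T_curr, T_next
    return out

# Toy usage on a 4-node path graph with random features and three coefficients.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(1).normal(size=(4, 2))
print(cheb_filter(A, X, theta=[0.5, 0.3, 0.2]))
```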

How universal polynomial bases enhance spectral graph neural networks: Heterophily, over-smoothing, and over-squashing

K Huang, YG Wang, M Li - arXiv preprint arXiv:2405.12474, 2024 - arxiv.org
Spectral Graph Neural Networks (GNNs), alternatively known as graph filters, have gained
increasing prevalence for heterophily graphs. Optimal graph filters rely on Laplacian …

PC-Conv: Unifying homophily and heterophily with two-fold filtering

B Li, E Pan, Z Kang - Proceedings of the AAAI conference on artificial …, 2024 - ojs.aaai.org
Recently, many carefully designed graph representation learning methods have achieved
impressive performance on either strongly heterophilic or homophilic graphs, but not both …

A comprehensive study on large-scale graph training: Benchmarking and rethinking

K Duan, Z Liu, P Wang, W Zheng… - Advances in …, 2022 - proceedings.neurips.cc
Large-scale graph training is a notoriously challenging problem for graph neural networks
(GNNs). Because evolving graph structures must be incorporated into the training process, vanilla …

D4Explainer: In-distribution explanations of graph neural network via discrete denoising diffusion

J Chen, S Wu, A Gupta, R Ying - Advances in Neural …, 2023 - proceedings.neurips.cc
The widespread deployment of Graph Neural Networks (GNNs) sparks significant interest in
their explainability, which plays a vital role in model auditing and ensuring trustworthy graph …

Anti-symmetric DGN: a stable architecture for deep graph networks

A Gravina, D Bacciu, C Gallicchio - arXiv preprint arXiv:2210.09789, 2022 - arxiv.org
Deep Graph Networks (DGNs) currently dominate the research landscape of learning from
graphs, due to their efficiency and ability to implement an adaptive message-passing …