NAGphormer: A Tokenized Graph Transformer for Node Classification in Large Graphs. J Chen, K Gao, G Li, K He. The Eleventh International Conference on Learning Representations, 2023. [Cited by 167]

Structural Robust Label Propagation on Homogeneous Graphs. Q He, J Chen, H Xu, K He. 2022 IEEE International Conference on Data Mining (ICDM), pp. 181-190, 2022. [Cited by 12]

Neighborhood Convolutional Graph Neural Network. J Chen, B Li, K He. Knowledge-Based Systems, 111861, 2024. [Cited by 11]

SignGT: Signed Attention-based Graph Transformer for Graph Representation Learning. J Chen, G Li, JE Hopcroft, K He. arXiv preprint arXiv:2310.11025, 2023. [Cited by 9]

NAGphormer+: A Tokenized Graph Transformer With Neighborhood Augmentation for Node Classification in Large Graphs. J Chen, C Liu, K Gao, G Li, K He. IEEE Transactions on Big Data, 2024. [Cited by 6*]

PAMT: A Novel Propagation-based Approach via Adaptive Similarity Mask for Node Classification. J Chen, B Li, Q He, K He. IEEE Transactions on Computational Social Systems, 2024. [Cited by 6]

NTFormer: A Composite Node Tokenized Graph Transformer for Node Classification. J Chen, S Jiang, K He. arXiv preprint arXiv:2406.19249, 2024. [Cited by 5]

Leveraging Contrastive Learning for Enhanced Node Representations in Tokenized Graph Transformers. J Chen, H Liu, JE Hopcroft, K He. The Thirty-eighth Annual Conference on Neural Information Processing Systems, 2024. [Cited by 4]

Adaptive Multi-Neighborhood Attention Based Transformer for Graph Representation Learning. G Li, J Chen, K He. arXiv preprint arXiv:2211.07970, 2022. [Cited by 2]

Mixture of Decoupled Message Passing Experts with Entropy Constraint for General Node Classification. X Chen, J Zhou, J Chen, S Yu, Q Xuan. arXiv preprint arXiv:2502.08083, 2025.

Rethinking Tokenized Graph Transformers for Node Classification. J Chen, C Li, G Li, JE Hopcroft, K He. arXiv preprint arXiv:2502.08101, 2025.

Diversified Node Sampling based Hierarchical Transformer Pooling for Graph Representation Learning. G Li, J Chen, JE Hopcroft, K He. arXiv preprint arXiv:2310.20250, 2023.