X Xu, M Li, C Tao, T Shen, R Cheng, J Li, C Xu, D Tao, T Zhou. A survey on knowledge distillation of large language models. arXiv preprint arXiv:2402.13116, 2024. Cited by 120.

X Xu, P Zhang, Y He, C Chao, C Yan. Subgraph Neighboring Relations Infomax for Inductive Link Prediction on Knowledge Graphs. IJCAI, 2022. Cited by 63.

Z Li, X Xu, T Shen, C Xu, JC Gu, Y Lai, C Tao, S Ma. Leveraging large language models for NLG evaluation: Advances and challenges. Proceedings of the 2024 Conference on Empirical Methods in Natural Language …, 2024. Cited by 54*.

L Wang, C Zhang, H Xu, Y Xu, X Xu, S Wang. Cross-modal contrastive learning for multimodal fake news detection. Proceedings of the 31st ACM International Conference on Multimedia, 5696-5704, 2023. Cited by 49.

X Xu, C Tao, T Shen, C Xu, H Xu, G Long, JG Lou, S Ma. Re-reading improves reasoning in large language models. Proceedings of the 2024 Conference on Empirical Methods in Natural Language …, 2024. Cited by 39*.

L Wang, X Xu, L Zhang, J Lu, Y Xu, H Xu, M Tang, C Zhang. MMIDR: Teaching large language model to interpret multimodal misinformation via knowledge distillation. arXiv preprint arXiv:2403.14171, 2024. Cited by 6.