Xiaohan Xu
Verified email at connect.hku.hk - Homepage
Title
Cited by
Year
A survey on knowledge distillation of large language models
X Xu, M Li, C Tao, T Shen, R Cheng, J Li, C Xu, D Tao, T Zhou
arXiv preprint arXiv:2402.13116, 2024
120, 2024
Subgraph Neighboring Relations Infomax for Inductive Link Prediction on Knowledge Graphs
X Xu, P Zhang, Y He, C Chao, C Yan
IJCAI 2022, 2022
63, 2022
Leveraging large language models for nlg evaluation: Advances and challenges
Z Li, X Xu, T Shen, C Xu, JC Gu, Y Lai, C Tao, S Ma
Proceedings of the 2024 Conference on Empirical Methods in Natural Language …, 2024
54*, 2024
Cross-modal contrastive learning for multimodal fake news detection
L Wang, C Zhang, H Xu, Y Xu, X Xu, S Wang
Proceedings of the 31st ACM international conference on multimedia, 5696-5704, 2023
49, 2023
Re-reading improves reasoning in large language models
X Xu, C Tao, T Shen, C Xu, H Xu, G Long, JG Lou, S Ma
Proceedings of the 2024 Conference on Empirical Methods in Natural Language …, 2024
39*, 2024
Mmidr: Teaching large language model to interpret multimodal misinformation via knowledge distillation
L Wang, X Xu, L Zhang, J Lu, Y Xu, H Xu, M Tang, C Zhang
arXiv preprint arXiv:2403.14171, 2024
6, 2024
Articles 1–6