Can Xu
Microsoft AI
Verified email at microsoft.com - Homepage
Title · Cited by · Year
WizardLM: Empowering large language models to follow complex instructions
C Xu, Q Sun, K Zheng, X Geng, P Zhao, J Feng, C Tao, D Jiang
ICLR-2024, 2023
Cited by 837* · 2023
Phi-3 technical report: A highly capable language model locally on your phone
M Abdin, J Aneja, H Awadalla, A Awadallah, AA Awan, N Bach, A Bahree, ...
arXiv preprint arXiv:2404.14219, 2024
Cited by 792 · 2024
WizardCoder: Empowering code large language models with Evol-Instruct
Z Luo, C Xu, P Zhao, Q Sun, X Geng, W Hu, C Tao, J Ma, Q Lin, D Jiang
ICLR-2024, 2023
Cited by 559 · 2023
WizardMath: Empowering mathematical reasoning for large language models via reinforced Evol-Instruct
H Luo, Q Sun, C Xu, P Zhao, J Lou, C Tao, X Geng, Q Lin, S Chen, ...
arXiv preprint arXiv:2308.09583, 2023
Cited by 338 · 2023
Knowledge-grounded dialogue generation with pre-trained language models
X Zhao, W Wu, C Xu, C Tao, D Zhao, R Yan
EMNLP 2020, 2020
Cited by 226 · 2020
Multi-representation fusion network for multi-turn response selection in retrieval-based chatbots
C Tao, W Wu, C Xu, W Hu, D Zhao, R Yan
Proceedings of the twelfth ACM international conference on web search and …, 2019
Cited by 156 · 2019
One time of interaction may not be enough: Go deep with an interaction-over-interaction network for response selection in dialogues
C Tao, W Wu, C Xu, W Hu, D Zhao, R Yan
Proceedings of the 57th annual meeting of the association for computational …, 2019
Cited by 140 · 2019
A survey on knowledge distillation of large language models
X Xu, M Li, C Tao, T Shen, R Cheng, J Li, C Xu, D Tao, T Zhou
arXiv preprint arXiv:2402.13116, 2024
Cited by 120 · 2024
Low-Resource Knowledge-Grounded Dialogue Generation
X Zhao, W Wu, C Tao, C Xu, D Zhao, R Yan
ICLR-2020, 2020
Cited by 117 · 2020
A sequential matching framework for multi-turn response selection in retrieval-based chatbots
Y Wu, W Wu, C Xing, C Xu, Z Li, M Zhou
Computational Linguistics 45 (1), 163-197, 2019
Cited by 96 · 2019
PromDA: Prompt-based Data Augmentation for Low-Resource NLU Tasks
Y Wang, C Xu, Q Sun, H Hu, C Tao, X Geng, D Jiang
ACL 2022, 2022
Cited by 89 · 2022
Neural response generation with dynamic vocabularies
Y Wu, W Wu, D Yang, C Xu, Z Li, M Zhou
AAAI-2018, 2017
Cited by 83 · 2017
Zero-Resource Knowledge-Grounded Dialogue Generation
L Li, C Xu, W Wu, Y Zhao, X Zhao, C Tao
NeurIPS 2020, 2020
Cited by 80 · 2020
ProphetNet-X: Large-Scale Pre-training Models for English, Chinese, Multi-lingual, Dialog, and Code Generation
W Qi, Y Gong, Y Yan, C Xu, B Yao, B Zhou, B Cheng, D Jiang, J Chen, ...
Demo of ACL 2021, 2021
Cited by 62 · 2021
MPC-BERT: A Pre-Trained Language Model for Multi-Party Conversation Understanding
JC Gu, C Tao, ZH Ling, C Xu, X Geng, D Jiang
ACL 2021, 2021
Cited by 54 · 2021
Multimodal Dialogue Response Generation
Q Sun, Y Wang, C Xu, K Zheng, Y Yang, H Hu, F Xu, J Zhang, X Geng, ...
ACL 2022, 2021
Cited by 53 · 2021
Towards robust ranker for text retrieval
Y Zhou, T Shen, X Geng, C Tao, C Xu, G Long, B Jiao, D Jiang
arXiv preprint arXiv:2206.08063, 2022
Cited by 50 · 2022
Learning Neural Templates for Recommender Dialogue System
Z Liang, H Hu, C Xu, J Miao, Y He, Y Chen, X Geng, F Liang, D Jiang
EMNLP 2021, 2021
Cited by 48 · 2021
MMDialog: A large-scale multi-turn dialogue dataset towards multi-modal open-domain conversation
J Feng, Q Sun, C Xu, P Zhao, Y Yang, C Tao, D Zhao, Q Lin
ACL-2023, 2022
Cited by 46 · 2022
A document-grounded matching network for response selection in retrieval-based chatbots
X Zhao, C Tao, W Wu, C Xu, D Zhao, R Yan
IJCAI-2019, 2019
Cited by 46 · 2019
Articles 1–20