Can Xu
Microsoft AI
Verified email at microsoft.com - Homepage
Title
Cited by
Year
Phi-3 technical report: A highly capable language model locally on your phone
M Abdin, J Aneja, H Awadalla, A Awadallah, AA Awan, N Bach, A Bahree, ...
arXiv preprint arXiv:2404.14219, 2024
861 · 2024
Wizardlm: Empowering large language models to follow complex instructions
C Xu, Q Sun, K Zheng, X Geng, P Zhao, J Feng, C Tao, D Jiang
ICLR-2024, 2023
851* · 2023
Wizardcoder: Empowering code large language models with evol-instruct
Z Luo, C Xu, P Zhao, Q Sun, X Geng, W Hu, C Tao, J Ma, Q Lin, D Jiang
ICLR-2024, 2023
572 · 2023
Wizardmath: Empowering mathematical reasoning for large language models via reinforced evol-instruct
H Luo, Q Sun, C Xu, P Zhao, J Lou, C Tao, X Geng, Q Lin, S Chen, ...
arXiv preprint arXiv:2308.09583, 2023
355 · 2023
Knowledge-grounded dialogue generation with pre-trained language models
X Zhao, W Wu, C Xu, C Tao, D Zhao, R Yan
EMNLP 2020, 2020
223 · 2020
Multi-representation fusion network for multi-turn response selection in retrieval-based chatbots
C Tao, W Wu, C Xu, W Hu, D Zhao, R Yan
Proceedings of the twelfth ACM international conference on web search and …, 2019
157 · 2019
One time of interaction may not be enough: Go deep with an interaction-over-interaction network for response selection in dialogues
C Tao, W Wu, C Xu, W Hu, D Zhao, R Yan
Proceedings of the 57th annual meeting of the association for computational …, 2019
140 · 2019
A survey on knowledge distillation of large language models
X Xu, M Li, C Tao, T Shen, R Cheng, J Li, C Xu, D Tao, T Zhou
arXiv preprint arXiv:2402.13116, 2024
133 · 2024
Low-Resource Knowledge-Grounded Dialogue Generation
X Zhao, W Wu, C Tao, C Xu, D Zhao, R Yan
ICLR-2020, 2020
117 · 2020
A sequential matching framework for multi-turn response selection in retrieval-based chatbots
Y Wu, W Wu, C Xing, C Xu, Z Li, M Zhou
Computational Linguistics 45 (1), 163-197, 2019
97 · 2019
PromDA: Prompt-based Data Augmentation for Low-Resource NLU Tasks
Y Wang, C Xu, Q Sun, H Hu, C Tao, X Geng, D Jiang
ACL 2022, 2022
92 · 2022
Neural response generation with dynamic vocabularies
Y Wu, W Wu, D Yang, C Xu, Z Li, M Zhou
AAAI-2018, 2017
82 · 2017
Zero-Resource Knowledge-Grounded Dialogue Generation
L Li, C Xu, W Wu, Y Zhao, X Zhao, C Tao
NeurIPS 2020, 2020
76 · 2020
ProphetNet-X: Large-Scale Pre-training Models for English, Chinese, Multi-lingual, Dialog, and Code Generation
W Qi, Y Gong, Y Yan, C Xu, B Yao, B Zhou, B Cheng, D Jiang, J Chen, ...
Demo of ACL 2021, 2021
60 · 2021
MPC-BERT: A Pre-Trained Language Model for Multi-Party Conversation Understanding
JC Gu, C Tao, ZH Ling, C Xu, X Geng, D Jiang
ACL 2021, 2021
55 · 2021
Multimodal Dialogue Response Generation
Q Sun, Y Wang, C Xu, K Zheng, Y Yang, H Hu, F Xu, J Zhang, X Geng, ...
ACL 2022, 2021
53 · 2021
Mmdialog: A large-scale multi-turn dialogue dataset towards multi-modal open-domain conversation
J Feng, Q Sun, C Xu, P Zhao, Y Yang, C Tao, D Zhao, Q Lin
ACL-2023, 2022
49 · 2022
Towards robust ranker for text retrieval
Y Zhou, T Shen, X Geng, C Tao, C Xu, G Long, B Jiao, D Jiang
arXiv preprint arXiv:2206.08063, 2022
49 · 2022
Learning Neural Templates for Recommender Dialogue System
Z Liang, H Hu, C Xu, J Miao, Y He, Y Chen, X Geng, F Liang, D Jiang
EMNLP 2021, 2021
48 · 2021
Leveraging large language models for nlg evaluation: A survey
Z Li, X Xu, T Shen, C Xu, JC Gu, C Tao
arXiv e-prints, arXiv: 2401.07103, 2024
46 · 2024
Articles 1–20