Tianyi Tang
Also known as: 唐天一
Qwen Team, Alibaba Group & Renmin University of China
Verified email at alibaba-inc.com · Homepage
Title · Cited by · Year
A survey of large language models
WX Zhao, K Zhou, J Li, T Tang, X Wang, Y Hou, Y Min, B Zhang, J Zhang, ...
arXiv preprint arXiv:2303.18223, 2023
Cited by 4407* · 2023
Qwen2.5 Technical Report
A Yang, B Yang, B Zhang, B Hui, B Zheng, B Yu, C Li, D Liu, F Huang, ...
arXiv preprint arXiv:2412.15115, 2024
Cited by 1082* · 2024
A survey of pretrained language models based text generation
J Li, T Tang, WX Zhao, JY Nie, JR Wen
arXiv preprint arXiv:2201.05273, 2022
Cited by 453* · 2022
Not all languages are created equal in LLMs: Improving multilingual capability by cross-lingual-thought prompting
H Huang, T Tang, D Zhang, WX Zhao, T Song, Y Xia, F Wei
arXiv preprint arXiv:2305.07004, 2023
Cited by 105 · 2023
A survey on long text modeling with transformers
Z Dong, T Tang, L Li, WX Zhao
arXiv preprint arXiv:2302.14502, 2023
Cited by 57 · 2023
BAMBOO: A comprehensive benchmark for evaluating long text modeling capacities of large language models
Z Dong, T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2309.13345, 2023
Cited by 55 · 2023
Few-shot knowledge graph-to-text generation with pretrained language models
J Li, T Tang, WX Zhao, Z Wei, NJ Yuan, JR Wen
arXiv preprint arXiv:2106.01623, 2021
Cited by 54 · 2021
Language-specific neurons: The key to multilingual capabilities in large language models
T Tang, W Luo, H Huang, D Zhang, X Wang, X Zhao, F Wei, JR Wen
arXiv preprint arXiv:2402.16438, 2024
Cited by 50 · 2024
Learning to transfer prompts for text generation
J Li, T Tang, JY Nie, JR Wen, WX Zhao
arXiv preprint arXiv:2205.01543, 2022
Cited by 42 · 2022
TextBox 2.0: A text generation library with pre-trained language models
T Tang, J Li, Z Chen, Y Hu, Z Yu, W Dai, Z Dong, X Cheng, Y Wang, ...
arXiv preprint arXiv:2212.13005, 2022
Cited by 40* · 2022
MVP: Multi-task supervised pre-training for natural language generation
T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2206.12131, 2022
Cited by 36 · 2022
Context-tuning: Learning contextualized prompts for natural language generation
T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2201.08670, 2022
Cited by 32 · 2022
The web can be your oyster for improving large language models
J Li, T Tang, WX Zhao, J Wang, JY Nie, JR Wen
arXiv preprint arXiv:2305.10998, 2023
Cited by 21* · 2023
Beyond imitation: Leveraging fine-grained quality signals for alignment
G Guo, R Zhao, T Tang, WX Zhao, JR Wen
arXiv preprint arXiv:2311.04072, 2023
Cited by 20 · 2023
ELMER: A non-autoregressive pre-trained language model for efficient and effective text generation
J Li, T Tang, WX Zhao, JY Nie, JR Wen
arXiv preprint arXiv:2210.13304, 2022
Cited by 18 · 2022
Not all metrics are guilty: Improving NLG evaluation by diversifying references
T Tang, H Lu, YE Jiang, H Huang, D Zhang, WX Zhao, T Kocmi, F Wei
arXiv preprint arXiv:2305.15067, 2023
Cited by 16* · 2023
Learning to imagine: Visually-augmented natural language generation
T Tang, Y Chen, Y Du, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2305.16944, 2023
Cited by 13 · 2023
Zero-shot visual question answering with language model feedback
Y Du, J Li, T Tang, WX Zhao, JR Wen
arXiv preprint arXiv:2305.17006, 2023
Cited by 9 · 2023
ElitePLM: An empirical study on general language ability evaluation of pretrained language models
J Li, T Tang, Z Gong, L Yang, Z Yu, Z Chen, J Wang, WX Zhao, JR Wen
arXiv preprint arXiv:2205.01523, 2022
Cited by 6 · 2022
Towards effective ancient Chinese translation: dataset, model, and evaluation
G Guo, J Yang, F Lu, J Qin, T Tang, WX Zhao
CCF International Conference on Natural Language Processing and Chinese …, 2023
Cited by 5 · 2023