Junyi Li
Verified email at umontreal.ca
Title
Cited by
Year
A survey of large language models
WX Zhao, K Zhou, J Li, T Tang, X Wang, Y Hou, Y Min, B Zhang, J Zhang, ...
arXiv preprint arXiv:2303.18223, 2023
4266*	2023
Pre-trained language models for text generation: A survey
J Li, T Tang, WX Zhao, JY Nie, JR Wen
ACM Computing Surveys 56 (9), 1-39, 2024
447	2024
HaluEval: A large-scale hallucination evaluation benchmark for large language models
J Li, X Cheng, WX Zhao, JY Nie, JR Wen
Proceedings of the 2023 Conference on Empirical Methods in Natural Language …, 2023
376	2023
A survey of vision-language pre-trained models
Y Du, Z Liu, J Li, WX Zhao
arXiv preprint arXiv:2202.10936, 2022
227	2022
WenLan: Bridging vision and language by large-scale multi-modal pre-training
Y Huo, M Zhang, G Liu, H Lu, Y Gao, G Yang, J Wen, H Zhang, B Xu, ...
arXiv preprint arXiv:2103.06561, 2021
146	2021
The dawn after the dark: An empirical study on factuality hallucination in large language models
J Li, J Chen, R Ren, X Cheng, WX Zhao, JY Nie, JR Wen
arXiv preprint arXiv:2401.03205, 2024
65	2024
A survey on long text modeling with transformers
Z Dong, T Tang, L Li, WX Zhao
arXiv preprint arXiv:2302.14502, 2023
58	2023
Few-shot knowledge graph-to-text generation with pretrained language models
J Li, T Tang, WX Zhao, Z Wei, NJ Yuan, JR Wen
Findings of The 59th Annual Meeting of the Association for Computational …, 2021
56	2021
BAMBOO: A comprehensive benchmark for evaluating long text modeling capacities of large language models
Z Dong, T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2309.13345, 2023
51	2023
Mining implicit entity preference from user-item interaction data for knowledge graph completion via adversarial learning
G He, J Li, WX Zhao, P Liu, JR Wen
Proceedings of the web conference 2020, 740-751, 2020
47	2020
Knowledge-enhanced personalized review generation with capsule graph neural network
J Li, S Li, WX Zhao, G He, Z Wei, NJ Yuan, JR Wen
Proceedings of the 29th ACM International Conference on Information …, 2020
41	2020
TextBox 2.0: A text generation library with pre-trained language models
T Tang, J Li, Z Chen, Y Hu, Z Yu, W Dai, Z Dong, X Cheng, Y Wang, ...
arXiv preprint arXiv:2212.13005, 2022
40*	2022
Learning to Transfer Prompts for Text Generation
J Li, T Tang, JY Nie, JR Wen, WX Zhao
NAACL 2022, 2022
39	2022
Generating long and informative reviews with aspect-aware coarse-to-fine decoding
J Li, WX Zhao, JR Wen, Y Song
The 57th Annual Meeting of the Association for Computational Linguistics (ACL), 2019
39	2019
MVP: Multi-task supervised pre-training for natural language generation
T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2206.12131, 2022
38	2022
Context-tuning: Learning contextualized prompts for natural language generation
T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2201.08670, 2022
30	2022
Knowledge-based review generation by coherence enhanced text planning
J Li, WX Zhao, Z Wei, NJ Yuan, JR Wen
The 44th International ACM SIGIR Conference on Research and Development in …, 2021
26	2021
The Web Can Be Your Oyster for Improving Large Language Models
J Li, T Tang, WX Zhao, J Wang, JY Nie, JR Wen
arXiv preprint arXiv:2305.10998, 2023
17*	2023
ELMER: A non-autoregressive pre-trained language model for efficient and effective text generation
J Li, T Tang, WX Zhao, JY Nie, JR Wen
arXiv preprint arXiv:2210.13304, 2022
17	2022
REAR: A Relevance-Aware Retrieval-Augmented Framework for Open-Domain Question Answering
Y Wang, R Ren, J Li, WX Zhao, J Liu, JR Wen
arXiv preprint arXiv:2402.17497, 2024
14	2024
Articles 1–20