Longyue Wang
Alibaba Group
Verified email at alibaba-inc.com - Homepage
Title
Cited by
Year
Siren's song in the AI ocean: a survey on hallucination in large language models
Y Zhang, Y Li, L Cui, D Cai, L Liu, T Fu, X Huang, E Zhao, Y Zhang, ...
arXiv preprint arXiv:2309.01219, 2023
1006 · 2023
Exploiting Cross-Sentence Context for Neural Machine Translation
Longyue Wang, Zhaopeng Tu, Andy Way, Q Liu
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
238* · 2017
Document-Level Machine Translation with Large Language Models
Longyue Wang, Chenyang Lyu, Tianbo Ji, Zhirui Zhang, Dian Yu, Shuming Shi, Z Tu
Proceedings of the 2023 Conference on Empirical Methods in Natural Language …, 2023
178* · 2023
Macaw-llm: Multi-modal language modeling with image, audio, video, and text integration
C Lyu, M Wu, L Wang, X Huang, B Liu, Z Du, S Shi, Z Tu
arXiv preprint arXiv:2306.09093, 2023
165 · 2023
Convolutional Self-Attention Networks
Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, Z Tu
Proceedings of the 2019 Conference of the North American Chapter of the …, 2019
151* · 2019
UM-Corpus: A Large English-Chinese Parallel Corpus for Statistical Machine Translation.
L Tian, DF Wong, LS Chao, P Quaresma, F Oliveira, L Yi, S Li, Y Wang, ...
LREC, 1837-1842, 2014
141 · 2014
Understanding and Improving Lexical Choice in Non-Autoregressive Translation
Liang Ding, Longyue Wang, Xuebo Liu, Derek F Wong, Dacheng Tao, Z Tu
ICLR 2021, 2021
119* · 2021
Modeling recurrence for transformer
J Hao, X Wang, B Yang, L Wang, J Zhang, Z Tu
arXiv preprint arXiv:1904.03092, 2019
93 · 2019
Self-attention with structural position representations
X Wang, Z Tu, L Wang, S Shi
arXiv preprint arXiv:1909.00383, 2019
81 · 2019
Self-attention with cross-lingual position representation
L Ding, L Wang, D Tao
arXiv preprint arXiv:2004.13310, 2020
76 · 2020
New trends in machine translation using large language models: Case examples with chatgpt
C Lyu, J Xu, L Wang
arXiv preprint arXiv:2305.01181, 2023
72 · 2023
Context-aware cross-attention for non-autoregressive translation
L Ding, L Wang, D Wu, D Tao, Z Tu
arXiv preprint arXiv:2011.00770, 2020
71 · 2020
Rejuvenating low-frequency words: Making the most of parallel data in non-autoregressive translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
arXiv preprint arXiv:2106.00903, 2021
64 · 2021
Redistributing low-frequency words: Making the most of monolingual data in non-autoregressive translation
L Ding, L Wang, S Shi, D Tao, Z Tu
Proceedings of the 60th Annual Meeting of the Association for Computational …, 2022
60 · 2022
Progressive multi-granularity training for non-autoregressive translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
arXiv preprint arXiv:2106.05546, 2021
55 · 2021
Dynamic layer aggregation for neural machine translation with routing-by-agreement
ZY Dou, Z Tu, X Wang, L Wang, S Shi, T Zhang
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 86-93, 2019
55 · 2019
Deepfake text detection in the wild
Y Li, Q Li, L Cui, W Bi, L Wang, L Yang, S Shi, Y Zhang
arXiv preprint arXiv:2305.13242, 2023
50 · 2023
Context-aware self-attention networks for natural language processing
B Yang, L Wang, DF Wong, S Shi, Z Tu
Neurocomputing 458, 157-169, 2021
50 · 2021
Translating pro-drop languages with reconstruction models
L Wang, Z Tu, S Shi, T Zhang, Y Graham, Q Liu
Proceedings of the AAAI Conference on Artificial Intelligence 32 (1), 2018
50 · 2018
Large language models meet harry potter: A dataset for aligning dialogue agents with characters
N Chen, Y Wang, H Jiang, D Cai, Y Li, Z Chen, L Wang, J Li
Findings of the Association for Computational Linguistics: EMNLP 2023, 8506-8520, 2023
48* · 2023
Articles 1–20