Longyue Wang
Alibaba Group
Verified email at alibaba-inc.com - Homepage
Title
Cited by
Year
Siren's song in the AI ocean: a survey on hallucination in large language models
Y Zhang, Y Li, L Cui, D Cai, L Liu, T Fu, X Huang, E Zhao, Y Zhang, ...
arXiv preprint arXiv:2309.01219, 2023
981 · 2023
Exploiting Cross-Sentence Context for Neural Machine Translation
Longyue Wang, Zhaopeng Tu, Andy Way, Qun Liu
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
240* · 2017
Document-Level Machine Translation with Large Language Models
Longyue Wang, Chenyang Lyu, Tianbo Ji, Zhirui Zhang, Dian Yu, Shuming Shi, Zhaopeng Tu
Proceedings of the 2023 Conference on Empirical Methods in Natural Language …, 2023
175* · 2023
Macaw-llm: Multi-modal language modeling with image, audio, video, and text integration
C Lyu, M Wu, L Wang, X Huang, B Liu, Z Du, S Shi, Z Tu
arXiv preprint arXiv:2306.09093, 2023
161 · 2023
Convolutional Self-Attention Networks
Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, Zhaopeng Tu
Proceedings of the 2019 Conference of the North American Chapter of the …, 2019
151* · 2019
UM-Corpus: A Large English-Chinese Parallel Corpus for Statistical Machine Translation.
L Tian, DF Wong, LS Chao, P Quaresma, F Oliveira, L Yi, S Li, Y Wang, ...
LREC, 1837-1842, 2014
140 · 2014
Understanding and Improving Lexical Choice in Non-Autoregressive Translation
Liang Ding, Longyue Wang, Xuebo Liu, Derek F. Wong, Dacheng Tao, Zhaopeng Tu
ICLR 2021, 2021
119* · 2021
Modeling recurrence for transformer
J Hao, X Wang, B Yang, L Wang, J Zhang, Z Tu
arXiv preprint arXiv:1904.03092, 2019
92 · 2019
Self-attention with structural position representations
X Wang, Z Tu, L Wang, S Shi
arXiv preprint arXiv:1909.00383, 2019
81 · 2019
Self-attention with cross-lingual position representation
L Ding, L Wang, D Tao
arXiv preprint arXiv:2004.13310, 2020
74 · 2020
New trends in machine translation using large language models: Case examples with chatgpt
C Lyu, J Xu, L Wang
arXiv preprint arXiv:2305.01181, 2023
70 · 2023
Context-aware cross-attention for non-autoregressive translation
L Ding, L Wang, D Wu, D Tao, Z Tu
arXiv preprint arXiv:2011.00770, 2020
69 · 2020
Rejuvenating low-frequency words: Making the most of parallel data in non-autoregressive translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
arXiv preprint arXiv:2106.00903, 2021
64 · 2021
Redistributing low-frequency words: Making the most of monolingual data in non-autoregressive translation
L Ding, L Wang, S Shi, D Tao, Z Tu
Proceedings of the 60th Annual Meeting of the Association for Computational …, 2022
60 · 2022
Dynamic layer aggregation for neural machine translation with routing-by-agreement
ZY Dou, Z Tu, X Wang, L Wang, S Shi, T Zhang
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 86-93, 2019
55 · 2019
Progressive multi-granularity training for non-autoregressive translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
arXiv preprint arXiv:2106.05546, 2021
52 · 2021
Context-aware self-attention networks for natural language processing
B Yang, L Wang, DF Wong, S Shi, Z Tu
Neurocomputing 458, 157-169, 2021
50 · 2021
Translating pro-drop languages with reconstruction models
L Wang, Z Tu, S Shi, T Zhang, Y Graham, Q Liu
Proceedings of the AAAI Conference on Artificial Intelligence 32 (1), 2018
50 · 2018
Deepfake text detection in the wild
Y Li, Q Li, L Cui, W Bi, L Wang, L Yang, S Shi, Y Zhang
arXiv preprint arXiv:2305.13242, 2023
49 · 2023
Towards understanding neural machine translation with word importance
S He, Z Tu, X Wang, L Wang, MR Lyu, S Shi
arXiv preprint arXiv:1909.00326, 2019
48 · 2019
Articles 1–20