Exploring human-like translation strategy with large language models

Z He, T Liang, W Jiao, Z Zhang, Y Yang… - Transactions of the …, 2024 - direct.mit.edu
Large language models (LLMs) have demonstrated impressive capabilities in general
scenarios, exhibiting a level of aptitude that approaches, and in some aspects even surpasses …

New trends in machine translation using large language models: Case examples with ChatGPT

C Lyu, J Xu, L Wang - arXiv preprint arXiv:2305.01181, 2023 - longyuewang.com
Machine Translation (MT) has made significant progress in recent years using deep
learning, especially after the emergence of large language models (LLMs) such as GPT-3 …

Can watermarks survive translation? on the cross-lingual consistency of text watermark for large language models

Z He, B Zhou, H Hao, A Liu, X Wang, Z Tu… - arXiv preprint arXiv …, 2024 - arxiv.org
Text watermarking technology aims to tag and identify content produced by large language
models (LLMs) to prevent misuse. In this study, we introduce the concept of cross-lingual …

Improving machine translation with human feedback: An exploration of quality estimation as a reward model

Z He, X Wang, W Jiao, Z Zhang, R Wang, S Shi… - arXiv preprint arXiv …, 2024 - arxiv.org
Insufficient modeling of human preferences within the reward model is a major obstacle for
leveraging human feedback to improve translation quality. Fortunately, quality estimation …

VisTFC: Vision-guided target-side future context learning for neural machine translation

S Zhu, S Li, D Xiong - Expert Systems with Applications, 2024 - Elsevier
Visual features encompass visual information extracted from images or videos, serving as
supplementary input to enhance the efficacy of neural machine translation (NMT) systems …

Unsupervised multilingual machine translation with pretrained cross-lingual encoders

Y Shen, W Bao, G Gao, M Zhou, X Zhao - Knowledge-Based Systems, 2024 - Elsevier
Multilingual Neural Machine Translation (MNMT) has recently made great progress
in training models that can translate between multiple languages. However, MNMT faces a …

A theory of unsupervised translation motivated by understanding animal communication

S Goldwasser, D Gruber, AT Kalai… - Advances in Neural …, 2023 - proceedings.neurips.cc
Neural networks are capable of translating between languages—in some cases even
between two languages where there is little or no access to parallel translations, in what is …

Monolingual denoising with large language models for low-resource machine translation

H Xu, X Wang, X Xing, Y Hong - CCF International Conference on Natural …, 2023 - Springer
Low-resource machine translation struggles with the issue of bilingual data sparsity. Self-
training-based bilingual data augmentation is potentially useful for overcoming this issue …

A note on bias to complete

J Xu, M Diab - arXiv preprint arXiv:2402.11710, 2024 - arxiv.org
Minimizing social bias strengthens societal bonds, promoting shared understanding and
better decision-making. We revisit the definition of bias by discovering new bias types (e.g. …

Unsupervised Machine Translation Based on Dynamic Adaptive Masking Strategy and Multi-Task Learning

C Zhang, D Qu, L Du, K Yang - … of the International Conference on Image …, 2024 - dl.acm.org
This study proposes an unsupervised machine translation method based on a dynamic
adaptive masking strategy and multi-task learning. Firstly, a dynamic adaptive masking …