Is ChatGPT a good translator? Yes with GPT-4 as the engine

W Jiao, W Wang, J Huang, X Wang, S Shi… - arXiv preprint arXiv …, 2023 - arxiv.org
This report provides a preliminary evaluation of ChatGPT for machine translation, including
translation prompt, multilingual translation, and translation robustness. We adopt the …

Transformer: A general framework from machine translation to others

Y Zhao, J Zhang, C Zong - Machine Intelligence Research, 2023 - Springer
Machine translation is an important and challenging task that aims at automatically
translating natural language sentences from one language into another. Recently …

Exploring human-like translation strategy with large language models

Z He, T Liang, W Jiao, Z Zhang, Y Yang… - Transactions of the …, 2024 - direct.mit.edu
Large language models (LLMs) have demonstrated impressive capabilities in general
scenarios, exhibiting a level of aptitude that approaches, in some aspects even surpasses …

ChatGPT or Grammarly? Evaluating ChatGPT on grammatical error correction benchmark

H Wu, W Wang, Y Wan, W Jiao, M Lyu - arXiv preprint arXiv:2303.13648, 2023 - arxiv.org
ChatGPT is a cutting-edge artificial intelligence language model developed by OpenAI,
which has attracted a lot of attention due to its surprisingly strong ability in answering follow …

Improving the transferability of adversarial samples by path-augmented method

J Zhang, J Huang, W Wang, Y Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Deep neural networks have achieved unprecedented success on diverse vision tasks.
However, they are vulnerable to adversarial noise that is imperceptible to humans. This …

Transferable adversarial attacks on vision transformers with token gradient regularization

J Zhang, Y Huang, W Wu… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Vision transformers (ViTs) have been successfully deployed in a variety of computer vision
tasks, but they are still vulnerable to adversarial samples. Transfer-based attacks use a local …

Token-level self-evolution training for sequence-to-sequence learning

K Peng, L Ding, Q Zhong, Y Ouyang… - Proceedings of the …, 2023 - aclanthology.org
Adaptive training approaches, widely used in sequence-to-sequence models, commonly
re-weight the losses of different target tokens based on priors, e.g., word frequency. However …

Redistributing low-frequency words: Making the most of monolingual data in non-autoregressive translation

L Ding, L Wang, S Shi, D Tao, Z Tu - … of the 60th Annual Meeting of …, 2022 - aclanthology.org
Knowledge distillation (KD) is the preliminary step for training non-autoregressive
translation (NAT) models, which eases the training of NAT models at the cost of losing …

On the complementarity between pre-training and random-initialization for resource-rich machine translation

C Zan, L Ding, L Shen, Y Cao, W Liu, D Tao - arXiv preprint arXiv …, 2022 - arxiv.org
Pre-Training (PT) of text representations has been successfully applied to low-resource
Neural Machine Translation (NMT). However, it usually fails to achieve notable gains …