Repairing the cracked foundation: A survey of obstacles in evaluation practices for generated text

S Gehrmann, E Clark, T Sellam - Journal of Artificial Intelligence Research, 2023 - jair.org
Abstract: Evaluation practices in natural language generation (NLG) have many known flaws,
but improved evaluation approaches are rarely widely adopted. This issue has become …

Document-level machine translation with large language models

L Wang, C Lyu, T Ji, Z Zhang, D Yu, S Shi… - arXiv preprint arXiv…, 2023 - arxiv.org
Large language models (LLMs) such as ChatGPT can produce coherent, cohesive, relevant,
and fluent answers for various natural language processing (NLP) tasks. Taking document …

P-transformer: Towards better document-to-document neural machine translation

Y Li, J Li, J Jiang, S Tao, H Yang… - IEEE/ACM Transactions …, 2023 - ieeexplore.ieee.org
Directly training a document-to-document (Doc2Doc) neural machine translation (NMT) model via
Transformer from scratch, especially on small datasets, usually fails to converge. Our …

Delta: An online document-level translation agent based on multi-level memory

Y Wang, J Zeng, X Liu, DF Wong, F Meng… - arXiv preprint arXiv…, 2024 - arxiv.org
Large language models (LLMs) have achieved reasonable quality improvements in
machine translation (MT). However, most current research on MT-LLMs still faces significant …

Enhancing document-level translation of large language model via translation mixed-instructions

Y Li, J Li, J Jiang, M Zhang - arXiv preprint arXiv:2401.08088, 2024 - arxiv.org
Existing large language models (LLMs) for machine translation are typically fine-tuned on
sentence-level translation instructions and achieve satisfactory performance at the sentence …

DeMPT: Decoding-enhanced Multi-phase Prompt Tuning for Making LLMs Be Better Context-aware Translators

X Lyu, J Li, Y Zhao, M Zhang, D Wei, S Tao… - arXiv preprint arXiv…, 2024 - arxiv.org
Generally, decoder-only large language models (LLMs) are adapted to context-aware
neural machine translation (NMT) in a concatenating way, where LLMs take the …

Modeling consistency preference via lexical chains for document-level neural machine translation

X Lyu, J Li, S Tao, H Yang, Y Qin… - Proceedings of the 2022 …, 2022 - aclanthology.org
In this paper we aim to relieve the issue of lexical translation inconsistency for document-
level neural machine translation (NMT) by modeling consistency preference for lexical …

CoDoNMT: Modeling cohesion devices for document-level neural machine translation

Y Lei, Y Ren, D Xiong - … of the 29th International Conference on …, 2022 - aclanthology.org
Cohesion devices, e.g., reiteration and coreference, are crucial for building cohesion links across
sentences. In this paper, we propose a document-level neural machine translation …

Refining History for Future-Aware Neural Machine Translation

X Lyu, J Li, M Zhang, C Ding, H Tanaka… - IEEE/ACM Transactions on …, 2022 - ieeexplore.ieee.org
Neural machine translation uses a decoder to generate target words auto-regressively by
predicting the next target word conditioned on a given source sentence and its previously …

Lexical Translation Inconsistency-Aware Document-Level Translation Repair

Z Zhang, J Li, S Tao, H Yang - Findings of the Association for …, 2023 - aclanthology.org
Following the idea of “one translation per discourse”, in this paper we aim to improve
translation consistency via document-level translation repair (DocRepair), i.e., automatic post …