Survey of hallucination in natural language generation

Z Ji, N Lee, R Frieske, T Yu, D Su, Y Xu, E Ishii… - ACM Computing …, 2023 - dl.acm.org
Natural Language Generation (NLG) has advanced rapidly in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …

Faithfulness in natural language generation: A systematic survey of analysis, evaluation and optimization methods

W Li, W Wu, M Chen, J Liu, X Xiao, H Wu - arXiv preprint arXiv:2203.05227, 2022 - arxiv.org
Natural Language Generation (NLG) has made great progress in recent years due to the
development of deep learning techniques such as pre-trained language models. This …

Towards improving faithfulness in abstractive summarization

X Chen, M Li, X Gao, X Zhang - Advances in Neural …, 2022 - proceedings.neurips.cc
Despite the success achieved in neural abstractive summarization based on pre-trained
language models, one unresolved issue is that the generated summaries are not always …

Toward human-like evaluation for natural language generation with error analysis

Q Lu, L Ding, L Xie, K Zhang, DF Wong… - arXiv preprint arXiv …, 2022 - arxiv.org
The state-of-the-art language model-based automatic metrics, e.g., BARTScore, benefiting
from large-scale contextualized pre-training, have been successfully used in a wide range of …

Prevent the language model from being overconfident in neural machine translation

M Miao, F Meng, Y Liu, XH Zhou, J Zhou - arXiv preprint arXiv:2105.11098, 2021 - arxiv.org
The Neural Machine Translation (NMT) model is essentially a joint language model
conditioned on both the source sentence and the partial translation. Therefore, the NMT model …

Attention calibration for transformer in neural machine translation

Y Lu, J Zeng, J Zhang, S Wu, M Li - … of the 59th Annual Meeting of …, 2021 - aclanthology.org
Attention mechanisms have achieved substantial improvements in neural machine
translation by dynamically selecting relevant inputs for different predictions. However, recent …

Turning fixed to adaptive: Integrating post-evaluation into simultaneous machine translation

S Guo, S Zhang, Y Feng - arXiv preprint arXiv:2210.11900, 2022 - arxiv.org
Simultaneous machine translation (SiMT) starts translating before reading the whole
source sentence and employs either a fixed or an adaptive policy to generate the target sentence …

Bilingual attention based neural machine translation

L Kang, S He, M Wang, F Long, J Su - Applied Intelligence, 2023 - Springer
In recent years, Recurrent Neural Network-based Neural Machine Translation (RNN-based
NMT), equipped with an attention mechanism from the decoder to the encoder, has …

Editing common sense in transformers

A Gupta, D Mondal, AK Sheshadri, W Zhao… - arXiv preprint arXiv …, 2023 - arxiv.org
Editing model parameters directly in Transformers makes it possible to update open-source
transformer-based models without re-training (Meng et al., 2023). However, these editing …

Anah-v2: Scaling analytical hallucination annotation of large language models

Y Gu, Z Ji, W Zhang, C Lyu, D Lin, K Chen - arXiv preprint arXiv …, 2024 - arxiv.org
Large language models (LLMs) exhibit hallucinations in long-form question-answering tasks
across various domains and a wide range of applications. Current hallucination detection and …