Survey of hallucination in natural language generation
Natural Language Generation (NLG) has improved exponentially in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …
Faithfulness in natural language generation: A systematic survey of analysis, evaluation and optimization methods
Natural Language Generation (NLG) has made great progress in recent years due to the
development of deep learning techniques such as pre-trained language models. This …
Towards improving faithfulness in abstractive summarization
Despite the success achieved in neural abstractive summarization based on pre-trained
language models, one unresolved issue is that the generated summaries are not always …
Toward human-like evaluation for natural language generation with error analysis
The state-of-the-art language model-based automatic metrics, e.g., BARTScore, benefiting
from large-scale contextualized pre-training, have been successfully used in a wide range of …
Prevent the language model from being overconfident in neural machine translation
The Neural Machine Translation (NMT) model is essentially a joint language model
conditioned on both the source sentence and partial translation. Therefore, the NMT model …
Attention calibration for transformer in neural machine translation
Attention mechanisms have achieved substantial improvements in neural machine
translation by dynamically selecting relevant inputs for different predictions. However, recent …
Turning fixed to adaptive: Integrating post-evaluation into simultaneous machine translation
Simultaneous machine translation (SiMT) starts its translation before reading the whole
source sentence and employs either fixed or adaptive policy to generate the target sentence …
Bilingual attention based neural machine translation
In recent years, Recurrent Neural Network-based Neural Machine Translation (RNN-
based NMT) equipped with an attention mechanism from the decoder to the encoder, has …
Editing common sense in transformers
Editing model parameters directly in Transformers makes updating open-source transformer-
based models possible without re-training (Meng et al., 2023). However, these editing …
Anah-v2: Scaling analytical hallucination annotation of large language models
Large language models (LLMs) exhibit hallucinations in long-form question-answering tasks
across various domains and wide applications. Current hallucination detection and …