Survey of hallucination in natural language generation

Z Ji, N Lee, R Frieske, T Yu, D Su, Y Xu, E Ishii… - ACM Computing …, 2023 - dl.acm.org
Natural Language Generation (NLG) has improved exponentially in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …

Recent advances in deep learning based dialogue systems: A systematic survey

J Ni, T Young, V Pandelea, F Xue… - Artificial intelligence review, 2023 - Springer
Dialogue systems are a popular natural language processing (NLP) task, as they are promising for
real-life applications. They are also complicated, since many NLP tasks deserving study are …

Progprompt: Generating situated robot task plans using large language models

I Singh, V Blukis, A Mousavian, A Goyal… - … on Robotics and …, 2023 - ieeexplore.ieee.org
Task planning can require defining myriad domain knowledge about the world in which a
robot needs to act. To ameliorate that effort, large language models (LLMs) can be used to …

Evidence of a predictive coding hierarchy in the human brain listening to speech

C Caucheteux, A Gramfort, JR King - Nature human behaviour, 2023 - nature.com
Considerable progress has recently been made in natural language processing: deep
learning algorithms are increasingly able to generate, summarize, translate and classify …

A survey of data augmentation approaches for NLP

SY Feng, V Gangal, J Wei, S Chandar… - arXiv preprint arXiv …, 2021 - arxiv.org
Data augmentation has recently seen increased interest in NLP due to more work in low-
resource domains, new tasks, and the popularity of large-scale neural networks that require …

Factuality enhanced language models for open-ended text generation

N Lee, W Ping, P Xu, M Patwary… - Advances in …, 2022 - proceedings.neurips.cc
Pretrained language models (LMs) are susceptible to generating text with nonfactual
information. In this work, we measure and improve the factual accuracy of large-scale LMs …

A contrastive framework for neural text generation

Y Su, T Lan, Y Wang, D Yogatama… - Advances in Neural …, 2022 - proceedings.neurips.cc
Text generation is of great importance to many natural language processing applications.
However, maximization-based decoding methods (e.g., beam search) of neural language …

Big bird: Transformers for longer sequences

M Zaheer, G Guruganesh, KA Dubey… - Advances in neural …, 2020 - proceedings.neurips.cc
Transformer-based models, such as BERT, have been among the most successful deep
learning models for NLP. Unfortunately, one of their core limitations is the quadratic …

BLEURT: Learning robust metrics for text generation

T Sellam, D Das, AP Parikh - arXiv preprint arXiv:2004.04696, 2020 - arxiv.org
Text generation has made significant advances in the last few years. Yet, evaluation metrics
have lagged behind, as the most popular choices (e.g., BLEU and ROUGE) may correlate …

On faithfulness and factuality in abstractive summarization

J Maynez, S Narayan, B Bohnet… - arXiv preprint arXiv …, 2020 - arxiv.org
It is well known that the standard likelihood training and approximate decoding objectives in
neural text generation models lead to less human-like responses for open-ended tasks such …