Survey of hallucination in natural language generation
Natural Language Generation (NLG) has improved exponentially in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …
Recent advances in deep learning based dialogue systems: A systematic survey
Dialogue systems are a popular natural language processing (NLP) task, as they are promising in
real-life applications. It is also a complicated task since many NLP tasks deserving study are …
Progprompt: Generating situated robot task plans using large language models
Task planning can require defining myriad domain knowledge about the world in which a
robot needs to act. To ameliorate that effort, large language models (LLMs) can be used to …
Evidence of a predictive coding hierarchy in the human brain listening to speech
Considerable progress has recently been made in natural language processing: deep
learning algorithms are increasingly able to generate, summarize, translate and classify …
A survey of data augmentation approaches for NLP
Data augmentation has recently seen increased interest in NLP due to more work in low-
resource domains, new tasks, and the popularity of large-scale neural networks that require …
Factuality enhanced language models for open-ended text generation
Pretrained language models (LMs) are susceptible to generating text with nonfactual
information. In this work, we measure and improve the factual accuracy of large-scale LMs …
A contrastive framework for neural text generation
Text generation is of great importance to many natural language processing applications.
However, maximization-based decoding methods (e.g., beam search) of neural language …
Big bird: Transformers for longer sequences
Transformers-based models, such as BERT, have been one of the most successful deep
learning models for NLP. Unfortunately, one of their core limitations is the quadratic …
BLEURT: Learning robust metrics for text generation
Text generation has made significant advances in the last few years. Yet, evaluation metrics
have lagged behind, as the most popular choices (e.g., BLEU and ROUGE) may correlate …
On faithfulness and factuality in abstractive summarization
It is well known that the standard likelihood training and approximate decoding objectives in
neural text generation models lead to less human-like responses for open-ended tasks such …