A contrastive framework for neural text generation
Text generation is of great importance to many natural language processing applications.
However, maximization-based decoding methods (e.g., beam search) of neural language …
Multi-task pre-training for plug-and-play task-oriented dialogue system
Pre-trained language models have recently been shown to benefit task-oriented dialogue
(TOD) systems. Despite their success, existing methods often formulate this task as a …
A survey on retrieval-augmented text generation
Recently, retrieval-augmented text generation has attracted increasing attention from the
computational linguistics community. Compared with conventional generation models …
Language models can see: Plugging visual controls in text generation
Generative language models (LMs) such as GPT-2/3 can be prompted to generate text with
remarkable quality. While they are designed for text-prompted generation, it remains an …
A survey on neural data-to-text generation
Data-to-text Generation (D2T) aims to generate textual natural language statements that can
fluently and precisely describe structured data such as graphs, tables, and meaning …
Plan-then-generate: Controlled data-to-text generation via planning
Recent developments in neural networks have led to advances in data-to-text generation.
However, the lack of ability of neural models to control the structure of generated output can …
Retrieving multimodal information for augmented generation: A survey
As Large Language Models (LLMs) become popular, an important trend has emerged of
using multimodality to augment LLMs' generation ability, which enables LLMs to better …
TaCL: Improving BERT pre-training with token-aware contrastive learning
Masked language models (MLMs) such as BERT and RoBERTa have revolutionized the
field of Natural Language Understanding in the past few years. However, existing pre …
BioReader: a retrieval-enhanced text-to-text transformer for biomedical literature
A recent line of research has equipped language models with the ability to attend over
relevant and factual information from non-parametric external sources, drawing a …
Neural pipeline for zero-shot data-to-text generation
In data-to-text (D2T) generation, training on in-domain data leads to overfitting to the data
representation and repeating training data noise. We examine how to avoid finetuning …