MUSS: Multilingual unsupervised sentence simplification by mining paraphrases
Progress in sentence simplification has been hindered by a lack of labeled parallel
simplification data, particularly in languages other than English. We introduce MUSS, a …
Quality controlled paraphrase generation
Paraphrase generation has been widely used in various downstream tasks. Most tasks
benefit mainly from high-quality paraphrases, namely those that are semantically similar to …
Unsupervised text generation by learning from search
In this work, we propose TGLS, a novel framework for unsupervised Text Generation by
Learning from Search. We start by applying a strong search algorithm (in particular …
Generating sequences by learning to self-correct
Sequence generation applications require satisfying semantic constraints, such as ensuring
that programs are correct, using certain keywords, or avoiding undesirable content …
Iterative edit-based unsupervised sentence simplification
We present a novel iterative, edit-based approach to unsupervised sentence simplification.
Our model is guided by a scoring function involving fluency, simplicity, and meaning …
On the evaluation metrics for paraphrase generation
In this paper we revisit automatic metrics for paraphrase evaluation and obtain two findings
that disobey conventional wisdom: (1) Reference-free metrics achieve better performance …
Novelty controlled paraphrase generation with retrieval augmented conditional prompt tuning
Paraphrase generation is a fundamental and long-standing task in natural language
processing. In this paper, we concentrate on two contributions to the task: (1) we propose …
Teacher forcing recovers reward functions for text generation
Reinforcement learning (RL) has been widely used in text generation to alleviate the
exposure bias issue or to utilize non-parallel datasets. The reward function plays an …
Prompt-based editing for text style transfer
Prompting approaches have been recently explored in text style transfer, where a textual
prompt is used to query a pretrained language model to generate style-transferred texts …
Gradient-guided unsupervised lexically constrained text generation
L Sha - Proceedings of the 2020 Conference on Empirical …, 2020 - aclanthology.org
Lexically constrained generation requires the target sentence to satisfy some lexical
constraints, such as containing some specific words or being the paraphrase to a given …