Compound probabilistic context-free grammars for grammar induction
We study a formalization of the grammar induction problem that models sentences as being
generated by a compound probabilistic context-free grammar. In contrast to traditional …
Controllable paraphrase generation with a syntactic exemplar
Prior work on controllable text generation usually assumes that the controlled attribute can
take on one of a small set of values known a priori. In this work, we propose a novel task …
Neural syntactic preordering for controlled paraphrase generation
Paraphrasing natural language sentences is a multifaceted process: it might involve
replacing individual words or short phrases, local rearrangement of content, or high-level …
Syntax-guided controlled generation of paraphrases
Given a sentence (e.g., "I like mangoes") and a constraint (e.g., sentiment flip), the goal of
controlled text generation is to produce a sentence that adapts the input sentence to meet …
Generating syntactically controlled paraphrases without using annotated parallel pairs
Paraphrase generation plays an essential role in natural language processing (NLP), and it has
many downstream applications. However, training supervised paraphrase models requires …
AESOP: Paraphrase generation with adaptive syntactic control
We propose to control paraphrase generation through carefully chosen target syntactic
structures to generate more proper and higher quality paraphrases. Our model, AESOP …
Evaluating large language models on controlled generation tasks
While recent studies have looked into the abilities of large language models in various
benchmark tasks, including question generation, reading comprehension, multilingual and …
Hierarchical sketch induction for paraphrase generation
We propose a generative model of paraphrase generation, that encourages syntactic
diversity by conditioning on an explicit syntactic sketch. We introduce Hierarchical …
PAIR: Planning and iterative refinement in pre-trained transformers for long text generation
Pre-trained Transformers have enabled impressive breakthroughs in generating long and
fluent text, yet their outputs are often "rambling" without coherently arranged content. In this …
Are personalized stochastic parrots more dangerous? Evaluating persona biases in dialogue systems
Recent advancements in Large Language Models empower them to follow freeform
instructions, including imitating generic or specific demographic personas in conversations …