Compound probabilistic context-free grammars for grammar induction

Y Kim, C Dyer, AM Rush - arXiv preprint arXiv:1906.10225, 2019 - arxiv.org
We study a formalization of the grammar induction problem that models sentences as being
generated by a compound probabilistic context-free grammar. In contrast to traditional …

Controllable paraphrase generation with a syntactic exemplar

M Chen, Q Tang, S Wiseman, K Gimpel - arXiv preprint arXiv:1906.00565, 2019 - arxiv.org
Prior work on controllable text generation usually assumes that the controlled attribute can
take on one of a small set of values known a priori. In this work, we propose a novel task …

Neural syntactic preordering for controlled paraphrase generation

T Goyal, G Durrett - arXiv preprint arXiv:2005.02013, 2020 - arxiv.org
Paraphrasing natural language sentences is a multifaceted process: it might involve
replacing individual words or short phrases, local rearrangement of content, or high-level …

Syntax-guided controlled generation of paraphrases

A Kumar, K Ahuja, R Vadapalli… - Transactions of the …, 2020 - direct.mit.edu
Given a sentence (e.g., “I like mangoes”) and a constraint (e.g., sentiment flip), the goal of
controlled text generation is to produce a sentence that adapts the input sentence to meet …

Generating syntactically controlled paraphrases without using annotated parallel pairs

KH Huang, KW Chang - arXiv preprint arXiv:2101.10579, 2021 - arxiv.org
Paraphrase generation plays an essential role in natural language processing (NLP), and it has
many downstream applications. However, training supervised paraphrase models requires …

AESOP: Paraphrase generation with adaptive syntactic control

J Sun, X Ma, N Peng - Proceedings of the 2021 conference on …, 2021 - aclanthology.org
We propose to control paraphrase generation through carefully chosen target syntactic
structures to generate more proper and higher quality paraphrases. Our model, AESOP …

Evaluating large language models on controlled generation tasks

J Sun, Y Tian, W Zhou, N Xu, Q Hu, R Gupta… - arXiv preprint arXiv …, 2023 - arxiv.org
While recent studies have looked into the abilities of large language models in various
benchmark tasks, including question generation, reading comprehension, multilingual and …

Hierarchical sketch induction for paraphrase generation

T Hosking, H Tang, M Lapata - arXiv preprint arXiv:2203.03463, 2022 - arxiv.org
We propose a generative model of paraphrase generation that encourages syntactic
diversity by conditioning on an explicit syntactic sketch. We introduce Hierarchical …

PAIR: Planning and iterative refinement in pre-trained transformers for long text generation

X Hua, L Wang - arXiv preprint arXiv:2010.02301, 2020 - arxiv.org
Pre-trained Transformers have enabled impressive breakthroughs in generating long and
fluent text, yet their outputs are often “rambling” without coherently arranged content. In this …

Are personalized stochastic parrots more dangerous? Evaluating persona biases in dialogue systems

Y Wan, J Zhao, A Chadha, N Peng… - arXiv preprint arXiv …, 2023 - arxiv.org
Recent advancements in Large Language Models empower them to follow freeform
instructions, including imitating generic or specific demographic personas in conversations …