A survey of knowledge-enhanced text generation

W Yu, C Zhu, Z Li, Z Hu, Q Wang, H Ji… - ACM Computing …, 2022 - dl.acm.org
The goal of text-to-text generation is to make machines express themselves like humans in many
applications such as conversation, summarization, and translation. It is one of the most …

Paraphrase generation: A survey of the state of the art

J Zhou, S Bhat - Proceedings of the 2021 conference on empirical …, 2021 - aclanthology.org
This paper focuses on paraphrase generation, which is a widely studied natural language
generation task in NLP. With the development of neural models, paraphrase generation …

Paraphrasing evades detectors of AI-generated text, but retrieval is an effective defense

K Krishna, Y Song, M Karpinska… - Advances in Neural …, 2023 - proceedings.neurips.cc
The rise in malicious usage of large language models, such as fake content creation and
academic plagiarism, has motivated the development of approaches that identify AI …

A survey on retrieval-augmented text generation

H Li, Y Su, D Cai, Y Wang, L Liu - arXiv preprint arXiv:2202.01110, 2022 - arxiv.org
Recently, retrieval-augmented text generation has attracted increasing attention from the
computational linguistics community. Compared with conventional generation models …

Matching structure for dual learning

H Fei, S Wu, Y Ren, M Zhang - international conference on …, 2022 - proceedings.mlr.press
Many natural language processing (NLP) tasks appear in dual forms, which are generally
solved by dual learning techniques that model the dualities between the coupled tasks. In …

An empirical survey of data augmentation for limited data learning in NLP

J Chen, D Tam, C Raffel, M Bansal… - Transactions of the …, 2023 - direct.mit.edu
NLP has achieved great progress in the past decade through the use of neural models and
large labeled datasets. The dependence on abundant data prevents NLP models from being …

Reformulating unsupervised style transfer as paraphrase generation

K Krishna, J Wieting, M Iyyer - arXiv preprint arXiv:2010.05700, 2020 - arxiv.org
Modern NLP defines the task of style transfer as modifying the style of a given sentence
without appreciably changing its semantics, which implies that the outputs of style transfer …

Compound probabilistic context-free grammars for grammar induction

Y Kim, C Dyer, AM Rush - arXiv preprint arXiv:1906.10225, 2019 - arxiv.org
We study a formalization of the grammar induction problem that models sentences as being
generated by a compound probabilistic context-free grammar. In contrast to traditional …

ChatGPT to replace crowdsourcing of paraphrases for intent classification: Higher diversity and comparable model robustness

J Cegin, J Simko, P Brusilovsky - arXiv preprint arXiv:2305.12947, 2023 - arxiv.org
The emergence of generative large language models (LLMs) raises the question: what will
their impact on crowdsourcing be? Traditionally, crowdsourcing has been used for acquiring …

Tailor: Generating and perturbing text with semantic controls

A Ross, T Wu, H Peng, ME Peters… - arXiv preprint arXiv …, 2021 - arxiv.org
Controlled text perturbation is useful for evaluating and improving model generalizability.
However, current techniques rely on training a model for every target perturbation, which is …