A survey of controllable text generation using transformer-based pre-trained language models

H Zhang, H Song, S Li, M Zhou, D Song - ACM Computing Surveys, 2023 - dl.acm.org
Controllable Text Generation (CTG) is an emerging area in the field of natural language
generation (NLG). It is regarded as crucial for the development of advanced text generation …

Non-autoregressive machine translation with disentangled context transformer

J Kasai, J Cross, M Ghazvininejad… - … conference on machine …, 2020 - proceedings.mlr.press
State-of-the-art neural machine translation models generate a translation from left to right
and every step is conditioned on the previously generated tokens. The sequential nature of …

Improving non-autoregressive translation models without distillation

XS Huang, F Perez, M Volkovs - International Conference on …, 2022 - openreview.net
Transformer-based autoregressive (AR) machine translation models have achieved
significant performance improvements, nearing human-level accuracy on some languages …

Guiding non-autoregressive neural machine translation decoding with reordering information

Q Ran, Y Lin, P Li, J Zhou - Proceedings of the AAAI Conference on …, 2021 - ojs.aaai.org
Non-autoregressive neural machine translation (NAT) generates each target word in parallel
and has achieved promising inference acceleration. However, existing NAT models still …

Transformers go for the LOLs: Generating (humourous) titles from scientific abstracts end-to-end

Y Chen, S Eger - arXiv preprint arXiv:2212.10522, 2022 - arxiv.org
We consider the end-to-end abstract-to-title generation problem, exploring seven recent
transformer-based models (including ChatGPT) fine-tuned on more than 30k abstract-title …