Pre-trained language models for text generation: A survey

J Li, T Tang, WX Zhao, JY Nie, JR Wen - ACM Computing Surveys, 2024 - dl.acm.org
Text Generation aims to produce plausible and readable text in human language from input
data. The resurgence of deep learning has greatly advanced this field, in particular, with the …
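As a concrete starting point for this entry's topic, below is a minimal sketch of producing text from a pre-trained language model. It assumes the Hugging Face transformers library and GPT-2 as the example checkpoint; neither the model choice, the prompt, nor the sampling settings come from the survey itself.

```python
# Minimal sketch: text generation with a pre-trained LM (assumed: transformers + GPT-2).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Illustrative prompt; the survey covers many input types (text, tables, graphs, ...).
inputs = tokenizer("Text generation aims to", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,          # nucleus sampling, one of several decoding strategies
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```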

A survey on non-autoregressive generation for neural machine translation and beyond

Y Xiao, L Wu, J Guo, J Li, M Zhang, T Qin, T Liu - IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023 - ieeexplore.ieee.org
… cultural adaptation methods is important, which can improve the model
performance on the low-resource ones and provide more equitable opportunities for …
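As a toy illustration of the idea named in this title, the sketch below contrasts step-by-step autoregressive decoding with predicting all target positions in parallel. The tensor shapes, the averaged "decoder state", and every name are invented for illustration only and are not the models studied in the survey.

```python
# Toy contrast of autoregressive vs. non-autoregressive decoding (assumed: PyTorch).
import torch
import torch.nn as nn

vocab_size, hidden, tgt_len = 100, 32, 6
encoder_out = torch.randn(1, 8, hidden)          # (batch, src_len, hidden), random stand-in
proj = nn.Linear(hidden, vocab_size)

def autoregressive_decode(steps=tgt_len):
    # One token per step; a real decoder would condition each step on earlier outputs.
    tokens, state = [], encoder_out.mean(dim=1)  # toy "decoder state"
    for _ in range(steps):
        tokens.append(proj(state).argmax(dim=-1))
    return torch.stack(tokens, dim=1)            # (batch, steps)

def non_autoregressive_decode():
    # All target positions predicted in parallel from the encoder representation,
    # so decoding latency no longer grows with the output length.
    states = encoder_out.mean(dim=1, keepdim=True).expand(-1, tgt_len, -1)
    return proj(states).argmax(dim=-1)           # (batch, tgt_len)
```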

Continual mixed-language pre-training for extremely low-resource neural machine translation

Z Liu, GI Winata, P Fung - arXiv preprint arXiv:2105.03953, 2021 - arxiv.org
The data scarcity in low-resource languages has become a bottleneck to building robust
neural machine translation systems. Fine-tuning a multilingual pre-trained model (e.g., …
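As context for this entry, here is a minimal sketch of fine-tuning a multilingual pre-trained seq2seq model on a small parallel corpus. It assumes the Hugging Face transformers library and an mBART-50 checkpoint, and the language pair and sentence pair are placeholders, not taken from the paper.

```python
# Minimal fine-tuning sketch for low-resource NMT (assumed: transformers + mBART-50).
import torch
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50")
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50")

# Hypothetical low-resource pair; replace with the pair and corpus of interest.
tokenizer.src_lang, tokenizer.tgt_lang = "en_XX", "km_KH"
src = ["hello world"]
tgt = ["<placeholder target translation>"]       # reference translations go here

batch = tokenizer(src, text_target=tgt, return_tensors="pt", padding=True)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

model.train()
optimizer.zero_grad()
loss = model(**batch).loss                       # cross-entropy over target tokens
loss.backward()
optimizer.step()
```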