Recent advances in natural language processing via large pre-trained language models: A survey
Large, pre-trained language models (PLMs) such as BERT and GPT have drastically
changed the Natural Language Processing (NLP) field. For numerous NLP tasks …
Pre-trained language models for text generation: A survey
Text Generation aims to produce plausible and readable text in human language from input
data. The resurgence of deep learning has greatly advanced this field, in particular, with the …
Neural machine translation for low-resource languages: A survey
Neural Machine Translation (NMT) has seen tremendous growth in less than ten years and
has already entered a mature phase. While considered the most widely …
Multilingual speech translation with efficient finetuning of pretrained models
We present a simple yet effective approach to build multilingual speech-to-text (ST)
translation by efficient transfer learning from a pretrained speech encoder and text decoder …
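The snippet above describes coupling a pretrained speech encoder with a pretrained text decoder and finetuning only a small parameter subset. A minimal PyTorch sketch of that idea follows; unfreezing only LayerNorm and cross-attention weights is an assumed subset chosen for illustration, and the substring matches (`layer_norm`, `encoder_attn`) follow fairseq-style parameter naming, which is likewise an assumption rather than the paper's confirmed configuration.

```python
import torch.nn as nn

def efficient_finetune(model: nn.Module) -> list[nn.Parameter]:
    """Freeze a pretrained encoder-decoder and unfreeze only a small subset
    (here: LayerNorm and cross-attention weights, an illustrative choice).
    Returns the trainable parameters to hand to an optimizer."""
    trainable = []
    for name, param in model.named_parameters():
        # Substring tests assume fairseq-style parameter names.
        if "layer_norm" in name or "encoder_attn" in name:
            param.requires_grad = True
            trainable.append(param)
        else:
            param.requires_grad = False
    return trainable
```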
Lightweight adapter tuning for multilingual speech translation
Adapter modules were recently introduced as an efficient alternative to fine-tuning in NLP.
Adapter tuning consists in freezing pretrained parameters of a model and injecting …
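Since this snippet spells out the adapter recipe (freeze the pretrained weights, inject small trainable modules), here is a minimal PyTorch sketch of it. The bottleneck width, ReLU activation, and the `AdaptedLayer` wrapper are illustrative choices, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the frozen pretrained representation intact.
        return x + self.up(self.act(self.down(x)))

class AdaptedLayer(nn.Module):
    """Wraps a frozen pretrained layer and applies a trainable adapter after it."""
    def __init__(self, pretrained_layer: nn.Module, hidden_dim: int):
        super().__init__()
        self.layer = pretrained_layer
        self.adapter = Adapter(hidden_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.layer(x))

def adapter_tune(layer: nn.Module, hidden_dim: int) -> AdaptedLayer:
    """Freeze all pretrained parameters; only the injected adapter trains."""
    for p in layer.parameters():
        p.requires_grad = False
    return AdaptedLayer(layer, hidden_dim)
```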
Efficient hierarchical domain adaptation for pretrained language models
The remarkable success of large language models has been driven by dense models
trained on massive unlabeled, unstructured corpora. These corpora typically contain text …
Multilingual unsupervised neural machine translation with denoising adapters
We consider the problem of multilingual unsupervised machine translation, translating to
and from languages that only have monolingual data by using auxiliary parallel language …
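The title points to language-specific adapters trained with a denoising objective on monolingual data. As a hedged illustration of the denoising step only (the masking probability, the token-level masking, and the `<mask>` symbol are assumptions), the corruption applied to monolingual text might look like:

```python
import random

def corrupt(tokens: list[str], mask_prob: float = 0.35,
            mask_token: str = "<mask>") -> list[str]:
    """Denoising-style corruption of monolingual text: randomly mask tokens.
    An adapter is then trained to reconstruct the original sequence from this
    noised input (mask_prob and token-level masking are illustrative)."""
    return [mask_token if random.random() < mask_prob else t for t in tokens]

# Example: corrupt("wir gehen morgen ins kino".split()) might yield
# ["wir", "<mask>", "morgen", "<mask>", "kino"].
```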
MSP: Multi-stage prompting for making pre-trained language models better translators
Prompting has recently been shown as a promising approach for applying pre-trained
language models to perform downstream tasks. We present Multi-Stage Prompting (MSP), a …
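The snippet introduces MSP but truncates before its mechanics, so the sketch below shows only the generic building block of continuous prompting that such methods rest on: learned prompt vectors prepended to the embeddings fed into a frozen pre-trained model. The class name, prompt length, and initialization scale are assumptions, and the multi-stage chaining itself is not reproduced.

```python
import torch
import torch.nn as nn

class SoftPrompt(nn.Module):
    """Learned continuous prompt prepended to input embeddings of a frozen LM.
    Only `self.prompt` receives gradients; the base model stays frozen."""
    def __init__(self, num_prompt_tokens: int, hidden_dim: int):
        super().__init__()
        self.prompt = nn.Parameter(0.02 * torch.randn(num_prompt_tokens, hidden_dim))

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        batch = input_embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prompt, input_embeds], dim=1)  # (batch, prompt+seq, hidden)
```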
Zero-shot cross-lingual transfer of neural machine translation with multilingual pretrained encoders
Previous work mainly focuses on improving cross-lingual transfer for NLU tasks with a
multilingual pretrained encoder (MPE), or improving the performance on supervised …
Language and task arithmetic with parameter-efficient layers for zero-shot summarization
Parameter-efficient fine-tuning (PEFT) using labeled task data can significantly improve the
performance of large language models (LLMs) on the downstream task. However, there are …
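"Language and task arithmetic" here points to composing separately trained parameter-efficient modules by element-wise operations on their parameters. A hedged sketch of one such composition (the additive form, the weight `lam`, and the checkpoint names in the usage comment are all assumptions):

```python
import torch

def combine_peft(task_params: dict[str, torch.Tensor],
                 lang_params: dict[str, torch.Tensor],
                 lam: float = 1.0) -> dict[str, torch.Tensor]:
    """Element-wise arithmetic over matching PEFT parameter tensors: a task
    module (trained on labeled source-language data) combined with a language
    module (trained on unlabeled target-language text)."""
    assert task_params.keys() == lang_params.keys()
    return {name: t + lam * lang_params[name] for name, t in task_params.items()}

# Usage sketch (hypothetical checkpoints): merge the two PEFT state dicts and
# load the result onto a frozen base model for zero-shot summarization.
# merged = combine_peft(torch.load("task_en.pt"), torch.load("lang_xx.pt"))
```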