Discovering language model behaviors with model-written evaluations
As language models (LMs) scale, they develop many novel behaviors, good and bad,
exacerbating the need to evaluate how they behave. Prior work creates evaluations with …
Multilingual translation with extensible multilingual pretraining and finetuning
Recent work demonstrates the potential of multilingual pretraining to create one model that
can be used for various tasks in different languages. Previous work in multilingual …
Contrastive learning for many-to-many multilingual neural machine translation
Existing multilingual machine translation approaches mainly focus on English-centric
directions, while the non-English directions still lag behind. In this work, we aim to build a …
CiCo: Domain-aware sign language retrieval via cross-lingual contrastive learning
This work focuses on sign language retrieval--a recently proposed task for sign language
understanding. Sign language retrieval consists of two sub-tasks: text-to-sign-video (T2V) …
MTOP: A comprehensive multilingual task-oriented semantic parsing benchmark
Scaling semantic parsing models for task-oriented dialog systems to new languages is often
expensive and time-consuming due to the lack of available datasets. Available datasets …
Facebook AI WMT21 news translation task submission
We describe Facebook's multilingual model submission to the WMT2021 shared task on
news translation. We participate in 14 language directions: English to and from Czech …
ERNIE-M: Enhanced multilingual representation by aligning cross-lingual semantics with monolingual corpora
Recent studies have demonstrated that pre-trained cross-lingual models achieve impressive
performance in downstream cross-lingual tasks. This improvement benefits from learning a …
One question answering model for many languages with cross-lingual dense passage retrieval
We present Cross-lingual Open-Retrieval Answer Generation (CORA), the first
unified many-to-many question answering (QA) model that can answer questions across …
End-to-end speech translation via cross-modal progressive training
End-to-end speech translation models have become a new trend in research due to their
potential of reducing error propagation. However, these models still suffer from the …
Zero-shot cross-lingual transfer of neural machine translation with multilingual pretrained encoders
Previous work mainly focuses on improving cross-lingual transfer for NLU tasks with a
multilingual pretrained encoder (MPE), or improving the performance on supervised …