Maha Elbayad
Research scientist, Meta AI
Verified email at fb.com - Homepage
Title
Cited by
Year
No language left behind: Scaling human-centered machine translation
NLLB Team, MR Costa-jussà, J Cross, O Çelebi, M Elbayad, K Heafield, ...
arXiv preprint arXiv:2207.04672, 2022
785* · 2022
Depth-adaptive transformer
M Elbayad, J Gu, E Grave, M Auli
arXiv preprint arXiv:1910.10073, 2019
191 · 2019
Pervasive attention: 2D convolutional neural networks for sequence-to-sequence prediction
M Elbayad, L Besacier, J Verbeek
CoNLL 2018-Conference on Computational Natural Language Learning, 97–107, 2018
118 · 2018
SeamlessM4T - Massively Multilingual & Multimodal Machine Translation
Seamless Communication, L Barrault, YA Chung, MC Meglioli, D Dale, N Dong, ...
arXiv preprint arXiv:2308.11596, 2023
113* · 2023
Findings of the IWSLT 2022 Evaluation Campaign.
A Anastasopoulos, L Barrault, L Bentivogli, MZ Boito, O Bojar, R Cattoni, ...
Proceedings of the 19th international conference on spoken language …, 2022
112 · 2022
Seamless: Multilingual Expressive and Streaming Speech Translation
L Barrault, YA Chung, MC Meglioli, D Dale, N Dong, M Duppenthaler, ...
arXiv preprint arXiv:2312.05187, 2023
109 · 2023
No language left behind: Scaling human-centered machine translation
NLLB Team, MR Costa-jussà, J Cross, O Çelebi, M Elbayad, K Heafield, ...
arXiv preprint arXiv:2207.04672, 2022
81 · 2022
Efficient wait-k models for simultaneous machine translation
M Elbayad, L Besacier, J Verbeek
arXiv preprint arXiv:2005.08595, 2020
71 · 2020
Scaling neural machine translation to 200 languages
Nature 630 (8018), 841-846, 2024
32 · 2024
SpiRit-LM: Interleaved Spoken and Written Language Model
TA Nguyen, B Muller, B Yu, MR Costa-Jussa, M Elbayad, S Popuri, ...
Transactions of the Association for Computational Linguistics 13, 30-52, 2025
28 · 2025
Token-level and sequence-level loss smoothing for RNN language models
M Elbayad, L Besacier, J Verbeek
arXiv preprint arXiv:1805.05062, 2018
24 · 2018
Causes and cures for interference in multilingual translation
U Shaham, M Elbayad, V Goswami, O Levy, S Bhosale
arXiv preprint arXiv:2212.07530, 2022
21 · 2022
Online versus offline NMT quality: An in-depth analysis on English-German and German-English
M Elbayad, M Ustaszewski, E Esperança-Rodier, FB Manquat, J Verbeek, ...
arXiv preprint arXiv:2006.00814, 2020
13 · 2020
ON-TRAC consortium for end-to-end and simultaneous speech translation challenge tasks at IWSLT 2020
M Elbayad, H Nguyen, F Bougares, N Tomashenko, A Caubrière, ...
arXiv preprint arXiv:2005.11861, 2020
10 · 2020
Merging text transformer models from different initializations
N Verma, M Elbayad
arXiv preprint arXiv:2403.00986, 2024
8 · 2024
Fixing MoE over-fitting on low-resource languages in multilingual machine translation
M Elbayad, A Sun, S Bhosale
arXiv preprint arXiv:2212.07571, 2022
8 · 2022
Added toxicity mitigation at inference time for multimodal and massively multilingual translation
MR Costa-jussà, D Dale, M Elbayad, B Yu
arXiv preprint arXiv:2311.06532, 2023
4 · 2023
Towards being parameter-efficient: A stratified sparsely activated transformer with dynamic capacity
H Xu, M Elbayad, K Murray, J Maillard, V Goswami
arXiv preprint arXiv:2305.02176, 2023
3 · 2023
Large Concept Models: Language Modeling in a Sentence Representation Space
LCM Team, L Barrault, PA Duquenne, M Elbayad, A Kozhevnikov, ...
arXiv preprint arXiv:2412.08821, 2024
2* · 2024
Efficiently upgrading multilingual machine translation models to support more languages
S Sun, M Elbayad, A Sun, J Cross
arXiv preprint arXiv:2302.03528, 2023
2 · 2023
Articles 1–20