Louis Martin
Facebook A.I. Research / Inria
Verified email at fb.com - Homepage
Title
Cited by
Year
Llama 2: Open foundation and fine-tuned chat models
H Touvron, L Martin, K Stone, P Albert, A Almahairi, Y Babaei, ...
arXiv preprint arXiv:2307.09288, 2023
Cited by 12198 · 2023
The llama 3 herd of models
A Dubey, A Jauhri, A Pandey, A Kadian, A Al-Dahle, A Letman, A Mathur, ...
arXiv preprint arXiv:2407.21783, 2024
Cited by 2279 · 2024
Code llama: Open foundation models for code
B Roziere, J Gehring, F Gloeckle, S Sootla, I Gat, XE Tan, Y Adi, J Liu, ...
arXiv preprint arXiv:2308.12950, 2023
Cited by 1645 · 2023
CamemBERT: a Tasty French Language Model
L Martin, B Muller, PJO Suárez, Y Dupont, L Romary, ÉV de la Clergerie, ...
ACL 2020, 2020
Cited by 1313 · 2020
Controllable Sentence Simplification
L Martin, B Sagot, E de la Clergerie, A Bordes
LREC 2020, 2020
Cited by 187 · 2020
Effective long-context scaling of foundation models
W Xiong, J Liu, I Molybog, H Zhang, P Bhargava, R Hou, L Martin, ...
arXiv preprint arXiv:2309.16039, 2023
Cited by 173 · 2023
EASSE: Easier Automatic Sentence Simplification Evaluation
F Alva-Manchego*, L Martin*, C Scarton, L Specia
EMNLP 2019, 2019
Cited by 164 · 2019
ASSET: A Dataset for Tuning and Evaluation of Sentence Simplification Models with Multiple Rewriting Transformations
F Alva-Manchego*, L Martin*, A Bordes, C Scarton, B Sagot, L Specia
ACL 2020, 2020
Cited by 153 · 2020
Llama 2: open foundation and fine-tuned chat models. arXiv
H Touvron, L Martin, K Stone, P Albert, A Almahairi, Y Babaei, ...
arXiv preprint arXiv:2307.09288, 2023
Cited by 151 · 2023
Llama 2: Open foundation and fine-tuned chat models, 2023b
H Touvron, L Martin, K Stone, P Albert, A Almahairi, Y Babaei, ...
URL https://arxiv.org/abs/2307.09288, 2023
Cited by 144 · 2023
Llama 2: Open foundation and fine-tuned chat models. arXiv 2023
H Touvron, L Martin, K Stone, P Albert, A Almahairi, Y Babaei, ...
arXiv preprint arXiv:2307.09288
Cited by 142
MUSS: Multilingual unsupervised sentence simplification by mining paraphrases
L Martin, A Fan, E De La Clergerie, A Bordes, B Sagot
arXiv preprint arXiv:2005.00352, 2020
Cited by 128 · 2020
Efficient large scale language modeling with mixtures of experts
M Artetxe, S Bhosale, N Goyal, T Mihaylov, M Ott, S Shleifer, XV Lin, J Du, ...
arXiv preprint arXiv:2112.10684, 2021
Cited by 111 · 2021
The llama 3 herd of models
A Grattafiori, A Dubey, A Jauhri, A Pandey, A Kadian, A Al-Dahle, ...
arXiv e-prints, arXiv:2407.21783, 2024
Cited by 69 · 2024
Euclid Definition Study Report, arXiv e-prints (2011)
R Laureijs, J Amiaux, S Arduini, JL Augueres, J Brinchmann, R Cole, ...
arXiv preprint arXiv:1110.3193
Cited by 65
Llama 2: Open foundation and fine-tuned chat models (2023)
H Touvron, L Martin, K Stone, P Albert, A Almahairi, Y Babaei, ...
arXiv preprint arXiv:2307.09288, 2023
Cited by 63 · 2023
Llama 2: open foundation and fine-tuned chat models. CoRR abs/2307.09288 (2023)
H Touvron, L Martin, K Stone, P Albert, A Almahairi, Y Babaei, ...
arXiv preprint arXiv:2307.09288, 2023
Cited by 62 · 2023
Code llama: Open foundation models for code (2023)
B Rozière, J Gehring, F Gloeckle, S Sootla, I Gat, XE Tan, Y Adi, J Liu, ...
arXiv preprint arXiv:2308.12950, 2023
Cited by 56 · 2023
Reference-less Quality Estimation of Text Simplification Systems
L Martin, S Humeau, PE Mazaré, A Bordes, ÉV de La Clergerie, B Sagot
INLG 2018 - 1st Workshop on Automatic Text Adaptation (ATA), 2018
Cited by 53 · 2018
Multilingual unsupervised sentence simplification
L Martin, A Fan, EV de La Clergerie, A Bordes, B Sagot
Cited by 46 · 2021
Articles 1–20