Lysandre Debut
Machine Learning Engineer, Hugging Face
Verified email address at huggingface.co
Title
Cited by
Year
Transformers: State-of-the-Art Natural Language Processing
T Wolf
arXiv preprint arXiv:1910.03771, 2020
9459
2020
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
V Sanh, L Debut, J Chaumond, T Wolf
arXiv preprint arXiv:1910.01108, 2019
8824
2019
PEFT: State-of-the-art parameter-efficient fine-tuning methods
S Mangrulkar, S Gugger, L Debut, Y Belkada, S Paul, B Bossan
PEFT: State-of-the-art parameter-efficient fine-tuning methods, 2022
571
2022
Datasets: A community library for natural language processing
Q Lhoest, AV Del Moral, Y Jernite, A Thakur, P Von Platen, S Patil, ...
arXiv preprint arXiv:2109.02846, 2021
302
2021
Huggingface’s transformers: State-of-the-art natural language processing. arXiv 2019
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
arXiv preprint arXiv:1910.03771 10, 2020
246
2020
DistilBERT, a distilled version of BERT: Smaller, faster, cheaper and lighter
V Sanh, L Debut, J Chaumond, T Wolf
arXiv preprint arXiv:1910.01108 7, 2019
126
2019
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR abs/1910.01108 (2019)
V Sanh, L Debut, J Chaumond, T Wolf
URL: http://arxiv.org/abs/1910.01108, 2019
117
2019
Huggingface’s transformers: State-of-the-art natural language processing. arXiv
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
arXiv preprint arXiv:1910.03771 10, 2019
97
2019
Accelerate: Training and inference at scale made simple, efficient and adaptable
S Gugger, L Debut, T Wolf, P Schmid, Z Mueller, S Mangrulkar, M Sun, ...
Accelerate: Training and inference at scale made simple efficient and adaptable, 2022
69
2022
DistilBERT, a distilled version of BERT: Smaller, faster, cheaper and lighter. arXiv 2020
V Sanh, L Debut, J Chaumond, T Wolf
arXiv preprint arXiv:1910.01108, 2020
64
2020
Huggingface’s transformers: state-of-the-art natural language processing. CoRR abs/1910.03771 (2019)
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
URL: http://arxiv.org/abs/1910.03771, 2019
38
2019
Transformers: State-of-the-art natural language processing
T Wolf, L Debut, V Sanh, ...
Proceedings of the 2020 Conference on Empirical Methods in Natural Language …, 2020
36
2020
PEFT: state-of-the-art parameter-efficient fine-tuning methods (2022)
S Mangrulkar, S Gugger, L Debut, Y Belkada, S Paul, B Bossan
URL https://github.com/huggingface/peft, 2023
30
2023
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv, Feb. 29, 2020. doi: 10.48550
V Sanh, L Debut, J Chaumond, T Wolf
arXiv, 2020
30
2020
DistilBERT, a distilled version of BERT: Smaller, faster, cheaper and lighter (arXiv:1910.01108). arXiv
V Sanh, L Debut, J Chaumond, T Wolf
Retrieved 2023-01-22, from http://arxiv.org/abs/1910.01108 doi: 10.48550/arXiv, 2019
20
2019
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR abs/1910.01108
V Sanh, L Debut, J Chaumond, T Wolf
19
2019
HuggingFace’s Transformers: State-of-the-art Natural Language Processing. arXiv e-prints
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
arXiv preprint arXiv:1910.03771 4, 2019
14
2019
HuggingFace’s Transformers: State-Of-The-Art Natural Language Processing. arXiv:1910.03771 [cs]
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
12
2020
Accelerate: Training and inference at scale made simple, efficient and adaptable
S Gugger, L Debut, T Wolf, P Schmid, Z Mueller, S Mangrulkar
6
2022
Datasets
T Wolf, Q Lhoest, P von Platen, Y Jernite, M Drame, J Plu, J Chaumond, ...
GitHub. Note: https://github.com/huggingface/datasets 1, 2020
5
2020
Articles 1–20