Thomas Wolf
Co-founder at HuggingFace
Verified email at polytechnique.edu - Homepage

Title · Cited by · Year
Transformers: State-of-the-Art Natural Language Processing
T Wolf
arXiv preprint arXiv:1910.03771, 2020
Cited by: 16878* · 2020
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
V Sanh, L Debut, J Chaumond, T Wolf
arXiv preprint arXiv:1910.01108, 2019
Cited by: 8773 · 2019
Multitask prompted training enables zero-shot task generalization
V Sanh, A Webson, C Raffel, SH Bach, L Sutawika, Z Alyafeai, A Chaffin, ...
arXiv preprint arXiv:2110.08207, 2021
Cited by: 1794 · 2021
Bloom: A 176b-parameter open-access multilingual language model
T Le Scao, A Fan, C Akiki, E Pavlick, S Ilić, D Hesslow, R Castagné, ...
Cited by: 1747 · 2023
Starcoder: may the source be with you!
R Li, LB Allal, Y Zi, N Muennighoff, D Kocetkov, C Mou, M Marone, C Akiki, ...
arXiv preprint arXiv:2305.06161, 2023
Cited by: 881 · 2023
Transfer learning in natural language processing
S Ruder, ME Peters, S Swayamdipta, T Wolf
Proceedings of the 2019 conference of the North American chapter of the …, 2019
Cited by: 815 · 2019
Datasets: A community library for natural language processing
Q Lhoest, AV Del Moral, Y Jernite, A Thakur, P Von Platen, S Patil, ...
arXiv preprint arXiv:2109.02846, 2021
Cited by: 602* · 2021
Transfertransfo: A transfer learning approach for neural network based conversational agents
T Wolf, V Sanh, J Chaumond, C Delangue
arXiv preprint arXiv:1901.08149, 2019
Cited by: 542 · 2019
Movement pruning: Adaptive sparsity by fine-tuning
V Sanh, T Wolf, A Rush
Advances in neural information processing systems 33, 20378-20389, 2020
Cited by: 490 · 2020
Zephyr: Direct distillation of lm alignment
L Tunstall, E Beeching, N Lambert, N Rajani, K Rasul, Y Belkada, ...
arXiv preprint arXiv:2310.16944, 2023
Cited by: 473 · 2023
Diffusers: State-of-the-art diffusion models
P Von Platen, S Patil, A Lozhkov, P Cuenca, N Lambert, K Rasul, ...
Cited by: 438 · 2022
Natural language processing with transformers
L Tunstall, L Von Werra, T Wolf
O'Reilly Media, Inc., 2022
Cited by: 435 · 2022
Two-dimensional superconductivity at a Mott insulator/band insulator interface LaTiO3/SrTiO3
J Biscaras, N Bergeal, A Kushwaha, T Wolf, A Rastogi, RC Budhani, ...
Nature communications 1 (1), 89, 2010
Cited by: 355 · 2010
Open llm leaderboard
E Beeching, C Fourrier, N Habib, S Han, N Lambert, N Rajani, ...
Hugging Face, 2023
Cited by: 300 · 2023
A hierarchical multi-task approach for learning embeddings from semantic tasks
V Sanh, T Wolf, S Ruder
Proceedings of the AAAI conference on artificial intelligence 33 (01), 6949-6956, 2019
Cited by: 283 · 2019
The stack: 3 tb of permissively licensed source code
D Kocetkov, R Li, LB Allal, J Li, C Mou, CM Ferrandis, Y Jernite, M Mitchell, ...
arXiv preprint arXiv:2211.15533, 2022
Cited by: 274 · 2022
Scaling data-constrained language models
N Muennighoff, A Rush, B Barak, T Le Scao, N Tazi, A Piktus, S Pyysalo, ...
Advances in Neural Information Processing Systems 36, 50358-50376, 2023
Cited by: 220 · 2023
Starcoder 2 and the stack v2: The next generation
A Lozhkov, R Li, LB Allal, F Cassano, J Lamy-Poirier, N Tazi, A Tang, ...
arXiv preprint arXiv:2402.19173, 2024
Cited by: 188 · 2024
Grounding large language models in interactive environments with online reinforcement learning
T Carta, C Romac, T Wolf, S Lamprier, O Sigaud, PY Oudeyer
International Conference on Machine Learning, 3676-3713, 2023
Cited by: 160 · 2023
Distilbert, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR abs/1910.01108 (2019)
V Sanh, L Debut, J Chaumond, T Wolf
URL: http://arxiv.org/abs/1910.01108, 2019
Cited by: 121 · 2019
Articles 1–20