BloombergGPT: A large language model for finance S Wu, O Irsoy, S Lu, V Dabravolski, M Dredze, S Gehrmann, P Kambadur, ... arXiv preprint arXiv:2303.17564, 2023 | 924 | 2023 |
Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT S Wu, M Dredze EMNLP 2019, 2019 | 779 | 2019 |
Are All Languages Created Equal in Multilingual BERT? S Wu, M Dredze RepL4NLP Workshop 2020, 2020 | 359 | 2020 |
Emerging Cross-lingual Structure in Pretrained Language Models S Wu, A Conneau, H Li, L Zettlemoyer, V Stoyanov ACL 2020, 2019 | 281 | 2019 |
The SIGMORPHON 2019 shared task: Crosslinguality and context in morphology AD McCarthy, E Vylomova, S Wu, C Malaviya, L Wolf-Sonkin, G Nicolai, ... SIGMORPHON Workshop 2019, 2019 | 126* | 2019 |
Applying the Transformer to Character-level Transduction S Wu, R Cotterell, M Hulden EACL 2021, 2020 | 89 | 2020 |
Which* BERT? A Survey Organizing Contextualized Encoders P Xia, S Wu, B Van Durme EMNLP 2020, 2020 | 66 | 2020 |
SIGMORPHON 2020 shared task 0: Typologically diverse morphological inflection E Vylomova, J White, E Salesky, SJ Mielke, S Wu, E Ponti, RH Maudslay, ... SIGMORPHON Workshop 2020, 2020 | 65 | 2020 |
The SIGMORPHON 2020 shared task on multilingual grapheme-to-phoneme conversion K Gorman, LFE Ashby, A Goyzueta, AD McCarthy, S Wu, D You SIGMORPHON Workshop 2020, 2020 | 62 | 2020 |
Exact Hard Monotonic Attention for Character-Level Transduction S Wu, R Cotterell ACL 2019, 2019 | 62 | 2019 |
Hard Non-Monotonic Attention for Character-Level Transduction S Wu, P Shapiro, R Cotterell EMNLP 2018, 2018 | 54 | 2018 |
Do Explicit Alignments Robustly Improve Multilingual Encoders? S Wu, M Dredze EMNLP 2020, 2020 | 45 | 2020 |
Bernice: A Multilingual Pre-trained Encoder for Twitter A DeLucia, S Wu, A Mueller, C Aguirre, P Resnik, M Dredze EMNLP 2022, 6191-6205, 2022 | 44 | 2022 |
Everything is all it takes: A multipronged strategy for zero-shot cross-lingual information extraction M Yarmohammadi, S Wu, M Marone, H Xu, S Ebner, G Qin, Y Chen, ... EMNLP 2021, 2021 | 38 | 2021 |
A Simple Joint Model for Improved Contextual Neural Lemmatization C Malaviya, S Wu, R Cotterell NAACL 2019, 2019 | 36 | 2019 |
Cross-lingual Few-Shot Learning on Unseen Languages G Winata, S Wu, M Kulkarni, T Solorio, D Preoţiuc-Pietro AACL 2022, 2022 | 34 | 2022 |
Morphological Irregularity Correlates with Frequency S Wu, R Cotterell, TJ O'Donnell ACL 2019, 2019 | 33 | 2019 |
SIGMORPHON 2021 shared task on morphological reinflection: Generalization across languages T Pimentel, M Ryskina, SJ Mielke, S Wu, E Chodroff, B Leonard, G Nicolai, ... SIGMORPHON Workshop 2021, 2021 | 29 | 2021 |
Overcoming Catastrophic Forgetting in Massively Multilingual Continual Learning GI Winata, L Xie, K Radhakrishnan, S Wu, X Jin, P Cheng, M Kulkarni, ... ACL 2023 (Findings), 2023 | 20 | 2023 |