Improving language plasticity via pretraining with active forgetting
Pretrained language models (PLMs) are today the primary model for natural language
processing. Despite their impressive downstream performance, it can be difficult to apply …
Aya model: An instruction finetuned open-access multilingual language model
A Üstün, V Aryabumi, ZX Yong, WY Ko… - arXiv preprint
… and Deploying End‐to‐End Machine Learning Systems for Social Impact: A Rubric and Practical Artificial Intelligence Case Studies From African Contexts
Artificial intelligence (AI) and machine learning have demonstrated the potential to provide
solutions to societal challenges, for example, automated crop diagnostics for smallholder …
AfriNames: Most ASR models "butcher" African names
Useful conversational agents must accurately capture named entities to minimize error for
downstream tasks, for example, asking a voice assistant to play a track from a certain artist …
CoLaDa: A Collaborative Label Denoising Framework for Cross-lingual Named Entity Recognition
Cross-lingual named entity recognition (NER) aims to train an NER system that generalizes
well to a target language by leveraging labeled data in a given source language. Previous …
PuoBERTa: Training and evaluation of a curated language model for Setswana
Natural language processing (NLP) has made significant progress for well-resourced
languages such as English but lagged behind for low-resource languages like Setswana …