BERT models for Arabic text classification: a systematic review

AS Alammary - Applied Sciences, 2022 - mdpi.com
Bidirectional Encoder Representations from Transformers (BERT) has gained increasing
attention from researchers and practitioners as it has proven to be an invaluable technique …
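
As a concrete illustration of the setup these classification studies share, the sketch below fine-tunes a BERT-style Arabic checkpoint with Hugging Face Transformers. It is a minimal sketch, not the review's own method; the checkpoint name and the toy labels are assumptions made for illustration.

# Minimal fine-tuning sketch for Arabic text classification (assumed checkpoint and data).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "aubmindlab/bert-base-arabertv02"  # assumed Arabic BERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

texts = ["...", "..."]            # Arabic documents (placeholders)
labels = torch.tensor([0, 1])     # e.g. negative / positive

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
loss = model(**batch, labels=labels).loss
loss.backward()                   # gradients for one step; an optimizer.step() would follow in a real loop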

AraT5: Text-to-text transformers for Arabic language generation

EMB Nagoudi, AR Elmadany, M Abdul-Mageed - … of the 60th annual meeting of …, 2022 - aclanthology.org
Transfer learning with a unified Transformer framework (T5) that converts all language
problems into a text-to-text format was recently proposed as a simple and effective transfer …
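
The text-to-text idea is easiest to see in code: every task is phrased as an input string mapped to an output string by the same seq2seq model. A minimal sketch with Hugging Face Transformers follows; the checkpoint name, the "generate title:" prefix, and the placeholder article are assumptions, and in practice the model would first be fine-tuned on the target task.

# Text-to-text sketch: the task instruction lives in the input string itself.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "UBC-NLP/AraT5v2-base-1024"   # assumed AraT5 checkpoint on the Hub
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

arabic_article_text = "..."                # an Arabic news article (placeholder)
inputs = tokenizer("generate title: " + arabic_article_text, return_tensors="pt")
ids = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(ids[0], skip_special_tokens=True))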

A comprehensive review on transformers models for text classification

R Kora, A Mohammed - 2023 International Mobile, Intelligent …, 2023 - ieeexplore.ieee.org
The rapid progress in deep learning has propelled transformer-based models to the
forefront, establishing them as leading solutions for multiple NLP tasks. These tasks span …

AraELECTRA: Pre-training text discriminators for Arabic language understanding

W Antoun, F Baly, H Hajj - arXiv preprint arXiv:2012.15516, 2020 - arxiv.org
Advances in English language representation enabled a more sample-efficient pre-training
task by Efficiently Learning an Encoder that Classifies Token Replacements Accurately …
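
The pre-training task referenced here (ELECTRA-style replaced-token detection) trains a discriminator to flag, for every token, whether it is original or was substituted by a small generator. Below is a hedged sketch of querying such a discriminator with Hugging Face Transformers; the checkpoint name and the example sentence are assumptions.

# Replaced-token-detection sketch: one original-vs-replaced score per token.
import torch
from transformers import AutoTokenizer, ElectraForPreTraining

model_name = "aubmindlab/araelectra-base-discriminator"  # assumed AraELECTRA checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
discriminator = ElectraForPreTraining.from_pretrained(model_name)

sentence = "..."                                    # an Arabic sentence, some words swapped (placeholder)
inputs = tokenizer(sentence, return_tensors="pt")
logits = discriminator(**inputs).logits             # shape: (1, sequence_length)
flags = torch.sigmoid(logits)[0] > 0.5               # True where a token looks replaced
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
print(list(zip(tokens, flags.tolist())))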

AraGPT2: Pre-trained transformer for Arabic language generation

W Antoun, F Baly, H Hajj - arXiv preprint arXiv:2012.15520, 2020 - arxiv.org
Recently, pre-trained transformer-based architectures have proven to be very efficient at
language modeling and understanding, given that they are trained on a large enough …
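
Language modeling in this family is purely autoregressive: the model predicts the next token given the prompt so far, and decoding repeats that step. A minimal generation sketch follows; the checkpoint name, prompt, and sampling settings are illustrative assumptions.

# Autoregressive generation sketch with a GPT-2-style Arabic checkpoint.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "aubmindlab/aragpt2-base"     # assumed AraGPT2 checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "..."                             # an Arabic prompt (placeholder)
inputs = tokenizer(prompt, return_tensors="pt")
ids = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(ids[0], skip_special_tokens=True))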

NADI 2022: The third nuanced Arabic dialect identification shared task

M Abdul-Mageed, C Zhang, AR Elmadany… - arXiv preprint arXiv …, 2022 - arxiv.org
We describe the findings of the third Nuanced Arabic Dialect Identification Shared Task (NADI
2022). NADI aims at advancing state-of-the-art Arabic NLP, including on Arabic dialects. It …

Tarjamat: Evaluation of Bard and ChatGPT on machine translation of ten Arabic varieties

K Kadaoui, SM Magdy, A Waheed… - arXiv preprint arXiv …, 2023 - arxiv.org
Despite the purported multilingual proficiency of instruction-finetuned large language
models (LLMs) such as ChatGPT and Bard, the linguistic inclusivity of these models remains …

AfroLID: A neural language identification tool for African languages

I Adebara, AR Elmadany, M Abdul-Mageed… - arXiv preprint arXiv …, 2022 - arxiv.org
Language identification (LID) is a crucial precursor for NLP, especially for mining web data.
Problematically, most of the world's 7000+ languages today are not covered by LID …

Arabic dialect identification under scrutiny: Limitations of single-label classification

A Keleg, W Magdy - arXiv preprint arXiv:2310.13661, 2023 - arxiv.org
Automatic Arabic Dialect Identification (ADI) of text has gained great popularity since it was
introduced in the early 2010s. Multiple datasets were developed, and yearly shared tasks …
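
The single-label limitation this paper scrutinizes can be stated in a few lines: a softmax head forces exactly one dialect per sentence, while independent sigmoids allow a sentence to be valid in several dialects at once. The sketch below is illustrative only; the number of dialects and the random scores stand in for a real sentence encoder's outputs.

# Single-label (softmax) vs. multi-label (sigmoid) dialect identification heads.
import torch

num_dialects = 4                           # e.g. Egyptian, Gulf, Levantine, Maghrebi (illustrative)
logits = torch.randn(1, num_dialects)      # stand-in for scores from a sentence encoder

single_label = logits.softmax(dim=-1)      # probabilities sum to 1: exactly one dialect wins
multi_label = torch.sigmoid(logits)        # independent probabilities per dialect
valid_in = (multi_label > 0.5).nonzero()[:, 1]   # the set of dialects the sentence could belong to
print(single_label.argmax(dim=-1).item(), valid_in.tolist())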