Jais and jais-chat: Arabic-centric foundation and instruction-tuned open generative large language models

N Sengupta, SK Sahu, B Jia, S Katipomu, H Li… - arXiv preprint arXiv …, 2023 - arxiv.org
… monolingual large Pre-trained Language Models (PLMs) is shown to be very
successful in handling different tasks in Natural Language Processing (NLP). In this work …

Octopus: A Multitask Model and Toolkit for Arabic Natural Language Generation

AR Elmadany, EMB Nagoudi… - arXiv preprint arXiv …, 2023 - arxiv.org
Understanding Arabic text and generating human-like responses is a challenging endeavor.
While many researchers have proposed models and solutions for individual problems, there …

DialectNLU at NADI 2023 Shared Task: Transformer Based Multitask Approach Jointly Integrating Dialect and Machine Translation Tasks in Arabic

H Veeramani, S Thapa, U Naseem - Proceedings of ArabicNLP …, 2023 - aclanthology.org
With approximately 400 million speakers worldwide, Arabic ranks as the fifth most-spoken
language globally, necessitating advancements in natural language processing. This paper …

Dolphin: A Challenging and Diverse Benchmark for Arabic NLG

A Elmadany, A El-Shangiti… - Findings of the …, 2023 - aclanthology.org
We present Dolphin, a novel benchmark that addresses the need for a natural language
generation (NLG) evaluation framework dedicated to the wide collection of Arabic …

Arabic Question Answering on the Holy Qur'an

RR Malhas - 2023 - qspace.qu.edu.qa
In this dissertation, we address the need for an intelligent machine reading at scale (MRS)
Question Answering (QA) system on the Holy Qur'an, given the permanent interest of …

Sequence-to-Sequence Spanish Pre-trained Language Models

V Araujo, MM Trusca, R Tufiño, MF Moens - arXiv preprint arXiv …, 2023 - arxiv.org
In recent years, substantial advancements in pre-trained language models have paved the
way for the development of numerous non-English language versions, with a particular …