Arabic span extraction-based reading comprehension benchmark (ASER) and neural baseline models
Machine reading comprehension (MRC) requires machines to read and answer questions
about a given text. This can be achieved through either predicting answers or extracting …
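A minimal sketch of the extractive (span-based) QA setup this entry describes, using Hugging Face Transformers; the checkpoint name and the Arabic question/context are illustrative assumptions, not the paper's baseline.

```python
# Extractive QA: predict start/end token positions of the answer span in the context.
# The checkpoint is an assumption; without fine-tuning, the QA head is randomly initialized.
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_name = "aubmindlab/bert-base-arabertv2"  # assumed Arabic encoder
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

question = "ما عاصمة فرنسا؟"
context = "باريس هي عاصمة فرنسا وأكبر مدنها."
inputs = tokenizer(question, context, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Take the most probable start and end positions and decode the span between them.
start = outputs.start_logits.argmax(dim=-1).item()
end = outputs.end_logits.argmax(dim=-1).item()
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```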
Character-level adversarial attacks evaluation for AraBERT's
S Nakhleh, M Qasaimeh… - 2024 15th International …, 2024 - ieeexplore.ieee.org
Research has demonstrated the growing vulnerability of Large Language Models (LLMs) to
adversarial attacks, which involve crafted input samples that can deceive even well …
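A small illustrative sketch of character-level perturbations of the kind such attacks rely on (delete, duplicate, swap); the attack strategy actually evaluated in the paper may differ.

```python
import random

def perturb(text: str, n_edits: int = 1, seed: int = 0) -> str:
    """Apply n_edits random single-character edits: delete, duplicate, or swap adjacent."""
    rng = random.Random(seed)
    chars = list(text)
    for _ in range(n_edits):
        if len(chars) < 2:
            break
        i = rng.randrange(len(chars) - 1)
        op = rng.choice(["delete", "duplicate", "swap"])
        if op == "delete":
            del chars[i]
        elif op == "duplicate":
            chars.insert(i, chars[i])
        else:  # swap adjacent characters
            chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)

print(perturb("هذا مثال بسيط للنص العربي", n_edits=2))
```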
Identifying named entities of Chinese electronic medical records based on RoBERTa-wwm dynamic fusion model
Z Yunqiu, W Yang, L Bocheng - Data Analysis and Knowledge Discovery, 2022 - manu44.magtech.com.cn
[Objective] This paper proposes an entity recognition model based on RoBERTa-wwm
dynamic fusion, aiming to improve the entity identification of Chinese electronic medical
records. [Methods] The semantic representations produced by each Transformer layer of the
pre-trained language model RoBERTa-wwm are dynamically fused …
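A sketch of one common way to realize dynamic multi-layer fusion: a learned softmax-weighted sum over all encoder layers (ELMo-style scalar mix). The paper's exact fusion mechanism may differ, and the checkpoint name is an assumption.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class LayerFusionEncoder(nn.Module):
    """Fuse hidden states from every Transformer layer with learned per-layer weights."""
    def __init__(self, model_name: str = "hfl/chinese-roberta-wwm-ext"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name, output_hidden_states=True)
        n_layers = self.encoder.config.num_hidden_layers + 1  # + embedding-layer output
        self.layer_weights = nn.Parameter(torch.zeros(n_layers))

    def forward(self, **inputs):
        hidden_states = self.encoder(**inputs).hidden_states   # tuple of (B, T, H)
        stacked = torch.stack(hidden_states, dim=0)             # (L, B, T, H)
        weights = torch.softmax(self.layer_weights, dim=0)      # normalized layer weights
        return (weights[:, None, None, None] * stacked).sum(0)  # fused (B, T, H)

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
model = LayerFusionEncoder()
batch = tokenizer(["患者主诉头痛三天"], return_tensors="pt")
fused = model(**batch)  # would feed a CRF or linear NER head
print(fused.shape)
```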
Fine-Tuning BERT on Coarse-Grained Labels: Exploring Hidden States for Fine-Grained Classification
In recent years, pre-trained language models such as BERT (Bidirectional Encoder
Representations from Transformers) have demonstrated exceptional performance across …
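A sketch of the kind of probe the title suggests: after fine-tuning BERT on coarse labels, intermediate hidden states are read out as features for fine-grained classification. The fine-tuning loop is omitted; the checkpoint, layer index, and label counts are assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2, output_hidden_states=True  # 2 coarse classes
)

texts = ["the battery dies within an hour", "great picture quality"]
inputs = tokenizer(texts, return_tensors="pt", padding=True)
with torch.no_grad():
    out = model(**inputs)

# [CLS] vectors from an intermediate layer (here layer 8) could feed a separate
# fine-grained classifier trained over the many subclasses.
cls_features = out.hidden_states[8][:, 0, :]
print(cls_features.shape)  # (batch_size, hidden_size)
```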
Named Entity Recognition for Safety Risks in Electrical Work
P Liu, S Fan, X Cao, Y Jia… - 2024 IEEE International …, 2024 - ieeexplore.ieee.org
The texts of safety risks in electrical work describe the safety risks involved and management
measures. The text records information on potential risks, risk levels, and preventive …
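A generic token-classification sketch for extracting risk entities from such texts; the BIO tag set, checkpoint, and example sentence are assumptions, and the classification head here is untrained.

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-RISK", "I-RISK", "B-MEASURE", "I-MEASURE"]  # assumed BIO tag set
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels)
)

text = "Working at height carries a fall risk; a safety harness must be worn."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, seq_len, num_labels)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, logits.argmax(dim=-1)[0].tolist()):
    print(token, labels[label_id])
```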
[PDF] AraBERT and mBert: Insights from Psycholinguistic Diagnostics
BERT, a groundbreaking large language model, has excelled in natural language
processing tasks such as question answering. Motivated by a desire to understand BERT's …
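A sketch of the cloze-style probing that psycholinguistic diagnostics typically use: mask a word and inspect each model's top completions. Which exact AraBERT/mBERT checkpoints the paper used is an assumption here.

```python
from transformers import pipeline

for name in ["aubmindlab/bert-base-arabertv2", "bert-base-multilingual-cased"]:
    fill = pipeline("fill-mask", model=name)
    mask = fill.tokenizer.mask_token
    # Arabic cloze prompt: "The capital of France is [MASK]."
    for pred in fill(f"عاصمة فرنسا هي {mask}.", top_k=3):
        print(name, pred["token_str"], round(pred["score"], 3))
```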
Fine-Tuning BERT on Coarse-Grained Labels: Exploring Hidden States
A Anjum, R Krestel - … on Applications of Natural Language to …, 2024 - books.google.com
In recent years, pre-trained language models such as BERT (Bidirectional Encoder
Representations from Transformers) have demonstrated exceptional performance across …
Chinese medical named entity recognition based on multi-layer dynamic fusion
L Lin, N Liu, Z Xu, … - Journal of Computer …, 2024 - search.ebscohost.com
To address the problem that pre-trained-model-based named entity recognition methods use
only the hidden states of the model's last layer and ignore that each Transformer layer captures
different textual information, a multi-layer dynamic fusion method for pre-trained models is proposed …