Nested named entity recognition: a survey
With the rapid development of text mining, many studies observe that text generally contains
a variety of implicit information, and it is important to develop techniques for extracting such …
A survey on Arabic named entity recognition: Past, recent advances, and future trends
As more and more Arabic texts emerged on the Internet, extracting important information
from these Arabic texts is especially useful. As a fundamental technology, named entity …
Template-free prompt tuning for few-shot NER
Prompt-based methods have been successfully applied in sentence-level few-shot learning
tasks, mostly owing to the sophisticated design of templates and label words. However …
A survey on programmatic weak supervision
Labeling training data has become one of the major roadblocks to using machine learning.
Among various weak supervision paradigms, programmatic weak supervision (PWS) has …
WRENCH: A comprehensive benchmark for weak supervision
Recent Weak Supervision (WS) approaches have had widespread success in easing the
bottleneck of labeling training data for machine learning by synthesizing labels from multiple …
SumGNN: multi-typed drug interaction prediction via efficient knowledge graph summarization
Motivation Thanks to the increasing availability of drug–drug interactions (DDI) datasets and
large biomedical knowledge graphs (KGs), accurate detection of adverse DDI using …
Assessing generalizability of CodeBERT
Pre-trained models like BERT have achieved strong improvements on many natural
language processing (NLP) tasks, showing their great generalizability. The success of pre …
Fine-tuning pre-trained language model with weak supervision: A contrastive-regularized self-training approach
Fine-tuned pre-trained language models (LMs) have achieved enormous success in many
natural language processing (NLP) tasks, but they still require excessive labeled data in the …
Meta self-training for few-shot neural sequence labeling
Neural sequence labeling is widely adopted for many Natural Language Processing (NLP)
tasks, such as Named Entity Recognition (NER) and slot tagging for dialog systems and …
Distantly-supervised named entity recognition with noise-robust learning and language model augmented self-training
We study the problem of training named entity recognition (NER) models using only distantly-
labeled data, which can be automatically obtained by matching entity mentions in the raw …