PSSAT: A perturbed semantic structure awareness transferring method for perturbation-robust slot filling

G Dong, D Guo, L Wang, X Li, Z Wang, C Zeng… - arXiv preprint arXiv…, 2022 - arxiv.org
Most existing slot filling models tend to memorize inherent patterns of entities and
corresponding contexts from training data. However, these models can lead to system failure …

Towards building more robust NER datasets: An empirical study on NER dataset bias from a dataset difficulty view

R Ma, X Wang, X Zhou, Q Zhang… - Proceedings of the 2023 …, 2023 - aclanthology.org
Recently, many studies have illustrated the robustness problem of Named Entity
Recognition (NER) systems: the NER models often rely on superficial entity patterns for …

Span-based named entity recognition by generating and compressing information

NTH Nguyen, M Miwa, S Ananiadou - arXiv preprint arXiv:2302.05392, 2023 - arxiv.org
The information bottleneck (IB) principle has been proven effective in various NLP
applications. The existing work, however, only used either generative or information …

LinkNER: Linking local named entity recognition models to large language models using uncertainty

Z Zhang, Y Zhao, H Gao, M Hu - Proceedings of the ACM on Web …, 2024 - dl.acm.org
Named Entity Recognition (NER) serves as a fundamental task in natural language
understanding, bearing direct implications for web content analysis, search engines, and …

Farewell to aimless large-scale pretraining: Influential subset selection for language model

X Wang, W Zhou, Q Zhang, J Zhou, S Gao… - arXiv preprint arXiv…, 2023 - arxiv.org
Pretrained language models have achieved remarkable success in various natural
language processing tasks. However, pretraining has recently shifted toward larger models …