P-tuning v2: Prompt tuning can be comparable to fine-tuning universally across scales and tasks

X Liu, K Ji, Y Fu, WL Tam, Z Du, Z Yang… - arXiv preprint arXiv …, 2021 - arxiv.org
Prompt tuning, which only tunes continuous prompts with a frozen language model,
substantially reduces per-task storage and memory usage at training. However, in the …

Parallel instance query network for named entity recognition

Y Shen, X Wang, Z Tan, G Xu, P Xie, F Huang… - arXiv preprint arXiv …, 2022 - arxiv.org
Named entity recognition (NER) is a fundamental task in natural language processing.
Recent works treat named entity recognition as a reading comprehension task, constructing …

Uncovering main causalities for long-tailed information extraction

G Nan, J Zeng, R Qiao, Z Guo, W Lu - arXiv preprint arXiv:2109.05213, 2021 - arxiv.org
Information Extraction (IE) aims to extract structural information from unstructured texts. In
practice, long-tailed distributions caused by the selection bias of a dataset may lead to …

To be closer: Learning to link up aspects with opinions

Y Zhou, L Liao, Y Gao, Z Jie, W Lu - arXiv preprint arXiv:2109.08382, 2021 - arxiv.org
Dependency parse trees are helpful for discovering the opinion words in aspect-based
sentiment analysis (ABSA). However, the trees obtained from off-the-shelf dependency …

Improving self-training for cross-lingual named entity recognition with contrastive and prototype learning

R Zhou, X Li, L Bing, E Cambria, C Miao - arXiv preprint arXiv:2305.13628, 2023 - arxiv.org
In cross-lingual named entity recognition (NER), self-training is commonly used to bridge the
linguistic gap by training on pseudo-labeled target-language data. However, due to sub …

A graph neural network with context filtering and feature correction for conversational emotion recognition

C Gan, J Zheng, Q Zhu, DK Jain, V Štruc - Information Sciences, 2024 - Elsevier
Conversational emotion recognition represents an important machine-learning problem with
a wide variety of deployment possibilities. The key challenge in this area is how to properly …

What do we Really Know about State of the Art NER?

S Vajjala, R Balasubramaniam - arXiv preprint arXiv:2205.00034, 2022 - arxiv.org
Named Entity Recognition (NER) is a well-researched NLP task and is widely used in real-
world NLP scenarios. NER research typically focuses on the creation of new ways of training …

Hero-gang neural model for named entity recognition

J Hu, Y Shen, Y Liu, X Wan, TH Chang - arXiv preprint arXiv:2205.07177, 2022 - arxiv.org
Named entity recognition (NER) is a fundamental and important task in NLP, aiming at
identifying named entities (NEs) from free text. Recently, since the multi-head attention …

ConNER: Consistency training for cross-lingual named entity recognition

R Zhou, X Li, L Bing, E Cambria, L Si… - arXiv preprint arXiv …, 2022 - arxiv.org
Cross-lingual named entity recognition (NER) suffers from data scarcity in the target
languages, especially under zero-shot settings. Existing translate-train or knowledge …

Semantic role labeling as dependency parsing: Exploring latent tree structures inside arguments

Y Zhang, Q Xia, S Zhou, Y Jiang, G Fu… - arXiv preprint arXiv …, 2021 - arxiv.org
Semantic role labeling (SRL) is a fundamental yet challenging task in the NLP community.
Recent work on SRL mainly falls into two lines: 1) BIO-based; 2) span-based. Despite …