PTR: Prompt tuning with rules for text classification

X Han, W Zhao, N Ding, Z Liu, M Sun - AI Open, 2022 - Elsevier
Recently, prompt tuning has been widely applied to stimulate the rich knowledge in pre-
trained language models (PLMs) to serve NLP tasks. Although prompt tuning has achieved …

Few-shot intent detection via contrastive pre-training and fine-tuning

J Zhang, T Bui, S Yoon, X Chen, Z Liu, C Xia… - arXiv preprint arXiv …, 2021 - arxiv.org
In this work, we focus on a more challenging few-shot intent detection scenario where many
intents are fine-grained and semantically similar. We present a simple yet effective few-shot …

New intent discovery with pre-training and contrastive learning

Y Zhang, H Zhang, LM Zhan, XM Wu, A Lam - arXiv preprint arXiv …, 2022 - arxiv.org
New intent discovery aims to uncover novel intent categories from user utterances to expand
the set of supported intent classes. It is a critical task for the development and service …

Out-of-scope intent detection with self-supervision and discriminative training

LM Zhan, H Liang, B Liu, L Fan, XM Wu… - arXiv preprint arXiv …, 2021 - arxiv.org
Out-of-scope intent detection is of practical importance in task-oriented dialogue systems.
Since the distribution of outlier utterances is arbitrary and unknown in the training stage …

Multitask learning for multilingual intent detection and slot filling in dialogue systems

M Firdaus, A Ekbal, E Cambria - Information Fusion, 2023 - Elsevier
Dialogue systems are becoming a ubiquitous presence in our everyday lives, having a
huge impact on business and society. Spoken language understanding (SLU) is the critical …

Label semantic aware pre-training for few-shot text classification

A Mueller, J Krone, S Romeo, S Mansour… - arXiv preprint arXiv …, 2022 - arxiv.org
In text classification tasks, useful information is encoded in the label names. Label semantic
aware systems have leveraged this information for improved text classification performance …

Space-2: Tree-structured semi-supervised contrastive pre-training for task-oriented dialog understanding

W He, Y Dai, B Hui, M Yang, Z Cao, J Dong… - arXiv preprint arXiv …, 2022 - arxiv.org
Pre-training methods with contrastive learning objectives have shown remarkable success
in dialog understanding tasks. However, current contrastive learning solely considers the …

Incremental few-shot text classification with multi-round new classes: Formulation, dataset and system

C Xia, W Yin, Y Feng, P Yu - arXiv preprint arXiv:2104.11882, 2021 - arxiv.org
Text classification is usually studied by labeling natural language texts with relevant
categories from a predefined set. In the real world, new classes might keep challenging the …

Neural data augmentation via example extrapolation

K Lee, K Guu, L He, T Dozat, HW Chung - arXiv preprint arXiv:2102.01335, 2021 - arxiv.org
In many applications of machine learning, certain categories of examples may be
underrepresented in the training data, causing systems to underperform on such "few-shot" …

Effectiveness of pre-training for few-shot intent classification

H Zhang, Y Zhang, LM Zhan, J Chen, G Shi… - arXiv preprint arXiv …, 2021 - arxiv.org
This paper investigates the effectiveness of pre-training for few-shot intent classification.
While existing paradigms commonly further pre-train language models such as BERT on a …