PTR: Prompt tuning with rules for text classification
Recently, prompt tuning has been widely applied to stimulate the rich knowledge in pre-
trained language models (PLMs) to serve NLP tasks. Although prompt tuning has achieved …
Few-shot intent detection via contrastive pre-training and fine-tuning
In this work, we focus on a more challenging few-shot intent detection scenario where many
intents are fine-grained and semantically similar. We present a simple yet effective few-shot …
New intent discovery with pre-training and contrastive learning
New intent discovery aims to uncover novel intent categories from user utterances to expand
the set of supported intent classes. It is a critical task for the development and service …
Out-of-scope intent detection with self-supervision and discriminative training
Out-of-scope intent detection is of practical importance in task-oriented dialogue systems.
Since the distribution of outlier utterances is arbitrary and unknown in the training stage …
Multitask learning for multilingual intent detection and slot filling in dialogue systems
Dialogue systems are becoming a ubiquitous presence in our everyday lives, having a
huge impact on business and society. Spoken language understanding (SLU) is the critical …
Label semantic aware pre-training for few-shot text classification
In text classification tasks, useful information is encoded in the label names. Label semantic
aware systems have leveraged this information for improved text classification performance …
Space-2: Tree-structured semi-supervised contrastive pre-training for task-oriented dialog understanding
Pre-training methods with contrastive learning objectives have shown remarkable success
in dialog understanding tasks. However, current contrastive learning solely considers the …
Incremental few-shot text classification with multi-round new classes: Formulation, dataset and system
Text classification is usually studied by labeling natural language texts with relevant
categories from a predefined set. In the real world, new classes might keep challenging the …
Neural data augmentation via example extrapolation
In many applications of machine learning, certain categories of examples may be
underrepresented in the training data, causing systems to underperform on such "few-shot" …
Effectiveness of pre-training for few-shot intent classification
This paper investigates the effectiveness of pre-training for few-shot intent classification.
While existing paradigms commonly further pre-train language models such as BERT on a …