ConDA: Contrastive domain adaptation for AI-generated text detection
Large language models (LLMs) are increasingly being used for generating text in a variety
of use cases, including journalistic news articles. Given the potential malicious nature in …
Retrieval-style in-context learning for few-shot hierarchical text classification
Hierarchical text classification (HTC) is an important task with broad applications, and few-
shot HTC has gained increasing interest recently. While in-context learning (ICL) with large …
Effective structured prompting by meta-learning and representative verbalizer
Prompt tuning for pre-trained masked language models (MLM) has shown promising
performance in natural language processing tasks with few labeled examples. It tunes a …
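As a rough illustration of what prompt tuning a frozen masked language model involves (not the meta-learning or representative-verbalizer method of the entry above), the sketch below prepends a small set of trainable soft-prompt embeddings to a HuggingFace BERT backbone and reads class scores off the [MASK] position. The backbone name, prompt length, template, and label words are illustrative assumptions.

    # Minimal sketch of soft prompt tuning for a masked language model
    # (PyTorch / HuggingFace transformers). Backbone, prompt length, template,
    # and label words are illustrative assumptions, not from the cited paper.
    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    model_name = "bert-base-uncased"              # assumed backbone
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    mlm = AutoModelForMaskedLM.from_pretrained(model_name)
    mlm.requires_grad_(False)                     # freeze the pre-trained MLM

    prompt_len = 10
    embed = mlm.get_input_embeddings()
    soft_prompt = torch.nn.Parameter(             # the only trainable parameters
        torch.randn(prompt_len, embed.embedding_dim) * 0.02)

    def class_scores(texts):
        # Wrap each text in a simple template with a [MASK] slot.
        enc = tokenizer([f"{t} It was {tokenizer.mask_token}." for t in texts],
                        return_tensors="pt", padding=True)
        tok_emb = embed(enc.input_ids)                               # (B, L, d)
        B = tok_emb.size(0)
        inputs = torch.cat([soft_prompt.expand(B, -1, -1), tok_emb], dim=1)
        attn = torch.cat([torch.ones(B, prompt_len, dtype=enc.attention_mask.dtype),
                          enc.attention_mask], dim=1)
        logits = mlm(inputs_embeds=inputs, attention_mask=attn).logits
        # Naive verbalizer: score two label words at each example's [MASK] position.
        mask_pos = (enc.input_ids == tokenizer.mask_token_id).nonzero()[:, 1] + prompt_len
        label_ids = tokenizer.convert_tokens_to_ids(["great", "terrible"])
        return logits[torch.arange(B), mask_pos][:, label_ids]      # (B, 2)

Training would update only soft_prompt (for example with cross-entropy over the label-word scores), which is what makes this family of methods attractive when only a few labeled examples are available.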
Heterogeneous contrastive learning for foundation models and beyond
In the era of big data and Artificial Intelligence, an emerging paradigm is to utilize contrastive
self-supervised learning to model large-scale heterogeneous data. Many existing foundation …
MetaPrompting: Learning to learn better prompts
Prompting is regarded as one of the crucial advances in few-shot natural language
processing. Recent research on prompting moves from discrete token-based "hard …
SGCL-LncLoc: an interpretable deep learning model for improving lncRNA subcellular localization prediction with supervised graph contrastive learning
Understanding the subcellular localization of long non-coding RNAs (lncRNAs) is crucial for
unraveling their functional mechanisms. While previous computational methods have made …
MGML: Momentum group meta-learning for few-shot image classification
X Zhu, S Li - Neurocomputing, 2022 - Elsevier
At present, image classification covers more and more fields, and it is often difficult to obtain
enough data for learning in some specific scenarios, such as medical fields, personalized …
Boosting few-shot text classification via distribution estimation
Distribution estimation has been demonstrated as one of the most effective approaches in
dealing with few-shot image classification, as the low-level patterns and underlying …
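For context, distribution estimation in few-shot classification usually means fitting a simple per-class distribution to the few support embeddings and sampling virtual features from it to train a classifier. The sketch below is a generic Gaussian version of that idea with an assumed covariance-shrinkage term; it is not the estimation procedure of the entry above.

    # Illustrative sketch of distribution estimation for few-shot classification:
    # fit a Gaussian to each class's support embeddings and draw virtual features.
    # Generic illustration only; assumes at least two support examples per class.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def augment_by_sampling(support_x, support_y, n_samples=100, shrink=0.1):
        """support_x: (N, d) embeddings; support_y: (N,) integer labels."""
        xs, ys = [], []
        for c in np.unique(support_y):
            feats = support_x[support_y == c]
            mu = feats.mean(axis=0)
            # Shrink the covariance toward the identity to stay stable with few shots.
            cov = np.cov(feats, rowvar=False) + shrink * np.eye(feats.shape[1])
            xs.append(np.random.multivariate_normal(mu, cov, size=n_samples))
            ys.append(np.full(n_samples, c))
        return np.vstack(xs + [support_x]), np.concatenate(ys + [support_y])

    # A simple linear classifier trained on real plus sampled features then
    # serves as the few-shot predictor:
    # clf = LogisticRegression(max_iter=1000).fit(*augment_by_sampling(Xs, ys))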
Supervised contrastive learning with hard negative samples
Through minimization of an appropriate loss function such as the InfoNCE loss, contrastive
learning (CL) learns a useful representation function by pulling positive samples close to …
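To make the InfoNCE mechanism mentioned above concrete, here is a minimal supervised InfoNCE-style loss in PyTorch that pulls same-label samples together and pushes the rest apart. It is a generic sketch, assumes L2-normalized input embeddings, and does not implement the hard-negative sampling scheme of the cited paper.

    # Minimal supervised InfoNCE-style contrastive loss (generic sketch).
    import torch

    def supervised_info_nce(features, labels, temperature=0.1):
        """features: (N, d) L2-normalized embeddings; labels: (N,) class ids."""
        sim = features @ features.T / temperature          # pairwise similarities
        # Exclude self-similarity on the diagonal.
        self_mask = torch.eye(len(features), dtype=torch.bool, device=features.device)
        sim = sim.masked_fill(self_mask, float('-inf'))
        # Positives: other samples that share the anchor's label.
        pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
        # Log-softmax over each anchor's row, then average over its positives.
        log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
        pos_sum = torch.where(pos_mask, log_prob, torch.zeros_like(log_prob)).sum(dim=1)
        pos_counts = pos_mask.sum(dim=1)
        valid = pos_counts > 0                             # anchors with no positive are skipped
        return (-pos_sum[valid] / pos_counts[valid]).mean()

The temperature and the normalization of the embeddings are the usual knobs here; hard-negative variants reweight or resample the denominator terms rather than changing this basic structure.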
Enhancing coherence and diversity in multi-class slogan generation systems
Many problems related to natural language processing are solved by neural networks and
big data. Researchers have previously focused on single-task supervised goals with limited …