DASO: Distribution-aware semantics-oriented pseudo-label for imbalanced semi-supervised learning
The capability of the traditional semi-supervised learning (SSL) methods is far from real-world application due to severely biased pseudo-labels caused by (1) class imbalance and …
A graph-theoretic framework for understanding open-world semi-supervised learning
Open-world semi-supervised learning aims at inferring both known and novel classes in
unlabeled data, by harnessing prior knowledge from a labeled set with known classes …
Robust semi-supervised learning by wisely leveraging open-set data
Open-set Semi-supervised Learning (OSSL) holds a realistic setting that unlabeled data
may come from classes unseen in the labeled set, i.e., out-of-distribution (OOD) data, which …
OpenCon: Open-world contrastive learning
Machine learning models deployed in the wild naturally encounter unlabeled samples from
both known and novel classes. Challenges arise in learning from both the labeled and …
Unified dialog model pre-training for task-oriented dialog understanding and generation
Recently, pre-training methods have shown remarkable success in task-oriented dialog
(TOD) systems. However, most existing pre-trained models for TOD focus on either dialog …
S-CLIP: Semi-supervised vision-language learning using few specialist captions
Vision-language models, such as contrastive language-image pre-training (CLIP), have
demonstrated impressive results in natural image domains. However, these models often …
SPACE-2: Tree-structured semi-supervised contrastive pre-training for task-oriented dialog understanding
Pre-training methods with contrastive learning objectives have shown remarkable success
in dialog understanding tasks. However, current contrastive learning solely considers the …
MarginMatch: Improving semi-supervised learning with pseudo-margins
We introduce MarginMatch, a new SSL approach combining consistency regularization and
pseudo-labeling, with its main novelty arising from the use of unlabeled data training …
SSB: Simple but strong baseline for boosting performance of open-set semi-supervised learning
Semi-supervised learning (SSL) methods effectively leverage unlabeled data to improve
model generalization. However, SSL models often underperform in open-set scenarios …
Semi-supervised learning via weight-aware distillation under class distribution mismatch
Semi-Supervised Learning (SSL) under class distribution mismatch aims to tackle a challenging problem wherein unlabeled data contain lots of unknown categories unseen in …