DASO: Distribution-aware semantics-oriented pseudo-label for imbalanced semi-supervised learning

Y Oh, DJ Kim, IS Kweon - … of the IEEE/CVF conference on …, 2022 - openaccess.thecvf.com
Traditional semi-supervised learning (SSL) methods remain far from real-world
applicability due to severely biased pseudo-labels caused by (1) class imbalance and …

A graph-theoretic framework for understanding open-world semi-supervised learning

Y Sun, Z Shi, Y Li - Advances in Neural Information …, 2023 - proceedings.neurips.cc
Open-world semi-supervised learning aims to infer both known and novel classes in
unlabeled data by harnessing prior knowledge from a labeled set with known classes …

Robust semi-supervised learning by wisely leveraging open-set data

Y Yang, N Jiang, Y Xu, DC Zhan - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
Open-set Semi-supervised Learning (OSSL) considers a realistic setting in which unlabeled
data may come from classes unseen in the labeled set, i.e., out-of-distribution (OOD) data, which …
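
Since several entries in this listing hinge on the OOD notion, a minimal sketch may help. Assuming a PyTorch classifier trained on the known classes, one common generic heuristic scores each unlabeled sample by its maximum softmax probability and splits likely in-distribution from likely OOD data; the function names and threshold below are illustrative, not taken from the paper:

```python
import torch
import torch.nn.functional as F

def split_unlabeled(model, unlabeled, ood_threshold=0.5):
    """Split an unlabeled batch by maximum softmax probability (MSP)."""
    with torch.no_grad():
        probs = F.softmax(model(unlabeled), dim=-1)
        msp, _ = probs.max(dim=-1)                 # confidence under known classes
    in_dist = unlabeled[msp >= ood_threshold]      # likely known-class samples
    ood = unlabeled[msp < ood_threshold]           # likely out-of-distribution
    return in_dist, ood
```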

OpenCon: Open-world contrastive learning

Y Sun, Y Li - arXiv preprint arXiv:2208.02764, 2022 - arxiv.org
Machine learning models deployed in the wild naturally encounter unlabeled samples from
both known and novel classes. Challenges arise in learning from both the labeled and …

Unified dialog model pre-training for task-oriented dialog understanding and generation

W He, Y Dai, M Yang, J Sun, F Huang, L Si… - Proceedings of the 45th …, 2022 - dl.acm.org
Recently, pre-training methods have shown remarkable success in task-oriented dialog
(TOD) systems. However, most existing pre-trained models for TOD focus on either dialog …

S-CLIP: Semi-supervised vision-language learning using few specialist captions

S Mo, M Kim, K Lee, J Shin - Advances in Neural …, 2023 - proceedings.neurips.cc
Vision-language models, such as contrastive language-image pre-training (CLIP), have
demonstrated impressive results in natural image domains. However, these models often …

SPACE-2: Tree-structured semi-supervised contrastive pre-training for task-oriented dialog understanding

W He, Y Dai, B Hui, M Yang, Z Cao, J Dong… - arXiv preprint arXiv …, 2022 - arxiv.org
Pre-training methods with contrastive learning objectives have shown remarkable success
in dialog understanding tasks. However, current contrastive learning solely considers the …

MarginMatch: Improving semi-supervised learning with pseudo-margins

T Sosea, C Caragea - … of the IEEE/CVF conference on …, 2023 - openaccess.thecvf.com
We introduce MarginMatch, a new SSL approach combining consistency regularization and
pseudo-labeling, with its main novelty arising from the use of unlabeled data training …
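
For orientation, the base recipe MarginMatch combines can be sketched in a few lines. Below is a minimal, generic illustration of confidence-thresholded pseudo-labeling plus consistency regularization (the FixMatch-style scheme); it does not implement MarginMatch's pseudo-margin criterion, and the function and argument names are illustrative:

```python
import torch
import torch.nn.functional as F

def unlabeled_loss(model, weak_batch, strong_batch, threshold=0.95):
    """Generic pseudo-labeling + consistency loss (FixMatch-style sketch)."""
    with torch.no_grad():
        probs = F.softmax(model(weak_batch), dim=-1)   # predict on weak views
        conf, pseudo = probs.max(dim=-1)               # confidence and hard label
        mask = (conf >= threshold).float()             # keep only confident samples
    logits_strong = model(strong_batch)                # predict on strong views
    loss = F.cross_entropy(logits_strong, pseudo, reduction="none")
    return (loss * mask).mean()                        # masked consistency loss
```

Per the snippet above, MarginMatch's novelty lies in how unlabeled examples are vetted using their behavior during training rather than a single confidence snapshot; the thresholding line is where such a criterion would plug in.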

SSB: Simple but strong baseline for boosting performance of open-set semi-supervised learning

Y Fan, A Kukleva, D Dai… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Semi-supervised learning (SSL) methods effectively leverage unlabeled data to improve
model generalization. However, SSL models often underperform in open-set scenarios …

Semi-supervised learning via weight-aware distillation under class distribution mismatch

P Du, S Zhao, Z Sheng, C Li… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Semi-Supervised Learning (SSL) under class distribution mismatch aims to tackle a
challenging problem wherein unlabeled data contain many unknown categories unseen in …