CRIS: CLIP-driven referring image segmentation

Z Wang, Y Lu, Q Li, X Tao, Y Guo… - Proceedings of the …, 2022 - openaccess.thecvf.com
Referring image segmentation aims to segment a referent via a natural linguistic expression.
Due to the distinct data properties between text and image, it is challenging for a network to …

Twin contrastive learning with noisy labels

Z Huang, J Zhang, H Shan - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Learning from noisy data is a challenging task that significantly degenerates the model
performance. In this paper, we present TCL, a novel twin contrastive learning model to learn …

Estimating noise transition matrix with label correlations for noisy multi-label learning

S Li, X Xia, H Zhang, Y Zhan… - Advances in Neural …, 2022 - proceedings.neurips.cc
In label-noise learning, the noise transition matrix, bridging the class posterior for noisy and
clean data, has been widely exploited to learn statistically consistent classifiers. The …

HumanTOMATO: Text-aligned whole-body motion generation

S Lu, LH Chen, A Zeng, J Lin, R Zhang, L Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
This work targets a novel text-driven whole-body motion generation task, which takes a
given textual description as input and aims at generating high-quality, diverse, and coherent …

Fine-grained classification with noisy labels

Q Wei, L Feng, H Sun, R Wang… - Proceedings of the …, 2023 - openaccess.thecvf.com
Learning with noisy labels (LNL) aims to ensure model generalization given a label-
corrupted training set. In this work, we investigate a rarely studied scenario of LNL on fine …

Cross-to-merge training with class balance strategy for learning with noisy labels

Q Zhang, Y Zhu, M Yang, G Jin, YW Zhu… - Expert Systems with …, 2024 - Elsevier
The collection of large-scale datasets inevitably introduces noisy labels, leading to a
substantial degradation in the performance of deep neural networks (DNNs). Although …

Rank-N-Contrast: Learning continuous representations for regression

K Zha, P Cao, J Son, Y Yang… - Advances in Neural …, 2023 - proceedings.neurips.cc
Deep regression models typically learn in an end-to-end fashion without explicitly
emphasizing a regression-aware representation. Consequently, the learned representations …

OpenCon: Open-world contrastive learning

Y Sun, Y Li - arXiv preprint arXiv:2208.02764, 2022 - arxiv.org
Machine learning models deployed in the wild naturally encounter unlabeled samples from
both known and novel classes. Challenges arise in learning from both the labeled and …

Like draws to like: A multi-granularity ball-intra fusion approach for fault diagnosis models to resist misleading by noisy labels

F Dunkin, X Li, C Hu, G Wu, H Li, X Lu… - Advanced Engineering …, 2024 - Elsevier
Although data-driven fault diagnosis methods have achieved remarkable results, these
achievements often rely on high-quality datasets without noisy labels, which can mislead the …

Sample self-selection using dual teacher networks for pathological image classification with noisy labels

G Han, W Guo, H Zhang, J Jin, X Gan, X Zhao - Computers in biology and …, 2024 - Elsevier
Deep neural networks (DNNs) involve advanced image processing but depend on large
quantities of high-quality labeled data. The presence of noisy data significantly degrades the …