CRIS: CLIP-driven referring image segmentation
Referring image segmentation aims to segment a referent via a natural linguistic expression. Due to the distinct data properties between text and image, it is challenging for a network to …
Twin contrastive learning with noisy labels
Learning from noisy data is a challenging task that significantly degenerates the model performance. In this paper, we present TCL, a novel twin contrastive learning model to learn …
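The snippet above is cut off before any method details, so as rough orientation the sketch below shows a plain NT-Xent-style contrastive loss between two augmented views, the generic building block that contrastive noisy-label methods typically start from. It is not TCL's twin formulation; the function and variable names are illustrative only.

```python
import torch
import torch.nn.functional as F

def ntxent_loss(z1, z2, temperature=0.5):
    """Generic NT-Xent contrastive loss between two views (illustrative sketch,
    not the TCL objective). z1, z2: (N, D) embeddings of two augmentations."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                       # (2N, D)
    sim = z @ z.t() / temperature                        # cosine similarities
    n = z1.size(0)
    # Mask out self-similarity so it never acts as a positive or negative.
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float('-inf'))
    # The positive for sample i is its other augmented view (index i+n or i-n).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```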
Estimating noise transition matrix with label correlations for noisy multi-label learning
In label-noise learning, the noise transition matrix, bridging the class posterior for noisy and clean data, has been widely exploited to learn statistically consistent classifiers. The …
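For readers unfamiliar with the transition-matrix idea named in the snippet, the sketch below shows standard forward loss correction: the model's clean-class posterior is pushed through a transition matrix T, with T[i, j] = P(noisy = j | clean = i), before computing the loss against the observed noisy label. The cited work is about estimating such a matrix under label correlations in the multi-label setting; this single-label sketch, which assumes T is given, only illustrates the underlying mechanism.

```python
import torch
import torch.nn.functional as F

def forward_corrected_loss(logits, noisy_labels, T):
    """Forward loss correction with a noise transition matrix (illustrative sketch).

    logits:       (N, C) raw model outputs over the clean classes
    noisy_labels: (N,)   observed, possibly corrupted labels
    T:            (C, C) row-stochastic matrix, T[i, j] = P(noisy=j | clean=i),
                  assumed known here; the cited paper focuses on estimating it.
    """
    clean_posterior = F.softmax(logits, dim=1)     # model's P(clean class | x)
    noisy_posterior = clean_posterior @ T          # P(noisy class | x) = p^T T
    return F.nll_loss(torch.log(noisy_posterior + 1e-12), noisy_labels)
```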
HumanTOMATO: Text-aligned whole-body motion generation
This work targets a novel text-driven whole-body motion generation task, which takes a given textual description as input and aims at generating high-quality, diverse, and coherent …
Fine-grained classification with noisy labels
Learning with noisy labels (LNL) aims to ensure model generalization given a label-corrupted training set. In this work, we investigate a rarely studied scenario of LNL on fine …
Cross-to-merge training with class balance strategy for learning with noisy labels
The collection of large-scale datasets inevitably introduces noisy labels, leading to a substantial degradation in the performance of deep neural networks (DNNs). Although …
Rank-N-Contrast: Learning continuous representations for regression
Deep regression models typically learn in an end-to-end fashion without explicitly emphasizing a regression-aware representation. Consequently, the learned representations …
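To make the idea of a regression-aware representation more concrete, the sketch below implements a label-distance ranked contrastive loss in the spirit of the title: for an anchor and a chosen positive, only samples whose labels are at least as far from the anchor's label are kept in the denominator. This is a simplified reading of the general idea, not a verified reproduction of the paper's exact objective; the names and temperature are assumptions, and the double loop is written for clarity rather than speed.

```python
import torch
import torch.nn.functional as F

def rank_contrast_loss(features, labels, temperature=0.1):
    """Sketch of a label-distance ranked contrastive loss for regression
    (simplified illustration, not necessarily the published formulation).

    features: (N, D) embeddings, labels: (N,) continuous targets.
    """
    z = F.normalize(features, dim=1)
    sim = z @ z.t() / temperature                            # (N, N) similarities
    label_dist = (labels[:, None] - labels[None, :]).abs()   # (N, N) label distances
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    loss = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # Denominator set for pair (i, j): samples at least as far from i as j is.
            neg_mask = (label_dist[i] >= label_dist[i, j]) & ~eye[i]
            denom = torch.logsumexp(sim[i][neg_mask], dim=0)
            loss = loss + (denom - sim[i, j])
    return loss / (n * (n - 1))
```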
OpenCon: Open-world contrastive learning
Machine learning models deployed in the wild naturally encounter unlabeled samples from both known and novel classes. Challenges arise in learning from both the labeled and …
Like draws to like: A multi-granularity ball-intra fusion approach for fault diagnosis models to resist misleading by noisy labels
Although data-driven fault diagnosis methods have achieved remarkable results, these achievements often rely on high-quality datasets without noisy labels, which can mislead the …
Sample self-selection using dual teacher networks for pathological image classification with noisy labels
G Han, W Guo, H Zhang, J **, X Gan, X Zhao - Computers in biology and …, 2024 - Elsevier
Deep neural networks (DNNs) involve advanced image processing but depend on large quantities of high-quality labeled data. The presence of noisy data significantly degrades the …