Learning from noisy labels with deep neural networks: A survey
Deep learning has achieved remarkable success in numerous domains with help from large amounts of big data. However, the quality of data labels is a concern because of the lack of …
The METRIC-framework for assessing data quality for trustworthy AI in medicine: a systematic review
The adoption of machine learning (ML) and, more specifically, deep learning (DL) applications into all major areas of our lives is underway. The development of trustworthy AI …
Robust training under label noise by over-parameterization
Recently, over-parameterized deep networks, with increasingly more network parameters than training samples, have dominated the performances of modern machine learning …
Benchmarking uncertainty disentanglement: Specialized uncertainties for specialized tasks
Uncertainty quantification, once a singular task, has evolved into a spectrum of tasks, including abstained prediction, out-of-distribution detection, and aleatoric uncertainty …
Estimating noise transition matrix with label correlations for noisy multi-label learning
In label-noise learning, the noise transition matrix, bridging the class posterior for noisy and clean data, has been widely exploited to learn statistically consistent classifiers. The …
Label-free node classification on graphs with large language models (LLMs)
In recent years, there have been remarkable advancements in node classification achieved by Graph Neural Networks (GNNs). However, they necessitate abundant high-quality labels …
Fine-grained classification with noisy labels
Learning with noisy labels (LNL) aims to ensure model generalization given a label-corrupted training set. In this work, we investigate a rarely studied scenario of LNL on fine …
Targeted representation alignment for open-world semi-supervised learning
Open-world Semi-Supervised Learning aims to classify unlabeled samples utilizing information from labeled data while unlabeled samples are not only from the labeled known …
Asymmetric loss functions for noise-tolerant learning: Theory and applications
Supervised deep learning has achieved tremendous success in many computer vision tasks, but it is prone to overfitting noisy labels. To mitigate the undesirable influence of …
Robust data pruning under label noise via maximizing re-labeling accuracy
Data pruning, which aims to downsize a large training set into a small informative subset, is crucial for reducing the enormous computational costs of modern deep learning. Though …