Learning from noisy labels with deep neural networks: A survey

H Song, M Kim, D Park, Y Shin… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Deep learning has achieved remarkable success in numerous domains with help from large
amounts of big data. However, the quality of data labels is a concern because of the lack of …

Trustworthy AI: From principles to practices

B Li, P Qi, B Liu, S Di, J Liu, J Pei, J Yi… - ACM Computing Surveys, 2023 - dl.acm.org
The rapid development of Artificial Intelligence (AI) technology has enabled the deployment
of various systems based on it. However, many current AI systems are found vulnerable to …

Image classification with deep learning in the presence of noisy labels: A survey

G Algan, I Ulusoy - Knowledge-Based Systems, 2021 - Elsevier
Image classification systems recently made a giant leap with the advancement of deep
neural networks. However, these systems require an excessive amount of labeled data to be …

Generalization in deep learning

K Kawaguchi, LP Kaelbling, Y Bengio - arXiv preprint arXiv …, 2017 - cambridge.org
This chapter provides theoretical insights into why and how deep learning can generalize
well, despite its large capacity, complexity, possible algorithmic instability, non-robustness …

Dimensionality-driven learning with noisy labels

X Ma, Y Wang, ME Houle, S Zhou… - International …, 2018 - proceedings.mlr.press
Datasets with significant proportions of noisy (incorrect) class labels present challenges for
training accurate Deep Neural Networks (DNNs). We propose a new perspective for …

A Bayesian perspective on generalization and stochastic gradient descent

SL Smith, QV Le - arXiv preprint arXiv:1710.06451, 2017 - arxiv.org
We consider two questions at the heart of machine learning: how can we predict if a
minimum will generalize to the test set, and why does stochastic gradient descent find …

HESS Opinions: Incubating deep-learning-powered hydrologic science advances as a community

C Shen, E Laloy, A Elshorbagy, A Albert… - Hydrology and Earth …, 2018 - hess.copernicus.org
Recently, deep learning (DL) has emerged as a revolutionary and versatile tool transforming
industry applications and generating new and improved capabilities for scientific discovery …

Implicit self-regularization in deep neural networks: Evidence from random matrix theory and implications for learning

CH Martin, MW Mahoney - Journal of Machine Learning Research, 2021 - jmlr.org
Random Matrix Theory (RMT) is applied to analyze the weight matrices of Deep Neural
Networks (DNNs), including both production quality, pre-trained models such as AlexNet …

An information-theoretic perspective on overfitting and underfitting

D Bashir, GD Montañez, S Sehra, PS Segura… - AI 2020: Advances in …, 2020 - Springer
We present an information-theoretic framework for understanding overfitting and underfitting
in machine learning and prove the formal undecidability of determining whether an arbitrary …

Label noise types and their effects on deep learning

G Algan, I Ulusoy - arXiv preprint arXiv:2003.10471, 2020 - arxiv.org
The recent success of deep learning is mostly due to the availability of big datasets with
clean annotations. However, gathering a cleanly annotated dataset is not always feasible …