Learning from noisy labels with deep neural networks: A survey

H Song, M Kim, D Park, Y Shin… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
Deep learning has achieved remarkable success in numerous domains with the help of large
amounts of data. However, the quality of data labels is a concern because of the lack of …

Shortcut learning in deep neural networks

R Geirhos, JH Jacobsen, C Michaelis… - Nature Machine …, 2020 - nature.com
Deep learning has triggered the current rise of artificial intelligence and is the workhorse of
today's machine intelligence. Numerous success stories have rapidly spread all over …

Flexmatch: Boosting semi-supervised learning with curriculum pseudo labeling

B Zhang, Y Wang, W Hou, H Wu… - Advances in …, 2021 - proceedings.neurips.cc
The recently proposed FixMatch achieved state-of-the-art results on most semi-supervised
learning (SSL) benchmarks. However, like other modern SSL algorithms, FixMatch uses a …
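
The entry above centers on curriculum pseudo labeling, which replaces a single fixed confidence threshold with per-class dynamic thresholds derived from each class's current learning status. The sketch below illustrates that thresholding step under a FixMatch-style setup; `probs_weak`, `class_status`, and the convex mapping are illustrative choices, not the authors' reference implementation.

```python
# Minimal sketch of curriculum pseudo labeling (assumed FixMatch-style setup):
# `probs_weak` are softmax outputs on weakly augmented unlabeled data,
# `class_status` counts how many unlabeled samples per class already passed
# the fixed threshold `tau`. All names are illustrative.
import torch

def flexmatch_mask(probs_weak: torch.Tensor, class_status: torch.Tensor, tau: float = 0.95):
    conf, pseudo = probs_weak.max(dim=1)             # confidence and pseudo label per sample
    status = class_status.float()
    beta = status / status.max().clamp(min=1.0)      # normalized per-class learning status
    # convex mapping beta / (2 - beta): under-learned classes get a lower bar,
    # well-learned classes keep a threshold close to tau
    dyn_thresh = tau * beta[pseudo] / (2.0 - beta[pseudo])
    mask = conf >= dyn_thresh                        # class-wise dynamic threshold
    return pseudo, mask

# Usage: the unsupervised loss is cross-entropy between predictions on the
# strongly augmented views and `pseudo`, weighted by `mask`.
```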

Bootstrap your own latent-a new approach to self-supervised learning

JB Grill, F Strub, F Altché, C Tallec… - Advances in neural …, 2020 - proceedings.neurips.cc
We introduce Bootstrap Your Own Latent (BYOL), a new approach to self-supervised
image representation learning. BYOL relies on two neural networks, referred to …
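
The snippet above mentions that BYOL relies on two neural networks. Below is a minimal sketch of that interplay, assuming user-supplied `online` (encoder plus projector), `predictor`, and `target` (an exponential-moving-average copy of `online`) modules; the names are illustrative, not the authors' code.

```python
import copy
import torch
import torch.nn.functional as F

def byol_loss(p, z):
    # negative cosine similarity between online prediction p and target projection z
    p, z = F.normalize(p, dim=-1), F.normalize(z, dim=-1)
    return 2 - 2 * (p * z).sum(dim=-1).mean()

def ema_update(target, online, tau: float = 0.99):
    # the target network slowly tracks the online network; no gradients flow into it
    with torch.no_grad():
        for pt, po in zip(target.parameters(), online.parameters()):
            pt.mul_(tau).add_(po, alpha=1 - tau)

# Typical use per batch, with v1 and v2 two augmented views of the same images:
#   target = copy.deepcopy(online)                      # done once, at initialization
#   loss = byol_loss(predictor(online(v1)), target(v2).detach()) \
#        + byol_loss(predictor(online(v2)), target(v1).detach())
#   loss.backward(); optimizer.step(); ema_update(target, online)
```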

Graph contrastive learning with augmentations

Y You, T Chen, Y Sui, T Chen… - Advances in neural …, 2020 - proceedings.neurips.cc
Generalizable, transferable, and robust representation learning on graph-structured data
remains a challenge for current graph neural networks (GNNs). Unlike what has been …
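
The paper above builds graph-level contrastive learning on stochastic graph augmentations. Here is a minimal sketch of two such augmentations (edge dropping and node attribute masking), assuming a graph stored as an `edge_index` tensor of shape [2, E] and node features `x` of shape [N, F]; the two augmented views would then be encoded by a shared GNN and compared with a contrastive (NT-Xent style) loss. Names are illustrative.

```python
import torch

def drop_edges(edge_index: torch.Tensor, p: float = 0.2):
    keep = torch.rand(edge_index.size(1)) >= p       # keep each edge with probability 1 - p
    return edge_index[:, keep]

def mask_attributes(x: torch.Tensor, p: float = 0.2):
    mask = (torch.rand(x.size(0)) < p).unsqueeze(1)  # choose a subset of nodes to mask
    return x.masked_fill(mask, 0.0)

# Two stochastic views of the same graph:
#   view1 = (mask_attributes(x), drop_edges(edge_index))
#   view2 = (mask_attributes(x), drop_edges(edge_index))
```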

A simple framework for contrastive learning of visual representations

T Chen, S Kornblith, M Norouzi… - … conference on machine …, 2020 - proceedings.mlr.press
This paper presents SimCLR: a simple framework for contrastive learning of visual
representations. We simplify recently proposed contrastive self-supervised learning …
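
The framework above contrasts projected embeddings of two augmented views of each image with an NT-Xent loss. A minimal sketch of that loss follows, assuming `z1` and `z2` are the projections of the two views for the same batch (shape [N, D]); this is an illustration, not the reference implementation.

```python
import torch
import torch.nn.functional as F

def nt_xent(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5):
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2]), dim=1)            # 2N normalized projections
    sim = z @ z.T / temperature                            # pairwise cosine similarities
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))             # a sample is never its own negative
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)                   # the positive pair is the other view
```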

Fixmatch: Simplifying semi-supervised learning with consistency and confidence

K Sohn, D Berthelot, N Carlini… - Advances in neural …, 2020 - proceedings.neurips.cc
Semi-supervised learning (SSL) provides an effective means of leveraging unlabeled data
to improve a model's performance. This domain has seen fast progress recently, at the cost …
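
The title above names the method's two ingredients: confidence-thresholded pseudo labels and consistency under strong augmentation. Below is a minimal sketch of the unlabeled-data objective built from those ingredients, assuming `logits_weak` and `logits_strong` are the model's outputs on weakly and strongly augmented copies of the same unlabeled batch; names are illustrative.

```python
import torch
import torch.nn.functional as F

def fixmatch_unlabeled_loss(logits_weak, logits_strong, tau: float = 0.95):
    with torch.no_grad():
        probs = logits_weak.softmax(dim=1)
        conf, pseudo = probs.max(dim=1)          # hard pseudo label from the weak view
        mask = (conf >= tau).float()             # keep only confident predictions
    loss = F.cross_entropy(logits_strong, pseudo, reduction='none')
    return (loss * mask).mean()                  # consistency enforced on the strong view
```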

Semi-supervised semantic segmentation with cross pseudo supervision

X Chen, Y Yuan, G Zeng… - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
In this paper, we study the semi-supervised semantic segmentation problem by exploring
both labeled data and extra unlabeled data. We propose a novel consistency regularization …
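
The consistency regularization mentioned above is cross pseudo supervision: two segmentation networks supervise each other with their hard pseudo labels on unlabeled images. Here is a minimal sketch of that term, assuming `logits_a` and `logits_b` are per-pixel logits of shape [B, C, H, W] from the two networks; this is illustrative, not the authors' code.

```python
import torch
import torch.nn.functional as F

def cross_pseudo_supervision(logits_a, logits_b):
    pseudo_a = logits_a.detach().argmax(dim=1)   # hard per-pixel pseudo label from network A
    pseudo_b = logits_b.detach().argmax(dim=1)   # ... and from network B
    # each network is trained to match the other's pseudo label
    return F.cross_entropy(logits_a, pseudo_b) + F.cross_entropy(logits_b, pseudo_a)
```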

Big self-supervised models are strong semi-supervised learners

T Chen, S Kornblith, K Swersky… - Advances in neural …, 2020 - proceedings.neurips.cc
One paradigm for learning from few labeled examples while making best use of a large
amount of unlabeled data is unsupervised pretraining followed by supervised fine-tuning …
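
The paradigm described above is a two-stage pipeline: unsupervised pretraining on unlabeled data, then supervised fine-tuning on the few labels. A minimal sketch of the fine-tuning stage, assuming `encoder` has already been pretrained (e.g., with a contrastive loss such as the NT-Xent sketch earlier) and exposes its output width as `encoder.out_dim`; hyperparameters and names are illustrative.

```python
import torch
import torch.nn as nn

def finetune(encoder: nn.Module, num_classes: int, labeled_loader, epochs: int = 10):
    # attach a small classification head to the pretrained encoder
    model = nn.Sequential(encoder, nn.Linear(encoder.out_dim, num_classes))
    opt = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in labeled_loader:    # only the small labeled subset is used here
            opt.zero_grad()
            loss_fn(model(images), labels).backward()
            opt.step()
    return model
```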

Vicreg: Variance-invariance-covariance regularization for self-supervised learning

A Bardes, J Ponce, Y LeCun - arXiv preprint arXiv:2105.04906, 2021 - arxiv.org
Recent self-supervised methods for image representation learning are based on maximizing
the agreement between embedding vectors from different views of the same image. A trivial …
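
The trivial solution alluded to above is representational collapse, which this method counters with three explicit terms: invariance between views, a variance hinge per embedding dimension, and decorrelation of dimensions. A minimal sketch of such a loss follows, assuming `z1` and `z2` are embeddings of two views of the same images (shape [N, D]); the coefficients follow commonly cited defaults and the names are illustrative.

```python
import torch
import torch.nn.functional as F

def vicreg_loss(z1, z2, lam=25.0, mu=25.0, nu=1.0):
    inv = F.mse_loss(z1, z2)                                   # invariance: views should agree

    def variance(z):
        std = torch.sqrt(z.var(dim=0) + 1e-4)
        return torch.relu(1.0 - std).mean()                    # hinge keeps each dimension's std above 1

    def covariance(z):
        z = z - z.mean(dim=0)
        n, d = z.shape
        cov = (z.T @ z) / (n - 1)
        off_diag = cov - torch.diag(torch.diag(cov))
        return (off_diag ** 2).sum() / d                       # decorrelate embedding dimensions

    return lam * inv + mu * (variance(z1) + variance(z2)) + nu * (covariance(z1) + covariance(z2))
```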