A comprehensive survey on test-time adaptation under distribution shifts

J Liang, R He, T Tan - International Journal of Computer Vision, 2024 - Springer
Abstract Machine learning methods strive to acquire a robust model during the training
process that can effectively generalize to test samples, even in the presence of distribution …
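
The survey covers many adaptation strategies; one representative family is test-time entropy minimization, where a deployed model updates a small set of parameters on each unlabeled test batch. The sketch below is a generic illustration of that idea (in the spirit of Tent-like methods), not a method proposed by the survey itself; the function names and the choice to adapt only normalization-layer parameters are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def entropy_minimization_step(model: nn.Module, x_test: torch.Tensor,
                              optimizer: torch.optim.Optimizer) -> torch.Tensor:
    """One test-time adaptation step: reduce prediction entropy on an unlabeled, shifted test batch."""
    logits = model(x_test)                                   # (B, C) logits on the test batch
    log_probs = F.log_softmax(logits, dim=1)
    probs = log_probs.exp()
    entropy = -(probs * log_probs).sum(dim=1).mean()         # mean Shannon entropy of the predictions
    optimizer.zero_grad()
    entropy.backward()
    optimizer.step()
    return logits.detach()

# Commonly only normalization-layer affine parameters are updated, e.g.:
# params = [p for m in model.modules() if isinstance(m, nn.BatchNorm2d) for p in m.parameters()]
# optimizer = torch.optim.SGD(params, lr=1e-3)
```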

Deep clustering: A comprehensive survey

Y Ren, J Pu, Z Yang, J Xu, G Li, X Pu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Cluster analysis plays an indispensable role in machine learning and data mining. Learning
a good data representation is crucial for clustering algorithms. Recently, deep clustering …
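
As one concrete instance of the representation-learning-plus-clustering loop such surveys discuss, the sketch below shows a DEC-style objective: Student's-t soft cluster assignments, a sharpened target distribution, and a KL term that refines embeddings and centroids jointly. The function names and the choice of DEC as the example are assumptions here, not the survey's own taxonomy; in practice the centroids would be learnable parameters alongside the encoder.

```python
import torch
import torch.nn.functional as F

def soft_assignments(z: torch.Tensor, centroids: torch.Tensor, alpha: float = 1.0) -> torch.Tensor:
    """Student's-t soft assignments q (N, K) from embeddings z (N, d) and cluster centroids (K, d)."""
    dist_sq = torch.cdist(z, centroids).pow(2)
    q = (1.0 + dist_sq / alpha).pow(-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q: torch.Tensor) -> torch.Tensor:
    """Sharpened target p that up-weights confident assignments and normalizes per cluster."""
    weight = q.pow(2) / q.sum(dim=0, keepdim=True)
    return weight / weight.sum(dim=1, keepdim=True)

def clustering_loss(z: torch.Tensor, centroids: torch.Tensor) -> torch.Tensor:
    """KL(P || Q): pulling soft assignments toward their sharpened targets refines encoder and centroids."""
    q = soft_assignments(z, centroids)
    p = target_distribution(q).detach()                       # target treated as fixed for this update
    return F.kl_div(q.log(), p, reduction='batchmean')
```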

Self-supervised learning from images with a joint-embedding predictive architecture

M Assran, Q Duval, I Misra… - Proceedings of the …, 2023 - openaccess.thecvf.com
This paper demonstrates an approach for learning highly semantic image representations
without relying on hand-crafted data-augmentations. We introduce the Image-based Joint …
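
The core idea is to predict the representations of masked target blocks from the representation of a visible context block, with the loss computed in embedding space rather than pixel space. The class below is a heavily simplified, hypothetical rendering of that joint-embedding predictive setup: the real architecture uses Vision Transformer encoders, positional mask tokens, and multi-block masking, none of which appear here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointEmbeddingPredictive(nn.Module):
    """Toy joint-embedding predictive setup: predict target-block embeddings from context embeddings."""
    def __init__(self, dim: int = 256):
        super().__init__()
        self.context_encoder = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
        self.target_encoder = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
        self.predictor = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
        # The target encoder starts as a copy of the context encoder and is updated only by EMA.
        self.target_encoder.load_state_dict(self.context_encoder.state_dict())
        for p in self.target_encoder.parameters():
            p.requires_grad_(False)

    def forward(self, context_feats: torch.Tensor, target_feats: torch.Tensor) -> torch.Tensor:
        # context_feats / target_feats: (B, dim) pooled features of the visible and masked blocks
        pred = self.predictor(self.context_encoder(context_feats))
        with torch.no_grad():
            tgt = self.target_encoder(target_feats)
        return F.mse_loss(pred, tgt)                          # loss lives in representation space, not pixels

    @torch.no_grad()
    def ema_update(self, momentum: float = 0.996) -> None:
        for pc, pt in zip(self.context_encoder.parameters(), self.target_encoder.parameters()):
            pt.mul_(momentum).add_((1.0 - momentum) * pc)
```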

Unsupervised data augmentation for consistency training

Q **e, Z Dai, E Hovy, T Luong… - Advances in neural …, 2020 - proceedings.neurips.cc
Semi-supervised learning lately has shown much promise in improving deep learning
models when labeled data is scarce. Common among recent approaches is the use of …
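
The common pattern the abstract alludes to is a supervised loss on labeled data plus a consistency term that pushes the prediction on a strongly augmented unlabeled example toward the prediction on its clean counterpart. The function below is a minimal hedged sketch of that pattern; its signature is hypothetical, and the paper's confidence masking and prediction sharpening are omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def uda_loss(model: nn.Module,
             x_labeled: torch.Tensor, y_labeled: torch.Tensor,
             x_unlabeled: torch.Tensor, x_unlabeled_aug: torch.Tensor,
             lambda_u: float = 1.0) -> torch.Tensor:
    """Supervised cross-entropy plus consistency between clean and strongly augmented unlabeled data."""
    supervised = F.cross_entropy(model(x_labeled), y_labeled)
    with torch.no_grad():                                     # the clean prediction acts as a fixed target
        p_clean = F.softmax(model(x_unlabeled), dim=1)
    log_p_aug = F.log_softmax(model(x_unlabeled_aug), dim=1)
    consistency = F.kl_div(log_p_aug, p_clean, reduction='batchmean')
    return supervised + lambda_u * consistency
```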

Learning deep representations by mutual information estimation and maximization

RD Hjelm, A Fedorov, S Lavoie-Marchildon… - arXiv preprint arXiv …, 2018 - arxiv.org
In this work, we perform unsupervised learning of representations by maximizing mutual
information between an input and the output of a deep neural network encoder. Importantly …
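
Maximizing mutual information between an input and its encoding requires a tractable lower bound. The sketch below uses an InfoNCE-style bound over paired representations as one common choice; Deep InfoMax itself combines global and local objectives with a Jensen-Shannon estimator, so treat this only as an illustrative stand-in with hypothetical names.

```python
import math
import torch
import torch.nn.functional as F

def infonce_bound(z_x: torch.Tensor, z_y: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE lower bound on I(X; Y) from paired encodings z_x[i] <-> z_y[i], both of shape (B, d)."""
    z_x = F.normalize(z_x, dim=1)
    z_y = F.normalize(z_y, dim=1)
    logits = (z_x @ z_y.t()) / temperature                    # (B, B) similarity scores
    labels = torch.arange(z_x.size(0), device=z_x.device)     # the true pair sits on the diagonal
    ce = F.cross_entropy(logits, labels)                      # identify which y belongs to each x
    return math.log(z_x.size(0)) - ce                         # log(B) - CE is the InfoNCE bound
```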

Do we really need to access the source data? Source hypothesis transfer for unsupervised domain adaptation

J Liang, D Hu, J Feng - International conference on machine …, 2020 - proceedings.mlr.press
Unsupervised domain adaptation (UDA) aims to leverage the knowledge learned from a
labeled source dataset to solve similar tasks in a new unlabeled domain. Prior UDA …
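
SHOT-style adaptation keeps the source classifier (the "hypothesis") frozen and tunes only the feature extractor on unlabeled target data, typically with an information-maximization objective: confident per-sample predictions plus a diverse marginal over classes. The sketch below illustrates that objective; the function name is hypothetical and the paper's additional self-supervised pseudo-labeling step is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def information_maximization_loss(feature_extractor: nn.Module,
                                  frozen_classifier: nn.Module,
                                  x_target: torch.Tensor) -> torch.Tensor:
    """Adapt only the feature extractor so target predictions are confident yet spread across classes."""
    logits = frozen_classifier(feature_extractor(x_target))   # the source hypothesis (classifier) stays fixed
    log_p = F.log_softmax(logits, dim=1)
    p = log_p.exp()
    entropy = -(p * log_p).sum(dim=1).mean()                  # low per-sample entropy -> confident predictions
    marginal = p.mean(dim=0)
    diversity = (marginal * (marginal + 1e-8).log()).sum()    # negative entropy of the marginal -> diversity
    return entropy + diversity                                # minimize both terms jointly
```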

Contrastive clustering

Y Li, P Hu, Z Liu, D Peng, JT Zhou… - Proceedings of the AAAI …, 2021 - ojs.aaai.org
In this paper, we propose an online clustering method called Contrastive Clustering (CC)
which explicitly performs the instance- and cluster-level contrastive learning. To be specific …
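
Cluster-level contrastive learning treats each column of the (batch × clusters) soft-assignment matrix as a "cluster representation" and contrasts it against the corresponding column from a second augmented view. The sketch below is a hedged rendition of that cluster-level term using an NT-Xent loss; the instance-level branch and the entropy regularizer used in the paper are left out.

```python
import torch
import torch.nn.functional as F

def cluster_level_contrastive_loss(p_a: torch.Tensor, p_b: torch.Tensor,
                                   temperature: float = 1.0) -> torch.Tensor:
    """NT-Xent over cluster 'representations': the columns of two (B, K) soft-assignment matrices."""
    c_a = F.normalize(p_a.t(), dim=1)                         # (K, B): one row per cluster
    c_b = F.normalize(p_b.t(), dim=1)
    c = torch.cat([c_a, c_b], dim=0)                          # (2K, B)
    sim = (c @ c.t()) / temperature                           # pairwise cosine similarities
    k = c_a.size(0)
    sim = sim.masked_fill(torch.eye(2 * k, dtype=torch.bool, device=c.device), float('-inf'))
    targets = torch.cat([torch.arange(k, 2 * k), torch.arange(0, k)]).to(c.device)
    return F.cross_entropy(sim, targets)                      # positive = the same cluster in the other view
```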

Invariant information clustering for unsupervised image classification and segmentation

X Ji, JF Henriques, A Vedaldi - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com
We present a novel clustering objective that learns a neural network classifier from scratch,
given only unlabelled data samples. The model discovers clusters that accurately match …
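
The objective maximizes the mutual information between the class assignments a shared classifier produces for an image and for a transformed version of it, computed from the empirical joint distribution over cluster pairs. A compact sketch of that objective, with a hypothetical function name, follows.

```python
import torch

def iic_loss(p_x: torch.Tensor, p_tx: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Negative MI between the (B, K) cluster assignments of an image and its transformed pair."""
    joint = p_x.t() @ p_tx / p_x.size(0)                      # (K, K) empirical joint over cluster pairs
    joint = ((joint + joint.t()) / 2).clamp(min=eps)          # symmetrize and avoid log(0)
    p_i = joint.sum(dim=1, keepdim=True)                      # marginals
    p_j = joint.sum(dim=0, keepdim=True)
    mutual_info = (joint * (joint.log() - p_i.log() - p_j.log())).sum()
    return -mutual_info                                       # maximizing MI == minimizing its negative
```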

SCAN: Learning to classify images without labels

W Van Gansbeke, S Vandenhende… - European conference on …, 2020 - Springer
Can we automatically group images into semantically meaningful clusters when ground-
truth annotations are absent? The task of unsupervised image classification remains an …
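
SCAN first mines nearest neighbors with a pretext-task embedding and then trains a clustering head so that an image and its neighbors receive consistent assignments, with an entropy term that keeps clusters balanced. The loss sketched below follows that description; the neighbor-mining step is assumed to have happened upstream, and the entropy weight shown is an assumption.

```python
import torch

def scan_loss(p_anchor: torch.Tensor, p_neighbor: torch.Tensor,
              entropy_weight: float = 5.0, eps: float = 1e-8) -> torch.Tensor:
    """Agreement between an image and a mined nearest neighbor, plus an entropy term that balances clusters.
    p_anchor, p_neighbor: (B, K) soft cluster assignments from the clustering head."""
    consistency = -torch.log((p_anchor * p_neighbor).sum(dim=1) + eps).mean()  # dot-product agreement
    marginal = p_anchor.mean(dim=0)
    entropy = -(marginal * (marginal + eps).log()).sum()      # entropy of the mean assignment
    return consistency - entropy_weight * entropy             # high marginal entropy discourages collapse
```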

On mutual information maximization for representation learning

M Tschannen, J Djolonga, PK Rubenstein… - arXiv preprint arXiv …, 2019 - arxiv.org
Many recent methods for unsupervised or self-supervised representation learning train
feature extractors by maximizing an estimate of the mutual information (MI) between different …
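
Work in this line compares variational lower bounds on mutual information that are optimized through a learned critic. As a reference point for what such an estimator looks like, the sketch below implements the Nguyen-Wainwright-Jordan (NWJ) bound with a hypothetical critic network; the analysis paper evaluates several such bounds rather than proposing this one.

```python
import torch
import torch.nn as nn

def nwj_mi_estimate(critic: nn.Module, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Nguyen-Wainwright-Jordan lower bound on I(X; Y): E_p[f(x, y)] - E_q[exp(f(x, y) - 1)].
    x, y: paired samples of shape (B, dx) and (B, dy); the critic scores their concatenation."""
    joint = critic(torch.cat([x, y], dim=1)).squeeze(-1)      # critic on true pairs (the joint)
    y_shuffled = y[torch.randperm(y.size(0), device=y.device)]
    marginal = critic(torch.cat([x, y_shuffled], dim=1)).squeeze(-1)  # broken pairing -> product of marginals
    return joint.mean() - torch.exp(marginal - 1.0).mean()

# A small MLP critic suffices for illustration, e.g.:
# critic = nn.Sequential(nn.Linear(dx + dy, 128), nn.ReLU(), nn.Linear(128, 1))
```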