On the stepwise nature of self-supervised learning
We present a simple picture of the training process of self-supervised learning methods with
dual deep networks. In our picture, these methods learn their high-dimensional embeddings …
Complementary benefits of contrastive learning and self-training under distribution shift
Self-training and contrastive learning have emerged as leading techniques for incorporating
unlabeled data, both under distribution shift (unsupervised domain adaptation) and when it …
Why the simplest explanation isn't always the best
As datasets in neuroscience increase in size and complexity, interpreting these high-
dimensional data is becoming more critical. However, developing an intuition for patterns or …
Active self-supervised learning: A few low-cost relationships are all you need
Abstract Self-Supervised Learning (SSL) has emerged as the solution of choice to learn
transferable representations from unlabeled data. However, SSL requires building samples …
Self-supervised learning with lie symmetries for partial differential equations
Abstract Machine learning for differential equations paves the way for computationally
efficient alternatives to numerical solvers, with potentially broad impacts in science and …
Information flow in self-supervised learning
In this paper, we provide a comprehensive toolbox for understanding and enhancing self-
supervised learning (SSL) methods through the lens of matrix information theory …
Towards open-world recognition: Critical problems and challenges
K Wang, Z Li, Y Chen, W Dong, J Chen - Engineering Applications of …, 2025 - Elsevier
With the emergence of rich classification models and high computing power, recognition
systems are widely used in various fields. Unfortunately, as the scale of open systems …
Disentangling Masked Autoencoders for Unsupervised Domain Generalization
Abstract Domain Generalization (DG), designed to enhance out-of-distribution (OOD)
generalization, is all about learning invariance against domain shifts utilizing sufficient …
Matrix information theory for self-supervised learning
The maximum entropy encoding framework provides a unified perspective for many non-
contrastive learning methods like SimSiam, Barlow Twins, and MEC. Inspired by this …
Memorization in self-supervised learning improves downstream generalization
Self-supervised learning (SSL) has recently received significant attention due to its ability to
train high-performance encoders purely on unlabeled data, often scraped from the internet …