To compress or not to compress—self-supervised learning and information theory: A review

R Shwartz-Ziv, Y LeCun - Entropy, 2024 - mdpi.com
Deep neural networks excel in supervised learning tasks but are constrained by the need for
extensive labeled data. Self-supervised learning emerges as a promising alternative …

Self-supervised learning of representations for space generates multi-modular grid cells

R Schaeffer, M Khona, T Ma… - Advances in …, 2024 - proceedings.neurips.cc
To solve the spatial problems of mapping, localization and navigation, the mammalian
lineage has developed striking spatial representations. One important spatial representation …

Learning efficient coding of natural images with maximum manifold capacity representations

T Yerxa, Y Kuang, E Simoncelli… - Advances in Neural …, 2023 - proceedings.neurips.cc
The efficient coding hypothesis proposes that the response properties of sensory systems
are adapted to the statistics of their inputs such that they capture maximal information about …

Matrix information theory for self-supervised learning

Y Zhang, Z Tan, J Yang, W Huang, Y Yuan - arXiv preprint arXiv …, 2023 - arxiv.org
The maximum entropy encoding framework provides a unified perspective for many non-
contrastive learning methods like SimSiam, Barlow Twins, and MEC. Inspired by this …

Human vs ChatGPT: Effect of Data Annotation in Interpretable Crisis-Related Microblog Classification

TH Nguyen, K Rudra - Proceedings of the ACM on Web Conference …, 2024 - dl.acm.org
Recent studies have exploited the vital role of microblogging platforms, such as Twitter, in
crisis situations. Various machine-learning approaches have been proposed to identify and …

Gaussian Mutual Information Maximization for Efficient Graph Self-Supervised Learning: Bridging Contrastive-based to Decorrelation-based

J Wen - Proceedings of the 32nd ACM International …, 2024 - dl.acm.org
Enlightened by the InfoMax principle, Graph Contrastive Learning (GCL) has achieved
remarkable performance in processing large amounts of unlabeled graph data. Due to the …

CroMo-Mixup: Augmenting Cross-Model Representations for Continual Self-Supervised Learning

E Mushtaq, DN Yaldiz, YF Bakman, J Ding… - … on Computer Vision, 2024 - Springer
Continual self-supervised learning (CSSL) learns a series of tasks sequentially on the
unlabeled data. Two main challenges of continual learning are catastrophic forgetting and …

On the Generalization and Causal Explanation in Self-Supervised Learning

W Qiang, Z Song, Z Gu, J Li, C Zheng, F Sun… - International Journal of …, 2024 - Springer
Self-supervised learning (SSL) methods learn from unlabeled data and achieve high
generalization performance on downstream tasks. However, they may also suffer from …

Learning symbolic representations through joint generative and discriminative training

E Sansone, R Manhaeve - arXiv preprint arXiv:2304.11357, 2023 - arxiv.org
We introduce GEDI, a Bayesian framework that combines existing self-supervised learning
objectives with likelihood-based generative models. This framework leverages the benefits …

On the Discriminability of Self-Supervised Representation Learning

Z Song, W Qiang, C Zheng, F Sun, H Xiong - arXiv preprint arXiv …, 2024 - arxiv.org
Self-supervised learning (SSL) has recently achieved significant success in downstream
visual tasks. However, a notable gap still exists between SSL and supervised learning (SL) …