To compress or not to compress—self-supervised learning and information theory: A review
Deep neural networks excel in supervised learning tasks but are constrained by the need for
extensive labeled data. Self-supervised learning emerges as a promising alternative …
Self-supervised learning of representations for space generates multi-modular grid cells
To solve the spatial problems of mapping, localization and navigation, the mammalian
lineage has developed striking spatial representations. One important spatial representation …
Learning efficient coding of natural images with maximum manifold capacity representations
The efficient coding hypothesis proposes that the response properties of sensory systems
are adapted to the statistics of their inputs such that they capture maximal information about …
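For concreteness, a minimal PyTorch-style sketch of a manifold-capacity objective in the spirit of MMCR: view embeddings of each image are averaged into a centroid, and the nuclear norm of the centroid matrix is maximized. The shapes and the function name are illustrative assumptions, not the paper's exact recipe.

import torch
import torch.nn.functional as F

def mmcr_style_loss(views: torch.Tensor) -> torch.Tensor:
    # views: (batch, n_views, dim) embeddings of augmented views of each image.
    z = F.normalize(views, dim=-1)            # unit-norm view embeddings
    centroids = z.mean(dim=1)                 # (batch, dim) per-image centroids
    # Maximizing the nuclear norm spreads centroids across many directions,
    # i.e. increases the linear "capacity" of the representation.
    return -torch.linalg.matrix_norm(centroids, ord="nuc")

loss = mmcr_style_loss(torch.randn(32, 4, 128))  # toy usage: 32 images, 4 views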
Matrix information theory for self-supervised learning
The maximum entropy encoding framework provides a unified perspective on many
non-contrastive learning methods like SimSiam, Barlow Twins, and MEC. Inspired by this …
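As one concrete instance of the methods this framework covers, a minimal sketch of the Barlow Twins redundancy-reduction loss: the cross-correlation matrix of two standardized views is pushed toward the identity. The weight lambd follows the published default; the helper name is our own.

import torch

def barlow_twins_loss(z1, z2, lambd=5e-3):
    # z1, z2: (batch, dim) embeddings of two augmented views.
    n, d = z1.shape
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-6)   # standardize per dimension
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-6)
    c = (z1.T @ z2) / n                           # (dim, dim) cross-correlation
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()               # invariance term
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()  # decorrelation
    return on_diag + lambd * off_diag

Driving the diagonal to 1 makes the two views agree, while penalizing off-diagonal entries decorrelates embedding dimensions, which is the decorrelation behavior the maximum-entropy view formalizes.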
Human vs ChatGPT: Effect of Data Annotation in Interpretable Crisis-Related Microblog Classification
Recent studies have highlighted the vital role of microblogging platforms, such as Twitter, in
crisis situations. Various machine-learning approaches have been proposed to identify and …
Gaussian Mutual Information Maximization for Efficient Graph Self-Supervised Learning: Bridging Contrastive-based to Decorrelation-based
J Wen - Proceedings of the 32nd ACM International …, 2024 - dl.acm.org
Inspired by the InfoMax principle, Graph Contrastive Learning (GCL) has achieved
remarkable performance in processing large amounts of unlabeled graph data. Due to the …
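To illustrate the quantity such an objective maximizes, a minimal sketch of the closed-form mutual information between two embedding views under a joint-Gaussian assumption; the shrinkage constant eps is an illustrative stabilizer, not part of the paper.

import torch

def gaussian_mi(z1, z2, eps=1e-4):
    # z1, z2: (n_nodes, dim) embeddings of two graph views.
    def logdet_cov(x):
        x = x - x.mean(0)
        cov = (x.T @ x) / (x.shape[0] - 1)
        cov = cov + eps * torch.eye(x.shape[1], device=x.device)
        return torch.logdet(cov)
    joint = torch.cat([z1, z2], dim=1)        # (n_nodes, 2 * dim)
    # For Gaussians: I(Z1; Z2) = 0.5 * (logdet S1 + logdet S2 - logdet S_joint).
    return 0.5 * (logdet_cov(z1) + logdet_cov(z2) - logdet_cov(joint))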
CroMo-Mixup: Augmenting Cross-Model Representations for Continual Self-Supervised Learning
Continual self-supervised learning (CSSL) learns a series of tasks sequentially on
unlabeled data. Two main challenges of continual learning are catastrophic forgetting and …
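A minimal sketch of the kind of cross-task mixing such a method builds on: standard mixup applied between a current-task batch and a batch replayed from an earlier task. Treating the second batch as old-task data is our assumption; the paper's cross-model construction is more involved.

import torch

def cross_task_mixup(x_new, x_old, alpha=1.0):
    # x_new, x_old: (batch, ...) tensors from the current and a previous task.
    lam = torch.distributions.Beta(alpha, alpha).sample()  # standard mixup weight
    return lam * x_new + (1 - lam) * x_old

mixed = cross_task_mixup(torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32))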
On the Generalization and Causal Explanation in Self-Supervised Learning
Self-supervised learning (SSL) methods learn from unlabeled data and achieve high
generalization performance on downstream tasks. However, they may also suffer from …
Learning symbolic representations through joint generative and discriminative training
We introduce GEDI, a Bayesian framework that combines existing self-supervised learning
objectives with likelihood-based generative models. This framework leverages the benefits …
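Schematically, a joint objective of this kind adds a likelihood term to a self-supervised loss. A minimal sketch, where ssl_loss and the weight beta are placeholders for any self-supervised objective and any likelihood-based generative model (e.g. a VAE bound or a normalizing flow):

import torch

def joint_gen_disc_loss(ssl_loss, log_px, beta=1.0):
    # ssl_loss: scalar self-supervised objective on augmented views.
    # log_px: (batch,) log-likelihoods from the generative model.
    return ssl_loss + beta * (-log_px.mean())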
On the Discriminability of Self-Supervised Representation Learning
Self-supervised learning (SSL) has recently achieved significant success in downstream
visual tasks. However, a notable gap still exists between SSL and supervised learning (SL) …