A cookbook of self-supervised learning

R Balestriero, M Ibrahim, V Sobal, A Morcos… - arXiv preprint arXiv …, 2023 - arxiv.org
Self-supervised learning, dubbed the dark matter of intelligence, is a promising path to
advance machine learning. Yet, much like cooking, training SSL methods is a delicate art …

Hebbian deep learning without feedback

A Journé, HG Rodriguez, Q Guo, T Moraitis - arXiv preprint arXiv …, 2022 - arxiv.org
Recent approximations to backpropagation (BP) have mitigated many of BP's computational
inefficiencies and incompatibilities with biology, but important limitations still remain …
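
As context for feedback-free local learning, here is a minimal NumPy sketch of a plain Hebbian weight update; it illustrates the general idea of a local rule with no backward error signal, not the specific rule proposed in this paper.

```python
import numpy as np

def hebbian_update(W, x, lr=0.01):
    """One local Hebbian step: strengthen each weight in proportion to
    the correlation between its input and its unit's activation.
    No error signal is propagated backwards (feedback-free)."""
    y = W @ x                     # post-synaptic activations
    dW = lr * np.outer(y, x)      # Hebb: dw_ij ~ y_i * x_j
    return W + dW

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(8, 16))   # 16 inputs -> 8 units
W = hebbian_update(W, rng.normal(size=16))
```

Note that the pure Hebbian rule lets weights grow without bound; practical variants (e.g. Oja's rule) add a decay term to stabilize learning.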

Self-supervised learning of split invariant equivariant representations

Q Garrido, L Najman, Y LeCun - arXiv preprint arXiv:2302.10283, 2023 - arxiv.org
Recent progress has been made towards learning invariant or equivariant representations
with self-supervised learning. While invariant methods are evaluated on large scale …
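
The invariant/equivariant distinction can be stated concretely: f is invariant when f(T(x)) = f(x), and equivariant when f(T(x)) = T'(f(x)) for a corresponding transformation T' acting on the representation. A toy NumPy check of the generic definitions (not this paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=5)
perm = rng.permutation(5)          # T = permutation of the input

# f(x) = (norm, mean) ignores ordering, so it is permutation-INVARIANT:
f = lambda v: np.array([np.linalg.norm(v), v.mean()])
print(np.allclose(f(x[perm]), f(x)))            # True

# g(x) = 2x commutes with the permutation, so it is EQUIVARIANT:
g = lambda v: 2 * v
print(np.allclose(g(x[perm]), g(x)[perm]))      # True
```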

SoftHebb: Bayesian inference in unsupervised Hebbian soft winner-take-all networks

T Moraitis, D Toichkin, A Journé… - Neuromorphic …, 2022 - iopscience.iop.org
Hebbian plasticity in winner-take-all (WTA) networks is highly attractive for neuromorphic
on-chip learning, owing to its efficient, local, unsupervised, and on-line nature. Moreover, its …
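
A minimal sketch of the soft-WTA idea in NumPy, assuming only that a softmax over pre-activations replaces the hard winner and gates a local, Oja-style update; the exact SoftHebb plasticity rule differs in detail (see the paper):

```python
import numpy as np

def soft_wta_step(W, x, lr=0.05, temp=1.0):
    """One unsupervised, local learning step with soft competition."""
    u = W @ x                                  # pre-activations
    y = np.exp((u - u.max()) / temp)           # soft winner-take-all
    y /= y.sum()
    # Each unit moves its weights toward x in proportion to how
    # strongly it "won"; the -W term keeps the weights bounded.
    return W + lr * y[:, None] * (x[None, :] - W)

rng = np.random.default_rng(2)
W = rng.normal(scale=0.1, size=(4, 10))        # 10 inputs -> 4 units
for _ in range(100):
    W = soft_wta_step(W, rng.normal(size=10))
```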

GEDI: Generative and discriminative training for self-supervised learning

E Sansone, R Manhaeve - arXiv preprint arXiv:2212.13425, 2022 - arxiv.org
Self-supervised learning is a popular and powerful method for utilizing large amounts of
unlabeled data, for which a wide variety of training objectives have been proposed in the …

Failure-Proof Non-Contrastive Self-Supervised Learning

E Sansone, T Lebailly, T Tuytelaars - arXiv preprint arXiv:2410.04959, 2024 - arxiv.org
We identify sufficient conditions to avoid known failure modes, including representation,
dimensional, cluster and intracluster collapses, occurring in non-contrastive self-supervised …
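
Of the collapse modes listed, dimensional collapse is easy to diagnose numerically: if the covariance of the embeddings has only a few non-negligible eigenvalues, the representation occupies a low-dimensional subspace. A generic NumPy diagnostic (not this paper's sufficient conditions):

```python
import numpy as np

def effective_rank(Z, eps=1e-12):
    """Entropy-based effective rank of the embedding covariance; values
    far below the embedding dimension signal dimensional collapse."""
    Zc = Z - Z.mean(axis=0)
    eig = np.clip(np.linalg.eigvalsh(Zc.T @ Zc / len(Z)), eps, None)
    p = eig / eig.sum()
    return float(np.exp(-(p * np.log(p)).sum()))

rng = np.random.default_rng(3)
healthy = rng.normal(size=(1000, 64))                              # full rank
collapsed = rng.normal(size=(1000, 2)) @ rng.normal(size=(2, 64))  # rank 2
print(effective_rank(healthy))     # close to 64
print(effective_rank(collapsed))   # close to 2
```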

PseudoNeg-MAE: Self-Supervised Point Cloud Learning using Conditional Pseudo-Negative Embeddings

S Mahendren, S Rahman, P Koniusz… - arXiv preprint arXiv …, 2024 - arxiv.org
We propose PseudoNeg-MAE, a novel self-supervised learning framework that enhances
the global feature representations of point cloud masked autoencoders by making them both …

Is Encoded Popularity Always Harmful? Explicit Debiasing with Augmentation for Contrastive Collaborative Filtering

G Chen, W Qiang, Y Ouyang, C Yin… - 2024 International Joint …, 2024 - ieeexplore.ieee.org
Collaborative Filtering (CF) models based on Graph Contrastive Learning (GCL) have
effectively improved the performance of long-tail recommendation. However, the popularity …
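
For reference, GCL-based recommenders typically contrast two augmented views of the same user/item embeddings with an InfoNCE loss, treating the other in-batch rows as negatives. A minimal NumPy sketch of that standard objective (not this paper's popularity-debiasing augmentation):

```python
import numpy as np

def info_nce(z1, z2, tau=0.2):
    """InfoNCE between two views; z1[i], z2[i] are positives and
    every other row of z2 serves as an in-batch negative."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_p = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.diag(log_p).mean())

rng = np.random.default_rng(4)
z = rng.normal(size=(32, 16))                  # 32 node embeddings
view1 = z + 0.1 * rng.normal(size=z.shape)     # two stochastic
view2 = z + 0.1 * rng.normal(size=z.shape)     # augmentations
print(info_nce(view1, view2))
```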

A Bayesian Unification of Self-Supervised Clustering and Energy-Based Models

E Sansone, R Manhaeve - arXiv preprint arXiv:2401.00873, 2023 - arxiv.org
Self-supervised learning is a popular and powerful method for utilizing large amounts of
unlabeled data, for which a wide variety of training objectives have been proposed in the …

Unifying Self-Supervised Clustering and Energy-Based Models

E Sansone, R Manhaeve - arXiv preprint arXiv:2401.00873, 2024 - researchgate.net
Self-supervised learning excels at learning representations from large amounts of data. At
the same time, generative models offer the complementary property of learning information …
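
The two entries above concern the same line of work linking self-supervised clustering to energy-based models. One concrete instance of that link, shown here as a toy NumPy sketch rather than the papers' actual objective: defining the energy of x under cluster k as its squared distance to the cluster center makes the Boltzmann distribution over energies exactly a soft cluster assignment (the posterior of a Gaussian mixture with equal priors and shared isotropic covariance).

```python
import numpy as np

def cluster_posterior(x, centers, temp=1.0):
    """Softmax over negative energies = soft cluster assignment."""
    energies = ((x[None, :] - centers) ** 2).sum(axis=1)  # E_k(x)
    logits = -energies / temp
    logits -= logits.max()                                # stability
    p = np.exp(logits)
    return p / p.sum()

rng = np.random.default_rng(5)
centers = rng.normal(size=(3, 8))          # 3 clusters in 8 dims
x = centers[1] + 0.1 * rng.normal(size=8)  # sample near cluster 1
print(cluster_posterior(x, centers))       # mass concentrates on index 1
```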