To compress or not to compress—self-supervised learning and information theory: A review

R Shwartz-Ziv, Y LeCun - Entropy, 2024 - mdpi.com
Deep neural networks excel in supervised learning tasks but are constrained by the need for
extensive labeled data. Self-supervised learning emerges as a promising alternative …

Machine learning for active matter

F Cichos, K Gustavsson, B Mehlig… - Nature Machine Intelligence, 2020 - nature.com
The availability of large datasets has boosted the application of machine learning in many
fields and is now starting to shape active-matter research as well. Machine learning …

Data-efficient image recognition with contrastive predictive coding

OJ Hénaff - International Conference on Machine Learning, 2020 - proceedings.mlr.press
Human observers can learn to recognize new categories of images from a handful of
examples, yet doing so with artificial ones remains an open challenge. We hypothesize that …
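For context, CPC trains its encoder with the InfoNCE objective: a context vector summarizing the past must score the encoding of the true future observation above a set of negatives. Below is a minimal NumPy sketch, with a plain dot-product score standing in for the paper's learned bilinear critic; the array names and shapes are illustrative, not the paper's API.

```python
import numpy as np

def info_nce_loss(c, z_pos, z_neg):
    """InfoNCE loss for one (context, positive, negatives) triple.

    c:     (d,)   context vector summarizing past observations
    z_pos: (d,)   encoding of the true future observation
    z_neg: (n, d) encodings of n negative (distractor) observations
    """
    scores = np.concatenate(([z_pos @ c], z_neg @ c))  # (n+1,) logits
    scores -= scores.max()                             # numerical stability
    # Cross-entropy with the positive in slot 0: training pushes the
    # true future to win against the distractors.
    return -(scores[0] - np.log(np.exp(scores).sum()))
```

Minimizing this loss maximizes a lower bound on the mutual information between the context and future observations, which is what makes the learned representations useful downstream.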

Deep variational information bottleneck

AA Alemi, I Fischer, JV Dillon, K Murphy - arXiv preprint arXiv:1612.00410, 2016 - arxiv.org
We present a variational approximation to the information bottleneck of Tishby et al. (1999).
This variational approach allows us to parameterize the information bottleneck model using …
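For reference, the objective being approximated is the information bottleneck Lagrangian, which VIB replaces with a tractable variational bound. A sketch in LaTeX, written with encoder p(z|x), variational decoder q(y|z), and variational marginal r(z); the bound holds up to an additive constant H(Y).

```latex
% Information bottleneck: compress X into Z while keeping what predicts Y
\min_{p(z \mid x)} \; \beta\, I(X;Z) - I(Z;Y)

% VIB surrogate: a variational upper bound, trained by sampling z ~ p(z|x)
J = \mathbb{E}_{p(x,y)}\, \mathbb{E}_{p(z \mid x)}\big[-\log q(y \mid z)\big]
    + \beta\, \mathrm{KL}\big(p(z \mid x) \,\|\, r(z)\big)
```

In practice the encoder p(z|x) is a Gaussian whose mean and variance are produced by a network, and the inner expectation is estimated with the reparameterization trick.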

Unsupervised state representation learning in atari

A Anand, E Racah, S Ozair, Y Bengio… - Advances in Neural Information Processing Systems, 2019 - proceedings.neurips.cc
State representation learning, or the ability to capture latent generative factors of an
environment, is crucial for building intelligent agents that can perform a wide variety of tasks …

Hierarchical process memory: memory as an integral component of information processing

U Hasson, J Chen, CJ Honey - Trends in Cognitive Sciences, 2015 - cell.com
Models of working memory (WM) commonly focus on how information is encoded into and
retrieved from storage at specific moments. However, in the majority of real-life processes …

Efficient compression in color naming and its evolution

N Zaslavsky, C Kemp, T Regier, N Tishby - Proceedings of the National Academy of Sciences, 2018 - pnas.org
We derive a principled information-theoretic account of cross-language semantic variation.
Specifically, we argue that languages efficiently compress ideas into words by optimizing the …

Self-supervised video pretraining yields robust and more human-aligned visual representations

N Parthasarathy, SM Eslami… - Advances in Neural Information Processing Systems, 2023 - proceedings.neurips.cc
Humans learn powerful representations of objects and scenes by observing how they evolve
over time. Yet, outside of specific tasks that require explicit temporal understanding, static …

Is coding a relevant metaphor for the brain?

R Brette - Behavioral and Brain Sciences, 2019 - cambridge.org
“Neural coding” is a popular metaphor in neuroscience, where objective properties of the
world are communicated to the brain in the form of spikes. Here I argue that this metaphor is …

Theory of cortical function

DJ Heeger - Proceedings of the National Academy of Sciences, 2017 - pnas.org
Most models of sensory processing in the brain have a feedforward architecture in which
each stage comprises simple linear filtering operations and nonlinearities. Models of this …
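To make the motif concrete, the feedforward models the abstract describes stack stages of linear filtering followed by a static nonlinearity. A minimal NumPy sketch of that baseline (not of Heeger's proposed theory, which adds feedback and prediction); the filter and nonlinearity choices here are illustrative.

```python
import numpy as np

def feedforward_stage(signal, kernel):
    """One canonical stage: linear filtering, then a pointwise nonlinearity.

    signal: (t,) input drive to this stage
    kernel: (k,) linear receptive-field profile
    """
    driven = np.convolve(signal, kernel, mode="same")  # linear filtering
    return np.maximum(driven, 0.0)                     # halfwave rectification

# A feedforward hierarchy is simply a composition of such stages.
def feedforward_model(signal, kernels):
    for kernel in kernels:
        signal = feedforward_stage(signal, kernel)
    return signal
```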