Nonparametric identifiability of causal representations from unknown interventions
We study causal representation learning, the task of inferring latent causal variables and
their causal relations from high-dimensional functions (“mixtures”) of the variables. Prior …
Causal representation learning through higher-level information extraction
The large gap between the generalization level of state-of-the-art machine learning and
human learning systems calls for the development of artificial intelligence (AI) models that …
Nonparametric partial disentanglement via mechanism sparsity: Sparse actions, interventions and sparse temporal dependencies
This work introduces a novel principle for disentanglement we call mechanism sparsity
regularization, which applies when the latent factors of interest depend sparsely on …
Towards interpretable Cryo-EM: disentangling latent spaces of molecular conformations
Molecules are essential building blocks of life, and their different conformations (i.e., shapes)
crucially determine the functional role that they play in living organisms. Cryogenic Electron …
Score-based causal representation learning: Linear and general transformations
This paper addresses intervention-based causal representation learning (CRL) under a
general nonparametric latent causal model and an unknown transformation that maps the …
A sparsity principle for partially observable causal representation learning
Causal representation learning aims at identifying high-level causal variables from
perceptual data. Most methods assume that all latent causal variables are captured in the …
Identifiability guarantees for causal disentanglement from purely observational data
Causal disentanglement aims to learn about latent causal factors behind data, holding the
promise to augment existing representation learning methods in terms of interpretability and …
Interaction Asymmetry: A General Principle for Learning Composable Abstractions
Learning disentangled representations of concepts and re-composing them in unseen ways
is crucial for generalizing to out-of-domain situations. However, the underlying properties of …
Unsupervised discovery of the shared and private geometry in multi-view data
Modern applications often leverage multiple views of a subject of study. Within
neuroscience, there is growing interest in large-scale simultaneous recordings across …
Revealing Multimodal Contrastive Representation Learning through Latent Partial Causal Models
Multimodal contrastive representation learning methods have proven successful across a
range of domains, partly due to their ability to generate meaningful shared representations …