Coordinate Independent Convolutional Networks -- Isometry and Gauge Equivariant Convolutions on Riemannian Manifolds

M Weiler, P Forré, E Verlinde, M Welling - arXiv preprint arXiv:2106.06020, 2021 - arxiv.org
Motivated by the vast success of deep convolutional networks, there is a great interest in
generalizing convolutions to non-Euclidean manifolds. A major complication in comparison …

Variational integrator networks for physically structured embeddings

S Saemundsson, A Terenin… - International …, 2020 - proceedings.mlr.press
Learning workable representations of dynamical systems is becoming an increasingly
important problem in a number of application areas. By leveraging recent work connecting …

Reparameterizing distributions on Lie groups

L Falorsi, P de Haan, TR Davidson… - The 22nd International …, 2019 - proceedings.mlr.press
Reparameterizable densities are an important way to learn probability distributions in a
deep learning setting. For many distributions it is possible to create low-variance gradient …

Diffusion variational autoencoders

LAP Rey, V Menkovski, JW Portegies - arXiv preprint arXiv:1901.08991, 2019 - arxiv.org
A standard Variational Autoencoder, with a Euclidean latent space, is structurally incapable
of capturing topological properties of certain datasets. To remove topological obstructions …

Topological obstructions and how to avoid them

B Esmaeili, R Walters, H Zimmermann… - Advances in …, 2024 - proceedings.neurips.cc
Incorporating geometric inductive biases into models can aid interpretability and
generalization, but encoding to a specific geometric structure can be challenging due to the …

Topological degree as a discrete diagnostic for disentanglement, with applications to the VAE

MR Ravelonanosy, V Menkovski… - arXiv preprint arXiv …, 2024 - arxiv.org
We investigate the ability of the Diffusion Variational Autoencoder ($\Delta$VAE) with the unit
sphere $\mathcal{S}^2$ as latent space to capture topological and geometrical structure …

Deep latent variable models for text modelling

R Li - 2021 - etheses.whiterose.ac.uk
Deep latent variable models are a class of models that parameterise components of
probabilistic latent variable models with neural networks. This class of models can capture …

Understanding Optimization Challenges when Encoding to Geometric Structures

B Esmaeili, R Walters, H Zimmermann… - … 2022 Workshop on …, 2022 - openreview.net
Geometric inductive biases such as spatial curvature, factorizability, or equivariance have
been shown to enable learning of latent spaces which better reflect the structure of data and …

Advanced embodied learning

FMJ Walter - 2021 - mediatum.ub.tum.de
This work introduces new learning methods based on neurorobotics. We develop a tool set
that enables massively parallel neurorobotics experiments in the cloud and supports …