Semantic probabilistic layers for neuro-symbolic learning
We design a predictive layer for structured-output prediction (SOP) that can be plugged into
any neural network, guaranteeing its predictions are consistent with a set of predefined …
Out-of-distribution generalization by neural-symbolic joint training
This paper develops a novel methodology to simultaneously learn a neural network and
extract generalized logic rules. Different from prior neural-symbolic methods that require …
Continuous mixtures of tractable probabilistic models
Probabilistic models based on continuous latent spaces, such as variational autoencoders,
can be understood as uncountable mixture models where components depend continuously …
Understanding the distillation process from deep generative models to tractable probabilistic circuits
Probabilistic Circuits (PCs) are a general and unified computational framework for
tractable probabilistic models that support efficient computation of various inference tasks …
Sparse probabilistic circuits via pruning and growing
Probabilistic circuits (PCs) are a tractable representation of probability distributions allowing
for exact and efficient computation of likelihoods and marginals. There has been significant …
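Several entries above describe probabilistic circuits as supporting exact, efficient computation of likelihoods and marginals. A minimal illustrative sketch (an assumed toy example, not code from any of the listed papers): a tiny circuit over two binary variables, built as a sum node (mixture) over two product nodes (factorized components), where marginalizing a variable just means evaluating its leaf as 1.

```python
# Toy probabilistic circuit over binary variables X1, X2:
# a weighted sum (mixture) of two fully factorized product nodes.
def bernoulli(p):
    # Leaf distribution. Evaluating with value=None marginalizes the
    # variable out: the leaf integrates to 1.
    return lambda value: 1.0 if value is None else (p if value == 1 else 1.0 - p)

# Leaves of the two mixture components (hypothetical parameters).
f1, g1 = bernoulli(0.9), bernoulli(0.2)
f2, g2 = bernoulli(0.3), bernoulli(0.8)

def circuit(x1, x2):
    # Sum node with weights 0.6 / 0.4 over two product nodes.
    return 0.6 * f1(x1) * g1(x2) + 0.4 * f2(x1) * g2(x2)

# Exact likelihood p(X1=1, X2=0) in one feed-forward pass.
like = circuit(1, 0)
# Exact marginal p(X1=1): set the X2 leaves to 1 (value=None) and
# re-evaluate the same circuit, again in a single pass.
marg = circuit(1, None)
```

The single-pass marginal is the tractability property the abstracts refer to: `circuit(1, None)` equals the sum `circuit(1, 0) + circuit(1, 1)` without ever enumerating assignments.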
Scaling up probabilistic circuits by latent variable distillation
Probabilistic Circuits (PCs) are a unified framework for tractable probabilistic models that
support efficient computation of various probabilistic queries (e.g., marginal probabilities) …
Probabilistic integral circuits
Continuous latent variables (LVs) are a key ingredient of many generative models, as they
allow modelling expressive mixtures with an uncountable number of components. In …
Lossless compression with probabilistic circuits
Despite extensive progress on image generation, common deep generative model
architectures are not easily applied to lossless compression. For example, VAEs suffer from …
What is the Relationship between Tensor Factorizations and Circuits (and How Can We Exploit it)?
This paper establishes a rigorous connection between circuit representations and tensor
factorizations, two seemingly distinct yet fundamentally related areas. By connecting these …
HyperSPNs: Compact and expressive probabilistic circuits
Probabilistic circuits (PCs) are a family of generative models that allow for the
computation of exact likelihoods and marginals of their probability distributions. PCs are both …