Semantic probabilistic layers for neuro-symbolic learning

K Ahmed, S Teso, KW Chang… - Advances in …, 2022 - proceedings.neurips.cc
We design a predictive layer for structured-output prediction (SOP) that can be plugged into
any neural network guaranteeing its predictions are consistent with a set of predefined …

Out-of-distribution generalization by neural-symbolic joint training

A Liu, H Xu, G Van den Broeck, Y Liang - Proceedings of the AAAI …, 2023 - ojs.aaai.org
This paper develops a novel methodology to simultaneously learn a neural network and
extract generalized logic rules. Different from prior neural-symbolic methods that require …

Continuous mixtures of tractable probabilistic models

AHC Correia, G Gala, E Quaeghebeur… - Proceedings of the …, 2023 - ojs.aaai.org
Probabilistic models based on continuous latent spaces, such as variational autoencoders,
can be understood as uncountable mixture models where components depend continuously …

Understanding the distillation process from deep generative models to tractable probabilistic circuits

X Liu, A Liu, G Van den Broeck… - … Conference on Machine …, 2023 - proceedings.mlr.press
Probabilistic Circuits (PCs) are a general and unified computational framework for
tractable probabilistic models that support efficient computation of various inference tasks …

Sparse probabilistic circuits via pruning and growing

M Dang, A Liu… - Advances in Neural …, 2022 - proceedings.neurips.cc
Probabilistic circuits (PCs) are a tractable representation of probability distributions allowing
for exact and efficient computation of likelihoods and marginals. There has been significant …

Scaling up probabilistic circuits by latent variable distillation

A Liu, H Zhang, G Van den Broeck - arXiv preprint arXiv:2210.04398, 2022 - arxiv.org
Probabilistic Circuits (PCs) are a unified framework for tractable probabilistic models that
support efficient computation of various probabilistic queries (eg, marginal probabilities) …

Probabilistic integral circuits

G Gala, C de Campos, R Peharz… - International …, 2024 - proceedings.mlr.press
Continuous latent variables (LVs) are a key ingredient of many generative models, as they
allow modelling expressive mixtures with an uncountable number of components. In …

Lossless compression with probabilistic circuits

A Liu, S Mandt, G Van den Broeck - arXiv preprint arXiv:2111.11632, 2021 - arxiv.org
Despite extensive progress on image generation, common deep generative model
architectures are not easily applied to lossless compression. For example, VAEs suffer from …

What is the Relationship between Tensor Factorizations and Circuits (and How Can We Exploit it)?

L Loconte, A Mari, G Gala, R Peharz… - arXiv preprint arXiv …, 2024 - arxiv.org
This paper establishes a rigorous connection between circuit representations and tensor
factorizations, two seemingly distinct yet fundamentally related areas. By connecting these …

HyperSPNs: Compact and expressive probabilistic circuits

A Shih, D Sadigh, S Ermon - Advances in Neural …, 2021 - proceedings.neurips.cc
Probabilistic circuits (PCs) are a family of generative models that allow for the
computation of exact likelihoods and marginals of their probability distributions. PCs are both …