Semantic probabilistic layers for neuro-symbolic learning
We design a predictive layer for structured-output prediction (SOP) that can be plugged into
any neural network, guaranteeing its predictions are consistent with a set of predefined …
On the expressive power of deep learning: A tensor analysis
It has long been conjectured that hypothesis spaces suitable for data that is compositional
in nature, such as text or images, may be more efficiently represented with deep hierarchical …
Einsum networks: Fast and scalable learning of tractable probabilistic circuits
Probabilistic circuits (PCs) are a promising avenue for probabilistic modeling, as they permit
a wide range of exact and efficient inference routines. Recent “deep-learning-style” …
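The entry above credits probabilistic circuits with exact and efficient inference routines. As a purely illustrative sketch (toy weights and function names of my own, not the Einsum-networks API): in a smooth, decomposable circuit, a joint probability is read off by one bottom-up pass, and an exact marginal comes from the same pass by setting the marginalized variable's leaves to 1.

    def leaf(p_true, value):
        # Bernoulli leaf; value=None marks the variable as marginalized out,
        # and a normalized leaf then contributes a factor of 1.
        if value is None:
            return 1.0
        return p_true if value == 1 else 1.0 - p_true

    def circuit(x1, x2):
        # Two product components (fully factorized over X1, X2) joined by a
        # single sum node with mixture weights 0.3 and 0.7.
        comp_a = leaf(0.8, x1) * leaf(0.1, x2)
        comp_b = leaf(0.2, x1) * leaf(0.6, x2)
        return 0.3 * comp_a + 0.7 * comp_b

    print(circuit(1, 0))     # joint probability P(X1=1, X2=0) = 0.272
    print(circuit(1, None))  # exact marginal P(X1=1) = 0.38, same single pass

Both quantities come from one evaluation of the same circuit; the sum over the states of X2 is never enumerated explicitly.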
Random sum-product networks: A simple and effective approach to probabilistic deep learning
Sum-product networks (SPNs) are expressive probabilistic models with a rich set of exact
and efficient inference routines. However, in order to guarantee exact inference, they require …
Mixed sum-product networks: A deep architecture for hybrid domains
While all kinds of mixed data---from personal data, through panel and scientific data, to public
and commercial data---are collected and stored, building probabilistic graphical models for …
On the latent variable interpretation in sum-product networks
One of the central themes in Sum-Product networks (SPNs) is the interpretation of sum
nodes as marginalized latent variables (LVs). This interpretation yields an increased …
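The latent-variable reading mentioned above is standard mixture-model algebra; as a one-line refresher (notation mine, not the cited paper's): a sum node with weights w_k over child distributions p_k is equivalent to introducing a discrete latent variable Z with P(Z = k) = w_k and marginalizing it out,

    p(x) = \sum_k w_k \, p_k(x) = \sum_k P(Z = k) \, p(x \mid Z = k), \qquad w_k \ge 0, \quad \sum_k w_k = 1.

Making Z explicit is, for instance, what licenses EM-style updates of the sum weights.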
Simplifying, regularizing and strengthening sum-product network structure learning
The need for feasible inference in Probabilistic Graphical Models (PGMs) has led to
tractable models like Sum-Product Networks (SPNs). Their high expressive power and …
A survey of sum–product networks structural learning
Sum–product networks (SPNs), as deep probabilistic models, have made great progress in
computer vision, robotics, neuro-symbolic artificial intelligence, natural language …
Continuous mixtures of tractable probabilistic models
Probabilistic models based on continuous latent spaces, such as variational autoencoders,
can be understood as uncountable mixture models where components depend continuously …
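To make the uncountable-mixture reading concrete, here is a minimal numpy sketch (a toy example of my own, not the cited paper's method or code): a one-dimensional latent z with a standard-normal prior indexes a continuum of Gaussian components, and discretizing the latent integral on a quadrature grid recovers an ordinary finite mixture.

    import numpy as np

    def gauss_pdf(x, mean, std):
        return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

    def mu(z):
        # Toy "decoder": the component mean depends continuously on the latent z.
        return np.tanh(z)

    def p_x(x, n_points=64, z_max=5.0):
        # A simple Riemann-sum quadrature over z turns the uncountable mixture
        #   p(x) = integral of N(x; mu(z), 0.5) * N(z; 0, 1) dz
        # into a finite mixture with n_points components.
        z = np.linspace(-z_max, z_max, n_points)
        dz = z[1] - z[0]
        weights = gauss_pdf(z, 0.0, 1.0) * dz      # discretized standard-normal prior
        components = gauss_pdf(x, mu(z), 0.5)      # component densities at x
        return float(np.sum(weights * components))

    print(p_x(0.3))   # approximate marginal density at x = 0.3

With more grid points the finite mixture approaches the continuous one, while each fixed grid is itself an ordinary mixture model, which is the sense in which the construction stays tractable.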
On the relationship between sum-product networks and Bayesian networks
In this paper, we establish some theoretical connections between Sum-Product Networks
(SPNs) and Bayesian Networks (BNs). We prove that every SPN can be converted into a BN …