Tractable control for autoregressive language generation
Despite the success of autoregressive large language models in text generation, it remains
a major challenge to generate text that satisfies complex constraints: sampling from the …
A-NeSI: A scalable approximate method for probabilistic neurosymbolic inference
We study the problem of combining neural networks with symbolic reasoning. Recently
introduced frameworks for Probabilistic Neurosymbolic Learning (PNL), such as …
Not all neuro-symbolic concepts are created equal: Analysis and mitigation of reasoning shortcuts
Neuro-Symbolic (NeSy) predictive models hold the promise of improved
compliance with given constraints, systematic generalization, and interpretability, as they …
Semantic strengthening of neuro-symbolic learning
Numerous neuro-symbolic approaches have recently been proposed, typically with the goal
of adding symbolic knowledge to the output layer of a neural network. Ideally, such losses …
Safe reinforcement learning via probabilistic logic shields
Safe Reinforcement Learning (Safe RL) aims at learning optimal policies while staying safe.
A popular solution to Safe RL is shielding, which uses a logical safety specification to …
A Unified Approach to Count-Based Weakly Supervised Learning
High-quality labels are often very scarce, whereas unlabeled data with inferred weak labels
occurs more naturally. In many cases, these weak labels dictate the frequency of each …
Neuro-symbolic continual learning: Knowledge, reasoning shortcuts and concept rehearsal
We introduce Neuro-Symbolic Continual Learning, where a model has to solve a sequence
of neuro-symbolic tasks, that is, it has to map sub-symbolic inputs to high-level concepts and …
CCN+: A neuro-symbolic framework for deep learning with requirements
Owing to their outstanding ability to find hidden patterns in data, deep learning models have
been extensively applied in many different domains. However, recent works have shown …
Collapsed inference for bayesian deep learning
Bayesian neural networks (BNNs) provide a formalism to quantify and calibrate uncertainty
in deep learning. Current inference approaches for BNNs often resort to few-sample …
A pseudo-semantic loss for autoregressive models with logical constraints
Neuro-symbolic AI bridges the gap between purely symbolic and neural approaches to
learning. This often requires maximizing the likelihood of a symbolic constraint w.r.t. the neural …