Tractable control for autoregressive language generation

H Zhang, M Dang, N Peng… - … on Machine Learning, 2023 - proceedings.mlr.press
Despite the success of autoregressive large language models in text generation, it remains
a major challenge to generate text that satisfies complex constraints: sampling from the …

A-NeSI: A scalable approximate method for probabilistic neurosymbolic inference

E van Krieken, T Thanapalasingam… - Advances in …, 2023 - proceedings.neurips.cc
We study the problem of combining neural networks with symbolic reasoning. Recently
introduced frameworks for Probabilistic Neurosymbolic Learning (PNL), such as …

Not all neuro-symbolic concepts are created equal: Analysis and mitigation of reasoning shortcuts

E Marconato, S Teso, A Vergari… - Advances in Neural …, 2024 - proceedings.neurips.cc
Abstract Neuro-Symbolic (NeSy) predictive models hold the promise of improved
compliance with given constraints, systematic generalization, and interpretability, as they …

Semantic strengthening of neuro-symbolic learning

K Ahmed, KW Chang… - … Conference on Artificial …, 2023 - proceedings.mlr.press
Numerous neuro-symbolic approaches have recently been proposed, typically with the goal
of adding symbolic knowledge to the output layer of a neural network. Ideally, such losses …

Safe reinforcement learning via probabilistic logic shields

WC Yang, G Marra, G Rens, L De Raedt - arXiv preprint arXiv:2303.03226, 2023 - arxiv.org
Safe reinforcement learning (Safe RL) aims at learning optimal policies while staying safe.
A popular solution to Safe RL is shielding, which uses a logical safety specification to …

A Unified Approach to Count-Based Weakly Supervised Learning

V Shukla, Z Zeng, K Ahmed… - Advances in Neural …, 2024 - proceedings.neurips.cc
High-quality labels are often very scarce, whereas unlabeled data with inferred weak labels
occurs more naturally. In many cases, these weak labels dictate the frequency of each …

Neuro-symbolic continual learning: Knowledge, reasoning shortcuts and concept rehearsal

E Marconato, G Bontempo, E Ficarra… - arXiv preprint arXiv …, 2023 - arxiv.org
We introduce Neuro-Symbolic Continual Learning, where a model has to solve a sequence
of neuro-symbolic tasks, that is, it has to map sub-symbolic inputs to high-level concepts and …

CCN+: A neuro-symbolic framework for deep learning with requirements

E Giunchiglia, A Tatomir, MC Stoian… - International Journal of …, 2024 - Elsevier
For their outstanding ability to find hidden patterns in data, deep learning models have
been extensively applied in many different domains. However, recent works have shown …

Collapsed inference for Bayesian deep learning

Z Zeng, G Van den Broeck - Advances in Neural …, 2023 - proceedings.neurips.cc
Bayesian neural networks (BNNs) provide a formalism to quantify and calibrate uncertainty
in deep learning. Current inference approaches for BNNs often resort to few-sample …

A pseudo-semantic loss for autoregressive models with logical constraints

K Ahmed, KW Chang… - Advances in Neural …, 2024 - proceedings.neurips.cc
Neuro-symbolic AI bridges the gap between purely symbolic and neural approaches to
learning. This often requires maximizing the likelihood of a symbolic constraint w.r.t. the neural …