Is neuro-symbolic AI meeting its promises in natural language processing? A structured review

K Hamilton, A Nayak, B Božić, L Longo - Semantic Web, 2024 - content.iospress.com
Abstract Advocates for Neuro-Symbolic Artificial Intelligence (NeSy) assert that combining
deep learning with symbolic reasoning will lead to stronger AI than either paradigm on its …

Human-like systematic generalization through a meta-learning neural network

BM Lake, M Baroni - Nature, 2023 - nature.com
The power of human language and thought arises from systematic compositionality—the
algebraic ability to understand and produce novel combinations from known components …

Least-to-most prompting enables complex reasoning in large language models

D Zhou, N Schärli, L Hou, J Wei, N Scales… - arXiv preprint arXiv …, 2022 - arxiv.org
Chain-of-thought prompting has demonstrated remarkable performance on various natural
language reasoning tasks. However, it tends to perform poorly on tasks that require …

Compositional semantic parsing with large language models

A Drozdov, N Schärli, E Akyürek, N Scales… - arXiv preprint arXiv …, 2022 - arxiv.org
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …

How to reuse and compose knowledge for a lifetime of tasks: A survey on continual learning and functional composition

JA Mendez, E Eaton - arXiv preprint arXiv:2207.07730, 2022 - arxiv.org
A major goal of artificial intelligence (AI) is to create an agent capable of acquiring a general
understanding of the world. Such an agent would require the ability to continually …

An empirical survey of data augmentation for limited data learning in NLP

J Chen, D Tam, C Raffel, M Bansal… - Transactions of the …, 2023 - direct.mit.edu
NLP has achieved great progress in the past decade through the use of neural models and
large labeled datasets. The dependence on abundant data prevents NLP models from being …

Neuro-symbolic artificial intelligence: Current trends

MK Sarker, L Zhou, A Eberhart… - AI …, 2022 - journals.sagepub.com
Neuro-Symbolic Artificial Intelligence–the combination of symbolic methods with methods
that are based on artificial neural networks–has a long-standing history. In this article, we …

Large language models can learn rules

Z Zhu, Y Xue, X Chen, D Zhou, J Tang… - arXiv preprint arXiv …, 2023 - arxiv.org
When prompted with a few examples and intermediate steps, large language models (LLMs)
have demonstrated impressive performance in various reasoning tasks. However, prompting …

The devil is in the detail: Simple tricks improve systematic generalization of transformers

R Csordás, K Irie, J Schmidhuber - arXiv preprint arXiv:2108.12284, 2021 - arxiv.org
Recently, many datasets have been proposed to test the systematic generalization ability of
neural networks. The companion baseline Transformers, typically trained with default hyper …

Compositional generalization and natural language variation: Can a semantic parsing approach handle both?

P Shaw, MW Chang, P Pasupat… - arXiv preprint arXiv …, 2020 - arxiv.org
Sequence-to-sequence models excel at handling natural language variation, but have been
shown to struggle with out-of-distribution compositional generalization. This has motivated …