Human-like systematic generalization through a meta-learning neural network

BM Lake, M Baroni - Nature, 2023 - nature.com
The power of human language and thought arises from systematic compositionality—the
algebraic ability to understand and produce novel combinations from known components …

Least-to-most prompting enables complex reasoning in large language models

D Zhou, N Schärli, L Hou, J Wei, N Scales… - arXiv preprint arXiv …, 2022 - arxiv.org
Chain-of-thought prompting has demonstrated remarkable performance on various natural
language reasoning tasks. However, it tends to perform poorly on tasks which require …

Meta-learning in neural networks: A survey

T Hospedales, A Antoniou, P Micaelli… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
The field of meta-learning, or learning-to-learn, has seen a dramatic rise in interest in recent
years. Contrary to conventional approaches to AI where tasks are solved from scratch using …

Compositional semantic parsing with large language models

A Drozdov, N Schärli, E Akyürek, N Scales… - The Eleventh …, 2022 - openreview.net
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …

COGS: A compositional generalization challenge based on semantic interpretation

N Kim, T Linzen - Proceedings of the 2020 Conference on …, 2020 - aclanthology.org
Natural language is characterized by compositionality: the meaning of a complex expression
is constructed from the meanings of its constituent parts. To facilitate the evaluation of the …

Survey of low-resource machine translation

B Haddow, R Bawden, AVM Barone, J Helcl… - Computational …, 2022 - direct.mit.edu
We present a survey covering the state of the art in low-resource machine translation (MT)
research. There are currently around 7,000 languages spoken in the world and almost all …

Compositional generalization and natural language variation: Can a semantic parsing approach handle both?

P Shaw, MW Chang, P Pasupat… - arXiv preprint arXiv …, 2020 - arxiv.org
Sequence-to-sequence models excel at handling natural language variation, but have been
shown to struggle with out-of-distribution compositional generalization. This has motivated …

The devil is in the detail: Simple tricks improve systematic generalization of transformers

R Csordás, K Irie, J Schmidhuber - arXiv preprint arXiv:2108.12284, 2021 - arxiv.org
Recently, many datasets have been proposed to test the systematic generalization ability of
neural networks. The companion baseline Transformers, typically trained with default hyper …

How to reuse and compose knowledge for a lifetime of tasks: A survey on continual learning and functional composition

JA Mendez, E Eaton - arXiv preprint arXiv:2207.07730, 2022 - arxiv.org
A major goal of artificial intelligence (AI) is to create an agent capable of acquiring a general
understanding of the world. Such an agent would require the ability to continually …

The CLRS algorithmic reasoning benchmark

P Veličković, AP Badia, D Budden… - International …, 2022 - proceedings.mlr.press
Learning representations of algorithms is an emerging area of machine learning, seeking to
bridge concepts from neural networks with classical algorithms. Several important works …