Human-like systematic generalization through a meta-learning neural network
The power of human language and thought arises from systematic compositionality—the
algebraic ability to understand and produce novel combinations from known components …
Least-to-most prompting enables complex reasoning in large language models
Chain-of-thought prompting has demonstrated remarkable performance on various natural
language reasoning tasks. However, it tends to perform poorly on tasks which require …
Meta-learning in neural networks: A survey
The field of meta-learning, or learning-to-learn, has seen a dramatic rise in interest in recent
years. Contrary to conventional approaches to AI where tasks are solved from scratch using …
Compositional semantic parsing with large language models
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …
COGS: A compositional generalization challenge based on semantic interpretation
Natural language is characterized by compositionality: the meaning of a complex expression
is constructed from the meanings of its constituent parts. To facilitate the evaluation of the …
Survey of low-resource machine translation
We present a survey covering the state of the art in low-resource machine translation (MT)
research. There are currently around 7,000 languages spoken in the world and almost all …
Compositional generalization and natural language variation: Can a semantic parsing approach handle both?
Sequence-to-sequence models excel at handling natural language variation, but have been
shown to struggle with out-of-distribution compositional generalization. This has motivated …
The devil is in the detail: Simple tricks improve systematic generalization of transformers
Recently, many datasets have been proposed to test the systematic generalization ability of
neural networks. The companion baseline Transformers, typically trained with default hyper …
How to reuse and compose knowledge for a lifetime of tasks: A survey on continual learning and functional composition
A major goal of artificial intelligence (AI) is to create an agent capable of acquiring a general
understanding of the world. Such an agent would require the ability to continually …
The CLRS algorithmic reasoning benchmark
Learning representations of algorithms is an emerging area of machine learning, seeking to
bridge concepts from neural networks with classical algorithms. Several important works …