Emergent multi-agent communication in the deep learning era
The ability to cooperate through language is a defining feature of humans. As the
perceptual, motor and planning capabilities of deep artificial networks increase …
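Much of the work surveyed under this heading is built on the referential (Lewis) signalling game: a speaker observes a target, emits a discrete message, and a listener must pick the target out of a set of distractors. The sketch below is a minimal, generic PyTorch instance of that game; the toy dimensions, the straight-through Gumbel-softmax relaxation, and all names are illustrative assumptions rather than any particular paper's setup.

# Minimal referential (Lewis) game: speaker describes a target, listener picks it
# out of a candidate set. Both agents are trained on communication success alone.
import torch
import torch.nn as nn
import torch.nn.functional as F

N_ATTR, VOCAB, MSG_LEN, HID = 16, 10, 3, 64   # assumed toy dimensions

class Speaker(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Linear(N_ATTR, HID)
        self.out = nn.Linear(HID, MSG_LEN * VOCAB)
    def forward(self, target, tau=1.0):
        h = torch.tanh(self.enc(target))
        logits = self.out(h).view(-1, MSG_LEN, VOCAB)
        # Straight-through Gumbel-softmax keeps the message discrete in the
        # forward pass while remaining differentiable for the speaker.
        return F.gumbel_softmax(logits, tau=tau, hard=True)

class Listener(nn.Module):
    def __init__(self):
        super().__init__()
        self.msg_enc = nn.Linear(MSG_LEN * VOCAB, HID)
        self.obj_enc = nn.Linear(N_ATTR, HID)
    def forward(self, message, candidates):
        m = torch.tanh(self.msg_enc(message.flatten(1)))   # (B, HID)
        c = torch.tanh(self.obj_enc(candidates))           # (B, K, HID)
        return torch.einsum("bh,bkh->bk", m, c)            # scores over candidates

speaker, listener = Speaker(), Listener()
opt = torch.optim.Adam(list(speaker.parameters()) + list(listener.parameters()), lr=1e-3)

for step in range(1000):
    B, K = 32, 5                                  # batch size, candidates per trial
    candidates = torch.rand(B, K, N_ATTR)
    target_idx = torch.randint(0, K, (B,))
    target = candidates[torch.arange(B), target_idx]
    scores = listener(speaker(target), candidates)
    loss = F.cross_entropy(scores, target_idx)    # shared communication-success objective
    opt.zero_grad(); loss.backward(); opt.step()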
Experience grounds language
Language understanding research is held back by a failure to relate language to the
physical world it describes and to the social interactions it facilitates. Despite the incredible …
Towards principled disentanglement for domain generalization
A fundamental challenge for machine learning models is generalizing to out-of-distribution
(OOD) data, in part due to spurious correlations. To tackle this challenge, we first formalize …
Learning from teaching regularization: Generalizable correlations should be easy to imitate
Generalization remains a central challenge in machine learning. In this work, we propose
Learning from Teaching (LoT), a novel regularization technique for deep neural networks to …
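The snippet below is only a rough, generic sketch of the teacher-student imitation idea the title suggests: an auxiliary student is trained to imitate the main model, and the main model is additionally rewarded for producing outputs the student can reproduce. The KL-based agreement term, the weight lam, and the alternating update schedule are assumptions for illustration, not the paper's actual LoT objective.

# Generic imitability regulariser: a student imitates the main model, and the main
# model's loss adds an agreement term with the (frozen) student.
import torch
import torch.nn as nn
import torch.nn.functional as F

def mlp():
    # Hypothetical small classifier; stands in for any deep network.
    return nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

main, student = mlp(), mlp()
opt_main = torch.optim.Adam(main.parameters(), lr=1e-3)
opt_student = torch.optim.Adam(student.parameters(), lr=1e-3)
lam = 0.1                                       # assumed regularisation weight

x = torch.randn(256, 32)
y = torch.randint(0, 10, (256,))

for step in range(1000):
    # 1) The student imitates the main model's current predictions.
    with torch.no_grad():
        teacher_logp = F.log_softmax(main(x), dim=-1)
    student_logp = F.log_softmax(student(x), dim=-1)
    imitation = F.kl_div(student_logp, teacher_logp, log_target=True, reduction="batchmean")
    opt_student.zero_grad(); imitation.backward(); opt_student.step()

    # 2) The main model trains on the task plus an "easy to imitate" term:
    #    agreement with the frozen student acts as the regulariser.
    logits = main(x)
    with torch.no_grad():
        student_logp = F.log_softmax(student(x), dim=-1)
    agreement = F.kl_div(F.log_softmax(logits, dim=-1), student_logp,
                         log_target=True, reduction="batchmean")
    loss = F.cross_entropy(logits, y) + lam * agreement
    opt_main.zero_grad(); loss.backward(); opt_main.step()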
Compositional generalization in unsupervised compositional representation learning: A study on disentanglement and emergent language
Deep learning models struggle with compositional generalization, i.e., the ability to recognize
or generate novel combinations of observed elementary concepts. In hopes of enabling …
Emergent communication at scale
Emergent communication aims at a better understanding of human language evolution and
at building more efficient representations. We posit that reaching these goals will require …
Iterated learning improves compositionality in large vision-language models
A fundamental characteristic common to both human vision and natural language is their
compositional nature. Yet despite the performance gains contributed by large vision and …
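For readers unfamiliar with the term, the loop below sketches the generic iterated-learning (cultural transmission) paradigm the title refers to: each freshly initialised learner is trained on data relabelled by the previous generation's model, receives a smaller amount of ground-truth supervision, and then becomes the next teacher. The model, data, and proportions are toy assumptions; this is not the paper's vision-language training recipe.

# Generic iterated-learning loop: transmit through imitation, then ground, repeat.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_learner():
    # Hypothetical tiny classifier standing in for a real vision-language model.
    return nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

def train(model, inputs, labels, epochs=5):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        loss = F.cross_entropy(model(inputs), labels)
        opt.zero_grad(); loss.backward(); opt.step()
    return model

x = torch.randn(512, 32)
y = torch.randint(0, 10, (512,))

teacher = train(make_learner(), x, y)            # generation 0 learns from raw data
for generation in range(5):
    with torch.no_grad():
        transmitted = teacher(x).argmax(dim=-1)  # teacher relabels the shared inputs
    learner = make_learner()                     # imitation phase: learn the teacher's "language"
    learner = train(learner, x, transmitted)
    learner = train(learner, x[:128], y[:128])   # interaction phase: a little grounding data
    teacher = copy.deepcopy(learner)             # the learner becomes the next teacher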
Meta-learning to compositionally generalize
Natural language is compositional; the meaning of a sentence is a function of the meaning
of its parts. This property allows humans to create and interpret novel sentences …
Compositionality in computational linguistics
Neural models greatly outperform grammar-based models across many tasks in modern
computational linguistics. This raises the question of whether linguistic principles, such as …
The role of disentanglement in generalisation
Combinatorial generalisation—the ability to understand and produce novel combinations of
familiar elements—is a core capacity of human intelligence that current AI systems struggle …