Representations and generalization in artificial and brain neural networks
Humans and animals excel at generalizing from limited data, a capability yet to be fully
replicated in artificial intelligence. This perspective investigates generalization in biological …
Language in brains, minds, and machines
It has long been argued that only humans could produce and understand language. But
now, for the first time, artificial language models (LMs) achieve this feat. Here we survey the …
Large language models demonstrate the potential of statistical learning in language
P Contreras Kallens… - Cognitive …, 2023 - Wiley Online Library
To what degree can language be acquired from linguistic input alone? This question has
vexed scholars for millennia and is still a major focus of debate in the cognitive science of …
A shared model-based linguistic space for transmitting our thoughts from brain to brain in natural conversations
Effective communication hinges on a mutual understanding of word meaning in different
contexts. We recorded brain activity using electrocorticography during spontaneous, face-to …
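This line of work maps word-by-word language-model embeddings onto recorded neural activity. Below is a minimal sketch of such an encoding analysis; the ridge penalty, the 5-fold scheme, and the data (random placeholders) are illustrative assumptions, not this paper's pipeline.

```python
# Minimal sketch of an encoding-model analysis: predict per-word electrode
# activity from contextual word embeddings with cross-validated ridge
# regression. All data below are random placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n_words, emb_dim, n_electrodes = 2000, 768, 64
X = rng.standard_normal((n_words, emb_dim))            # word embeddings (placeholder)
Y = X @ rng.standard_normal((emb_dim, n_electrodes)) * 0.1 \
    + rng.standard_normal((n_words, n_electrodes))     # simulated electrode responses

scores = []
for train, test in KFold(n_splits=5).split(X):
    model = Ridge(alpha=100.0).fit(X[train], Y[train])
    pred = model.predict(X[test])
    # encoding performance: correlation between predicted and observed activity
    r = [np.corrcoef(pred[:, e], Y[test][:, e])[0, 1] for e in range(n_electrodes)]
    scores.append(np.mean(r))
print(f"mean cross-validated correlation: {np.mean(scores):.3f}")
```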
Reconstructing the cascade of language processing in the brain using the internal computations of a transformer-based language model
Piecing together the meaning of a narrative requires understanding not only the individual
words but also the intricate relationships between them. How does the brain construct this …
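One ingredient such analyses require is access to the model's internal computations. The sketch below shows how per-layer hidden states can be read out of a transformer with the Hugging Face transformers library; GPT-2 is used here only as a stand-in for whatever model a given study employs.

```python
# Sketch: extract layer-wise internal representations from a transformer LM,
# the kind of features brain-alignment analyses regress against neural data.
import torch
from transformers import AutoTokenizer, AutoModel

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModel.from_pretrained("gpt2", output_hidden_states=True).eval()

text = "Piecing together the meaning of a narrative requires context."
inputs = tok(text, return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)

# hidden_states: embedding layer plus one tensor per transformer layer,
# each of shape (batch, tokens, hidden_size)
for layer, h in enumerate(out.hidden_states):
    print(layer, tuple(h.shape))
```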
Shared functional specialization in transformer-based language models and the human brain
When processing language, the brain is thought to deploy specialized computations to
construct meaning from complex linguistic structures. Recently, artificial neural networks …
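Analyses of "functional specialization" in transformers often go below the layer level, to components such as individual attention heads. As a rough sketch (again with GPT-2 as a stand-in, and without claiming this is the paper's method), per-head attention patterns can be extracted and probed like this:

```python
# Sketch: pull per-head attention weights from a transformer and compute a
# simple head-level statistic (how strongly each head attends to the
# immediately preceding token). GPT-2 is a placeholder model.
import torch
from transformers import AutoTokenizer, AutoModel

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModel.from_pretrained("gpt2", output_attentions=True).eval()

inputs = tok("The brain constructs meaning from linguistic structure.",
             return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)

# attentions: one tensor per layer, each of shape (batch, heads, tokens, tokens)
for layer, att in enumerate(out.attentions):
    # attention from token i+1 back to token i, averaged over positions
    prev_token = att[0, :, 1:, :-1].diagonal(dim1=-2, dim2=-1).mean(dim=-1)
    print(layer, [round(x, 2) for x in prev_token.tolist()])
```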
From word vectors to multimodal embeddings: Techniques, applications, and future directions for large language models
Word embeddings and language models have transformed natural language processing
(NLP) by facilitating the representation of linguistic elements in continuous vector spaces …
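For readers unfamiliar with the basic operation these representations build on, the toy sketch below treats words as vectors in a continuous space and ranks them by cosine similarity. The vectors are random placeholders, not trained embeddings.

```python
# Toy illustration of word embeddings: words as vectors, compared by cosine
# similarity. Placeholder vectors only; trained embeddings would put related
# words (e.g. "dog" and "cat") close together.
import numpy as np

rng = np.random.default_rng(1)
vocab = ["dog", "cat", "car", "truck", "apple"]
E = {w: rng.standard_normal(300) for w in vocab}   # placeholder 300-d vectors

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

query = "dog"
neighbours = sorted(vocab, key=lambda w: -cosine(E[query], E[w]))
print(neighbours)  # with trained vectors, semantically related words rank first
```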
A shared linguistic space for transmitting our thoughts from brain to brain in natural conversations
Effective communication hinges on a mutual understanding of word meaning in different
contexts. The embedding space learned by large language models can serve as an explicit …
Words, Subwords, and Morphemes: What Really Matters in the Surprisal-Reading Time Relationship?
An important assumption that comes with using LLMs on psycholinguistic data has gone
unverified. LLM-based predictions are based on subword tokenization, not decomposition of …
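The subword issue this paper raises can be made concrete: a causal LM assigns probabilities to byte-pair tokens, and word-level surprisal is then commonly obtained by summing the surprisals of a word's subword pieces. The sketch below assumes GPT-2 and that summing convention; neither is necessarily what the paper adopts.

```python
# Sketch: word-level surprisal from a subword LM by summing subword
# surprisals. GPT-2's byte-pair tokenizer marks word-initial pieces with "Ġ".
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

text = "The researchers reanalysed the eye-tracking corpus carefully."
ids = tok(text, return_tensors="pt").input_ids
with torch.no_grad():
    logits = model(ids).logits

# surprisal (in nats) of each token given its left context
logp = torch.log_softmax(logits[0, :-1], dim=-1)
n = ids.shape[1]
tok_surprisal = -logp[torch.arange(n - 1), ids[0, 1:]]

# aggregate subword surprisals back into word-level surprisals
words = []
for piece, s in zip(tok.convert_ids_to_tokens(ids[0, 1:].tolist()),
                    tok_surprisal.tolist()):
    if piece.startswith("Ġ") or not words:
        words.append([piece.lstrip("Ġ"), s])
    else:
        words[-1][0] += piece
        words[-1][1] += s
print(words)  # [word, summed surprisal] pairs (sentence-initial word excluded)
```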
Navigating the semantic space: Unraveling the structure of meaning in psychosis using different computational language models
Speech in psychosis has long been described as involving a 'loosening of associations'. We aimed to elucidate its underlying cognitive mechanisms by analysing picture …
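Coherence-based analyses of 'loosening of associations' typically quantify how semantically related consecutive words in a transcript are. The sketch below computes a first-order coherence score with cosine similarity; the embedding lookup is a deterministic placeholder, not one of the language models the paper compares.

```python
# Sketch: first-order semantic coherence of a transcript as the cosine
# similarity between embeddings of consecutive words.
import numpy as np

def embed(word):
    # stand-in for a trained word-embedding or language-model lookup
    return np.random.default_rng(abs(hash(word)) % 2**32).standard_normal(300)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

transcript = "the picture shows a man walking his dog near the river".split()
vectors = [embed(w) for w in transcript]
coherence = [cosine(a, b) for a, b in zip(vectors, vectors[1:])]
print(f"mean first-order coherence: {np.mean(coherence):.3f}")
```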