Syntactic structure from deep learning
Modern deep neural networks achieve impressive performance in engineering applications
that require extensive linguistic skills, such as machine translation. This success has …
Neurocomputational models of language processing
Efforts to understand the brain bases of language face the Mapping Problem: At what level
do linguistic computations and representations connect to human neurobiology? We review …
Brains and algorithms partially converge in natural language processing
Deep learning algorithms trained to predict masked words from large amounts of text have
recently been shown to generate activations similar to those of the human brain. However …
Why does surprisal from larger transformer-based language models provide a poorer fit to human reading times?
BD Oh, W Schuler - Transactions of the Association for Computational …, 2023 - direct.mit.edu
This work presents a linguistic analysis into why larger Transformer-based pre-trained
language models with more parameters and lower perplexity nonetheless yield surprisal …
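The perplexity and surprisal quantities this entry compares can be made concrete with a minimal sketch. The probabilities below are invented toy values, not outputs of any real language model; `surprisal` and `perplexity` are illustrative helper names, assuming the standard information-theoretic definitions (surprisal = -log2 P(word | context), perplexity = 2 to the mean surprisal).

```python
import math

def surprisal(prob: float) -> float:
    """Surprisal in bits of a word with conditional probability `prob`."""
    return -math.log2(prob)

def perplexity(probs: list[float]) -> float:
    """Corpus perplexity: 2 raised to the mean per-word surprisal."""
    mean_surprisal = sum(surprisal(p) for p in probs) / len(probs)
    return 2 ** mean_surprisal

# Toy conditional word probabilities for a three-word "corpus".
probs = [0.5, 0.25, 0.125]
print([surprisal(p) for p in probs])  # [1.0, 2.0, 3.0] bits
print(perplexity(probs))              # 4.0
```

Lower perplexity means lower average surprisal on the training distribution; the paper's point is that this does not guarantee a better fit to human reading times.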
A hierarchy of linguistic predictions during natural language comprehension
Understanding spoken language requires transforming ambiguous acoustic streams into a
hierarchy of representations, from phonemes to meaning. It has been suggested that the …
Localizing syntactic predictions using recurrent neural network grammars
Brain activity in numerous perisylvian brain regions is modulated by the expectedness of
linguistic stimuli. We leverage recent advances in computational parsing models to test what …
Interpreting and improving natural-language processing (in machines) with natural language-processing (in the brain)
Neural network models for NLP are typically implemented without the explicit encoding of
language rules and yet they are able to break one performance record after another. This …
Lossy‐context surprisal: An information‐theoretic model of memory effects in sentence processing
A key component of research on human sentence processing is to characterize the
processing difficulty associated with the comprehension of words in context. Models that …
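The core idea of lossy-context surprisal can be sketched in a few lines: rather than conditioning on the true context, the comprehender conditions on a noisy memory trace, so difficulty is the surprisal of the word marginalized over possible traces. The distributions and trace labels below are invented toy values for illustration only, not the paper's model.

```python
import math

def lossy_context_surprisal(p_word_given_mem: dict, p_mem_given_ctx: dict) -> float:
    """-log2 of sum over memory traces m of P(word | m) * P(m | context)."""
    prob = sum(p_word_given_mem[m] * p for m, p in p_mem_given_ctx.items())
    return -math.log2(prob)

# Toy distributions: the context is remembered intact 70% of the time.
p_mem_given_ctx = {"intact": 0.7, "degraded": 0.3}
# The upcoming word is more predictable from an intact trace.
p_word_given_mem = {"intact": 0.5, "degraded": 0.1}

print(lossy_context_surprisal(p_word_given_mem, p_mem_given_ctx))
```

Memory effects fall out of the marginalization: the more the trace degrades, the more the effective word probability, and hence the predicted difficulty, departs from plain surprisal under the full context.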
Synchronous, but not entrained: exogenous and endogenous cortical rhythms of speech and language processing
Research on speech processing is often focused on a phenomenon termed “entrainment”,
whereby the cortex shadows rhythmic acoustic information with oscillatory activity …
Disentangling syntax and semantics in the brain with deep networks
The activations of language transformers like GPT-2 have been shown to linearly map onto
brain activity during speech comprehension. However, the nature of these activations …