The better your syntax, the better your semantics? Probing pretrained language models for the English comparative correlative
Construction Grammar (CxG) is a paradigm from cognitive linguistics emphasising the
connection between syntax and semantics. Rather than rules that operate on lexical items, it …
Structural persistence in language models: Priming as a window into abstract language representations
We investigate the extent to which modern neural language models are susceptible to
structural priming, the phenomenon whereby the structure of a sentence makes the same …
Construction grammar provides unique insight into neural language models
Construction Grammar (CxG) has recently been used as the basis for probing studies that
have investigated the performance of large pretrained language models (PLMs) with respect …
A discerning several thousand judgments: GPT-3 rates the article + adjective + numeral + noun construction
K Mahowald - arXiv preprint arXiv:2301.12564, 2023 - arxiv.org
Knowledge of syntax includes knowledge of rare, idiosyncratic constructions. LLMs must
overcome frequency biases in order to master such constructions. In this study, I prompt GPT …
Structural priming demonstrates abstract grammatical representations in multilingual language models
Abstract grammatical knowledge (of parts of speech and grammatical patterns) is key to the
capacity for linguistic generalization in humans. But how abstract is grammatical knowledge …
Modeling Brain Representations of Words' Concreteness in Context Using GPT-2 and Human Ratings
The meaning of most words in language depends on their context. Understanding how the
human brain extracts contextualized meaning, and identifying where in the brain this takes …
Neural Generative Models and the Parallel Architecture of Language: A Critical Review and Outlook
According to the parallel architecture, syntactic and semantic information processing are two
separate streams that interact selectively during language comprehension. While …
Using collostructional analysis to evaluate BERT's representation of linguistic constructions
T Veenboer, J Bloem - Findings of the Association for …, 2023 - aclanthology.org
Collostructional analysis is a technique devised to find correlations between particular
words and linguistic constructions in order to analyse meaning associations of these …
Construction grammar and language models
Recent progress in deep learning and natural language processing has given rise to
powerful models that are primarily trained on a cloze-like task and show some evidence of …
Language Models Learn Rare Phenomena from Less Rare Phenomena: The Case of the Missing AANNs
Language models learn rare syntactic phenomena, but it has been argued that they rely on
rote memorization, as opposed to grammatical generalization. Training on a corpus of …