The better your syntax, the better your semantics? Probing pretrained language models for the English comparative correlative

L Weissweiler, V Hofmann, A Köksal… - arXiv preprint arXiv …, 2022 - arxiv.org
Construction Grammar (CxG) is a paradigm from cognitive linguistics emphasising the
connection between syntax and semantics. Rather than rules that operate on lexical items, it …

Structural persistence in language models: Priming as a window into abstract language representations

A Sinclair, J Jumelet, W Zuidema… - Transactions of the …, 2022 - direct.mit.edu
We investigate the extent to which modern neural language models are susceptible to
structural priming, the phenomenon whereby the structure of a sentence makes the same …
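A minimal sketch of how a priming probe of this kind can be run, assuming a Hugging Face GPT-2 model and hypothetical dative-alternation items; it checks whether a structurally congruent prime raises the log-probability of a target sentence relative to an incongruent prime (an illustration of the general method, not the authors' exact protocol):

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def target_logprob(prime: str, target: str) -> float:
    """Sum of target-token log-probabilities, conditioned on the prime sentence."""
    prime_ids = tokenizer(prime, return_tensors="pt").input_ids
    target_ids = tokenizer(" " + target, return_tensors="pt").input_ids
    input_ids = torch.cat([prime_ids, target_ids], dim=1)
    with torch.no_grad():
        log_probs = torch.log_softmax(model(input_ids).logits, dim=-1)
    offset = prime_ids.shape[1]
    total = 0.0
    for i in range(target_ids.shape[1]):
        # logits at position p predict the token at position p + 1
        total += log_probs[0, offset + i - 1, input_ids[0, offset + i]].item()
    return total

# Hypothetical items: a prepositional-dative target preceded by a congruent
# (prepositional-dative) vs. an incongruent (double-object) prime.
target = "The girl sent a letter to her friend."
congruent = "The man gave a book to the boy."
incongruent = "The man gave the boy a book."
effect = target_logprob(congruent, target) - target_logprob(incongruent, target)
print(f"Priming effect (log-probability difference): {effect:.3f}")
```

A positive difference, aggregated over many item pairs, is the kind of evidence such studies take as pointing to abstract structural representations.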

Construction grammar provides unique insight into neural language models

L Weissweiler, T He, N Otani, DR Mortensen… - arXiv preprint arXiv …, 2023 - arxiv.org
Construction Grammar (CxG) has recently been used as the basis for probing studies that
have investigated the performance of large pretrained language models (PLMs) with respect …

A discerning several thousand judgments: GPT-3 rates the article + adjective + numeral + noun construction

K Mahowald - arXiv preprint arXiv:2301.12564, 2023 - arxiv.org
Knowledge of syntax includes knowledge of rare, idiosyncratic constructions. LLMs must
overcome frequency biases in order to master such constructions. In this study, I prompt GPT …

Structural priming demonstrates abstract grammatical representations in multilingual language models

JA Michaelov, C Arnett, TA Chang… - arXiv preprint arXiv …, 2023 - arxiv.org
Abstract grammatical knowledge (of parts of speech and grammatical patterns) is key to the
capacity for linguistic generalization in humans. But how abstract is grammatical knowledge …

Modeling Brain Representations of Words' Concreteness in Context Using GPT-2 and Human Ratings

A Bruera, Y Tao, A Anderson, D Çokal… - Cognitive …, 2023 - Wiley Online Library
The meaning of most words in language depends on their context. Understanding how the
human brain extracts contextualized meaning, and identifying where in the brain this takes …

Neural Generative Models and the Parallel Architecture of Language: A Critical Review and Outlook

G Rambelli, E Chersoni, D Testa… - Topics in Cognitive …, 2024 - Wiley Online Library
According to the parallel architecture, syntactic and semantic information processing are two
separate streams that interact selectively during language comprehension. While …

Using collostructional analysis to evaluate BERT's representation of linguistic constructions

T Veenboer, J Bloem - Findings of the Association for …, 2023 - aclanthology.org
Collostructional analysis is a technique devised to find correlations between particular
words and linguistic constructions in order to analyse meaning associations of these …
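As a rough illustration of the statistical core of this technique, a minimal sketch of simple collexeme analysis, assuming hypothetical corpus counts and using SciPy's Fisher's exact test; collostruction strength is conventionally reported as the negative log10 p-value (the paper pairs this kind of analysis with BERT's representations, which is not shown here):

```python
import math
from scipy.stats import fisher_exact

def collostruction_strength(word_in_cx: int, word_elsewhere: int,
                            others_in_cx: int, others_elsewhere: int) -> float:
    """Negative log10 p-value of a one-sided Fisher's exact test on the
    2x2 word-by-construction contingency table (higher = stronger attraction)."""
    table = [[word_in_cx, word_elsewhere],
             [others_in_cx, others_elsewhere]]
    _, p = fisher_exact(table, alternative="greater")
    return -math.log10(max(p, 1e-300))  # guard against floating-point underflow

# Hypothetical counts: how often one verb occurs in the ditransitive
# construction vs. elsewhere, against all other verbs in the corpus.
print(collostruction_strength(word_in_cx=30, word_elsewhere=70,
                              others_in_cx=120, others_elsewhere=9780))
```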

Construction grammar and language models

HT Madabushi, L Romain, P Milin, D Divjak - arXiv preprint arXiv …, 2023 - arxiv.org
Recent progress in deep learning and natural language processing has given rise to
powerful models that are primarily trained on a cloze-like task and show some evidence of …

Language Models Learn Rare Phenomena from Less Rare Phenomena: The Case of the Missing AANNs

K Misra, K Mahowald - arXiv preprint arXiv:2403.19827, 2024 - arxiv.org
Language models learn rare syntactic phenomena, but it has been argued that they rely on
rote memorization, as opposed to grammatical generalization. Training on a corpus of …