Language in brains, minds, and machines

G Tuckute, N Kanwisher… - Annual Review of …, 2024 - annualreviews.org
It has long been argued that only humans could produce and understand language. But
now, for the first time, artificial language models (LMs) achieve this feat. Here we survey the …

Short-text semantic similarity (STSS): Techniques, challenges and future perspectives

ZH Amur, Y Kwang Hooi, H Bhanbhro, K Dahri… - Applied Sciences, 2023 - mdpi.com
In natural language processing, short-text semantic similarity (STSS) is a very prominent
field. It has a significant impact on a broad range of applications, such as question …

A primer in BERTology: What we know about how BERT works

A Rogers, O Kovaleva, A Rumshisky - Transactions of the Association …, 2021 - direct.mit.edu
Transformer-based models have pushed the state of the art in many areas of NLP, but our
understanding of what is behind their success is still limited. This paper is the first survey of …

BERTology meets biology: Interpreting attention in protein language models

J Vig, A Madani, LR Varshney, C Xiong… - arXiv preprint arXiv …, 2020 - arxiv.org
Transformer architectures have proven to learn useful representations for protein
classification and generation tasks. However, these representations present challenges in …

Implicit representations of meaning in neural language models

BZ Li, M Nye, J Andreas - arXiv preprint arXiv:2106.00737, 2021 - arxiv.org
Does the effectiveness of neural language models derive entirely from accurate modeling of
surface word co-occurrence statistics, or do these models represent and reason about the …

Word meaning in minds and machines.

BM Lake, GL Murphy - Psychological Review, 2023 - psycnet.apa.org
Machines have achieved a broad and growing set of linguistic competencies,
thanks to recent progress in Natural Language Processing (NLP). Psychologists have …

From word types to tokens and back: A survey of approaches to word meaning representation and interpretation

M Apidianaki - Computational Linguistics, 2023 - direct.mit.edu
Vector-based word representation paradigms situate lexical meaning at different levels of
abstraction. Distributional and static embedding models generate a single vector per word …
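
The type/token distinction this survey draws can be made concrete in code. Below is a minimal sketch, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint (both illustrative choices, not prescribed by the survey): a static model assigns the ambiguous word "bank" one vector for every occurrence, whereas a contextual LM produces a distinct vector per token in context.

```python
# Minimal sketch (assumes Hugging Face transformers + the bert-base-uncased
# checkpoint, both illustrative choices): contrast a single type-level vector
# with per-occurrence token vectors for the ambiguous word "bank".
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentences = ["The river bank was muddy.", "She deposited cash at the bank."]
bank_id = tok.convert_tokens_to_ids("bank")

token_vecs = []
for s in sentences:
    inputs = tok(s, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, hidden_dim)
    pos = inputs.input_ids[0].tolist().index(bank_id)  # locate "bank"
    token_vecs.append(hidden[pos])

# A static embedding model would give "bank" the same vector in both
# sentences; a contextual LM yields two distinct vectors (similarity < 1).
sim = torch.nn.functional.cosine_similarity(token_vecs[0], token_vecs[1], dim=0)
print(f"cosine similarity of the two 'bank' tokens: {sim.item():.3f}")
```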

A comparative evaluation and analysis of three generations of Distributional Semantic Models

A Lenci, M Sahlgren, P Jeuniaux… - Language Resources …, 2022 - Springer
Distributional semantics has changed deeply over the last decades. First, predict models stole
the thunder from traditional count ones, and more recently both of them were replaced in …

Let's Play Mono-Poly: BERT Can Reveal Words' Polysemy Level and Partitionability into Senses

A Garí Soler, M Apidianaki - Transactions of the Association for …, 2021 - direct.mit.edu
Pre-trained language models (LMs) encode rich information about linguistic structure but
their knowledge about lexical polysemy remains unclear. We propose a novel experimental …

Distributional semantics

A Lenci, M Sahlgren - 2023 - books.google.com
Distributional semantics develops theories and methods to represent the meaning of natural
language expressions, with vectors encoding their statistical distribution in linguistic …
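
As a concrete illustration of the statistical view this book describes, here is a minimal count-based sketch (the toy corpus and window size are my own assumptions): each word is represented by the frequencies of its neighbors, so words appearing in similar contexts end up with similar vectors.

```python
# Minimal sketch of the count-based distributional idea: represent each word
# by the frequencies of the words co-occurring with it in a small window.
# The toy corpus and window size are illustrative assumptions.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the dog sat on the rug".split()
window = 2

vectors = defaultdict(Counter)
for i, word in enumerate(corpus):
    lo, hi = max(0, i - window), min(len(corpus), i + window + 1)
    for j in range(lo, hi):
        if j != i:
            vectors[word][corpus[j]] += 1

# "cat" and "dog" occur in near-identical contexts, so their count vectors
# are nearly identical; that overlap is what distributional models exploit.
print(vectors["cat"])
print(vectors["dog"])
```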