Language in brains, minds, and machines
It has long been argued that only humans could produce and understand language. But
now, for the first time, artificial language models (LMs) achieve this feat. Here we survey the …
Short-text semantic similarity (STSS): Techniques, challenges and future perspectives
In natural language processing, short-text semantic similarity (STSS) is a very prominent
field. It has a significant impact on a broad range of applications, such as question …
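A minimal sketch of one common STSS baseline, for orientation only: represent each short text as a TF-IDF vector and score the pair with cosine similarity. The example texts and this particular pipeline are illustrative assumptions, not taken from the survey.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Two short texts whose similarity we want to score (toy examples).
texts = [
    "How do I reset my password?",
    "What is the procedure for recovering a lost password?",
]

# Fit a shared TF-IDF vocabulary over both texts and vectorize them.
tfidf = TfidfVectorizer().fit_transform(texts)

# Cosine similarity of the two row vectors; higher means more lexical overlap.
score = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
print(f"similarity = {score:.3f}")

Purely lexical baselines like this one miss paraphrases with little word overlap, which is one reason the survey also covers corpus-based and embedding-based methods.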
A primer in BERTology: What we know about how BERT works
Transformer-based models have pushed state of the art in many areas of NLP, but our
understanding of what is behind their success is still limited. This paper is the first survey of …
BERTology meets biology: Interpreting attention in protein language models
Transformer architectures have proven to learn useful representations for protein
classification and generation tasks. However, these representations present challenges in …
Implicit representations of meaning in neural language models
Does the effectiveness of neural language models derive entirely from accurate modeling of
surface word co-occurrence statistics, or do these models represent and reason about the …
Word meaning in minds and machines.
Machines have achieved a broad and growing set of linguistic competencies,
thanks to recent progress in Natural Language Processing (NLP). Psychologists have …
From word types to tokens and back: A survey of approaches to word meaning representation and interpretation
M. Apidianaki, Computational Linguistics, 2023
Vector-based word representation paradigms situate lexical meaning at different levels of
abstraction. Distributional and static embedding models generate a single vector per word …
A comparative evaluation and analysis of three generations of Distributional Semantic Models
Distributional semantics has deeply changed in the last decades. First, predict models stole
the thunder from traditional count ones, and more recently both of them were replaced in …
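As a rough sketch of the first, "count" generation mentioned above (assumptions: a toy corpus, a symmetric window of 2, and raw counts with no PPMI weighting or dimensionality reduction):

import numpy as np

# Toy corpus; real count models are built from large corpora.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
index = {w: i for i, w in enumerate(vocab)}

# Word-by-word co-occurrence counts within a symmetric window of 2.
counts = np.zeros((len(vocab), len(vocab)))
window = 2
for i, w in enumerate(corpus):
    lo, hi = max(0, i - window), min(len(corpus), i + window + 1)
    for j in range(lo, hi):
        if j != i:
            counts[index[w], index[corpus[j]]] += 1

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# "cat" and "dog" occur in near-identical contexts, so their
# co-occurrence vectors come out similar.
print(cosine(counts[index["cat"]], counts[index["dog"]]))

Predict models (e.g., word2vec) learn such vectors by optimizing a prediction objective instead of counting, and contextual models replace the single vector per word type with context-dependent token vectors.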
Let's Play Mono-Poly: BERT Can Reveal Words' Polysemy Level and Partitionability into Senses
Pre-trained language models (LMs) encode rich information about linguistic structure but
their knowledge about lexical polysemy remains unclear. We propose a novel experimental …
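A hedged sketch of the phenomenon the paper probes, not its actual experimental protocol: with a pre-trained BERT, the same word type receives different contextual vectors in different sense contexts. The model name, the sentences, and the single-subword lookup are assumptions for illustration.

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def token_vector(sentence, word):
    # Last-layer hidden state of the target word (assumed to be one subword).
    enc = tokenizer(sentence, return_tensors="pt")
    idx = enc["input_ids"][0].tolist().index(
        tokenizer.convert_tokens_to_ids(word))
    with torch.no_grad():
        return model(**enc).last_hidden_state[0, idx]

v_river = token_vector("she sat on the bank of the river", "bank")
v_money = token_vector("she deposited money at the bank", "bank")

# A lower cosine similarity suggests the two contexts pull "bank"
# toward different senses; monosemous words tend to score higher.
print(torch.cosine_similarity(v_river, v_money, dim=0).item())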
Distributional semantics
A. Lenci and M. Sahlgren, 2023
Distributional semantics develops theories and methods to represent the meaning of natural
language expressions, with vectors encoding their statistical distribution in linguistic …