WiC: the word-in-context dataset for evaluating context-sensitive meaning representations
By design, word embeddings are unable to model the dynamic nature of words' semantics,
i.e., the property of words to correspond to potentially different meanings. To address this …
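The baseline family this dataset probes can be sketched in a few lines: extract the contextualised vector of the target word in each sentence and threshold the cosine similarity between the two. The WiC paper evaluates baselines of this shape; the specific model (bert-base-uncased), mean-pooling over subtokens, and the 0.5 threshold below are illustrative assumptions, not the paper's reported setup.

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def target_embedding(sentence, start, end):
    # Mean-pool hidden states of the subtokens covering chars [start, end).
    enc = tokenizer(sentence, return_tensors="pt", return_offsets_mapping=True)
    offsets = enc.pop("offset_mapping")[0].tolist()
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]
    mask = torch.tensor([s < end and e > start and e > s for s, e in offsets])
    return hidden[mask].mean(dim=0)

def same_sense(s1, span1, s2, span2, threshold=0.5):
    v1 = target_embedding(s1, *span1)
    v2 = target_embedding(s2, *span2)
    return torch.cosine_similarity(v1, v2, dim=0).item() >= threshold

# "bank" in a financial vs. a riverside context: expected to score low.
print(same_sense("He deposited cash at the bank.", (25, 29),
                 "They sat on the bank of the river.", (16, 20)))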
From word to sense embeddings: A survey on vector representations of meaning
Over the past years, distributed semantic representations have proved to be effective and
flexible keepers of prior knowledge to be integrated into downstream applications. This …
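The one-vector-per-word limitation that motivates the move from word to sense embeddings is easy to observe with any static model. A minimal sketch, assuming gensim and one of its small pretrained GloVe downloads (the model name is an arbitrary choice for illustration):

import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")   # downloads on first use

# Exactly one row per word form, however many senses the word has ...
print(vectors["bank"].shape)                   # (50,)
# ... so the neighbours of "bank" conflate its financial and river senses.
print(vectors.most_similar("bank", topn=10))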
Natural language understanding: Instructions for (present and future) use
R Navigli - Proceedings of the 27th International Joint Conference …, 2018 - iris.uniroma1.it
In this paper I look at Natural Language Understanding, an area of Natural Language
Processing aimed at making sense of text, through the lens of a visionary future: what do we …
From word types to tokens and back: A survey of approaches to word meaning representation and interpretation
M Apidianaki - Computational Linguistics, 2023 - direct.mit.edu
Vector-based word representation paradigms situate lexical meaning at different levels of
abstraction. Distributional and static embedding models generate a single vector per word …
Bridge text and knowledge by learning multi-prototype entity mention embedding
Integrating text and knowledge into a unified semantic space has attracted significant
research interests recently. However, the ambiguity in the common space remains a …
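A generic way to obtain multiple prototypes per mention, in the spirit of this line of work (a sketch of the general multi-prototype recipe, not necessarily this paper's training procedure): cluster the context vectors of an ambiguous mention and keep one centroid per cluster. The two-cluster toy data below stand in for real context embeddings.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-in context embeddings for an ambiguous mention such as "Apple":
# two underlying senses, simulated here as two Gaussian clusters.
contexts = np.vstack([rng.normal(0.0, 0.1, (50, 16)),
                      rng.normal(1.0, 0.1, (50, 16))])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(contexts)
prototypes = kmeans.cluster_centers_   # one embedding per sense prototype

def nearest_prototype(context_vec):
    # Assign a new occurrence of the mention to its closest prototype.
    return int(np.argmin(np.linalg.norm(prototypes - context_vec, axis=1)))

print(nearest_prototype(rng.normal(1.0, 0.1, 16)))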
Math-word embedding in math search and semantic extraction
Word embedding, which represents individual words with semantically fixed-length vectors,
has made it possible to successfully apply deep learning to natural language processing …
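The basic mechanism the snippet alludes to carries over to formulas directly: tokenise math expressions and train a standard embedding model on the resulting sequences. A minimal sketch with gensim's word2vec; the toy corpus and its naive symbol-level tokenisation are invented for illustration, and real formula tokenisation is considerably more involved.

from gensim.models import Word2Vec

corpus = [
    ["f", "(", "x", ")", "=", "a", "x", "^", "2", "+", "b", "x", "+", "c"],
    ["y", "=", "m", "x", "+", "b"],
    ["sin", "(", "x", ")", "^", "2", "+", "cos", "(", "x", ")", "^", "2", "=", "1"],
]

model = Word2Vec(corpus, vector_size=32, window=3, min_count=1, epochs=50, seed=0)
print(model.wv.most_similar("x", topn=3))   # symbols used in similar contexts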
LMMS reloaded: Transformer-based sense embeddings for disambiguation and beyond
Distributional semantics based on neural approaches is a cornerstone of Natural Language
Processing, with surprising connections to human meaning representation as well. Recent …
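The disambiguation scheme that LMMS-style sense embeddings support is nearest-neighbour matching: each sense vector is the average of the contextual embeddings of its annotated occurrences, and a new occurrence is labelled with the closest sense. A minimal sketch, with toy two-dimensional vectors standing in for real transformer states:

import numpy as np

def build_sense_embeddings(annotated):
    # annotated: sense key -> contextual vectors of that sense's occurrences
    return {sense: np.mean(vecs, axis=0) for sense, vecs in annotated.items()}

def disambiguate(context_vec, sense_embeddings):
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(sense_embeddings, key=lambda s: cos(context_vec, sense_embeddings[s]))

annotated = {
    "bank(finance)": [np.array([1.0, 0.1]), np.array([0.9, 0.2])],
    "bank(river)":   [np.array([0.1, 1.0]), np.array([0.2, 0.8])],
}
senses = build_sense_embeddings(annotated)
print(disambiguate(np.array([0.85, 0.15]), senses))   # -> "bank(finance)"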
Concept representation by learning explicit and implicit concept couplings
Generating the precise semantic representation of a word or concept is a fundamental task
in natural language processing. Recent studies which incorporate semantic knowledge into …
What does this word mean? Explaining contextualized embeddings with natural language definition
Contextualized word embeddings have boosted many NLP tasks compared with traditional
static word embeddings. However, a word with a specific sense may have different …
Reducing disambiguation biases in NMT by leveraging explicit word sense information
Recent studies have shed some light on a common pitfall of Neural Machine Translation
(NMT) models, stemming from their struggle to disambiguate polysemous words without …
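One straightforward way to expose explicit sense information to an NMT encoder, offered as a generic sketch rather than this paper's exact mechanism, is to tag ambiguous source words with the sense a WSD component assigns them, so the two readings of a word become distinct source symbols. The WSD step and the sense labels below are hypothetical stubs.

def annotate_with_senses(tokens, wsd):
    # Append a sense tag to every token the WSD component can resolve.
    out = []
    for i, tok in enumerate(tokens):
        sense = wsd(tokens, i)              # a sense label, or None
        out.append(f"{tok}|{sense}" if sense else tok)
    return out

def toy_wsd(tokens, i):
    # Hypothetical stub: only "bank" is treated as ambiguous here.
    if tokens[i] == "bank":
        return "bank.land" if "river" in tokens else "bank.institution"
    return None

print(annotate_with_senses("they walked along the river bank".split(), toy_wsd))
# ['they', 'walked', 'along', 'the', 'river', 'bank|bank.land']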