WiC: The word-in-context dataset for evaluating context-sensitive meaning representations

MT Pilehvar, J Camacho-Collados - arXiv preprint arXiv:1808.09121, 2018 - arxiv.org
By design, word embeddings are unable to model the dynamic nature of words' semantics,
i.e., the property of words to correspond to potentially different meanings. To address this …

From word to sense embeddings: A survey on vector representations of meaning

J Camacho-Collados, MT Pilehvar - Journal of Artificial Intelligence …, 2018 - jair.org
Over the past years, distributed semantic representations have proved to be effective and
flexible keepers of prior knowledge to be integrated into downstream applications. This …

Natural language understanding: Instructions for (present and future) use

R Navigli - Proceedings of the 27th International Joint Conference …, 2018 - iris.uniroma1.it
In this paper I look at Natural Language Understanding, an area of Natural Language
Processing aimed at making sense of text, through the lens of a visionary future: what do we …

From word types to tokens and back: A survey of approaches to word meaning representation and interpretation

M Apidianaki - Computational Linguistics, 2023 - direct.mit.edu
Vector-based word representation paradigms situate lexical meaning at different levels of
abstraction. Distributional and static embedding models generate a single vector per word …

Bridge text and knowledge by learning multi-prototype entity mention embedding

Y Cao, L Huang, H Ji, X Chen, J Li - 2017 - ink.library.smu.edu.sg
Integrating text and knowledge into a unified semantic space has attracted significant
research interests recently. However, the ambiguity in the common space remains a …

Math-word embedding in math search and semantic extraction

A Greiner-Petter, A Youssef, T Ruas, BR Miller… - Scientometrics, 2020 - Springer
Word embedding, which represents individual words with semantically fixed-length vectors,
has made it possible to successfully apply deep learning to natural language processing …

LMMS reloaded: Transformer-based sense embeddings for disambiguation and beyond

D Loureiro, AM Jorge, J Camacho-Collados - Artificial Intelligence, 2022 - Elsevier
Distributional semantics based on neural approaches is a cornerstone of Natural Language
Processing, with surprising connections to human meaning representation as well. Recent …

Concept representation by learning explicit and implicit concept couplings

W Lu, Y Zhang, S Wang, H Huang, Q Liu… - IEEE Intelligent …, 2020 - ieeexplore.ieee.org
Generating the precise semantic representation of a word or concept is a fundamental task
in natural language processing. Recent studies which incorporate semantic knowledge into …

What does this word mean? Explaining contextualized embeddings with natural language definition

TY Chang, YN Chen - Proceedings of the 2019 Conference on …, 2019 - aclanthology.org
Contextualized word embeddings have boosted many NLP tasks compared with traditional
static word embeddings. However, the word with a specific sense may have different …

Reducing disambiguation biases in NMT by leveraging explicit word sense information

N Campolungo, T Pasini, D Emelin… - Proceedings of the 2022 …, 2022 - aclanthology.org
Recent studies have shed some light on a common pitfall of Neural Machine Translation
(NMT) models, stemming from their struggle to disambiguate polysemous words without …