FlauBERT: Unsupervised language model pre-training for French

H Le, L Vial, J Frej, V Segonne, M Coavoux… - arXiv preprint arXiv …, 2019 - arxiv.org
Language models have become a key step in achieving state-of-the-art results in many
different Natural Language Processing (NLP) tasks. Leveraging the huge amount of …

Does BERT make any sense? Interpretable word sense disambiguation with contextualized embeddings

G Wiedemann, S Remus, A Chawla… - arXiv preprint arXiv …, 2019 - arxiv.org
Contextualized word embeddings (CWE) such as those provided by ELMo (Peters et al., 2018),
Flair NLP (Akbik et al., 2018), or BERT (Devlin et al., 2019) are a major recent innovation in …

Sense vocabulary compression through the semantic knowledge of wordnet for neural word sense disambiguation

L Vial, B Lecouteux, D Schwab - arXiv preprint arXiv:1905.05677, 2019 - arxiv.org
In this article, we tackle the issue of the limited quantity of manually sense-annotated corpora
for the task of word sense disambiguation by exploiting the semantic relationships between …

Analysis and evaluation of language models for word sense disambiguation

D Loureiro, K Rezaee, MT Pilehvar… - Computational …, 2021 - direct.mit.edu
Transformer-based language models have taken many fields in NLP by storm. BERT and its
derivatives dominate most of the existing evaluation benchmarks, including those for Word …

Neural Sequence-to-Sequence Modeling with Attention by Leveraging Deep Learning Architectures for Enhanced Contextual Understanding in Abstractive Text …

BC Challagundla, C Peddavenkatagari - arXiv preprint arXiv:2404.08685, 2024 - arxiv.org
Automatic text summarization (TS) plays a pivotal role in condensing large volumes of
information into concise, coherent summaries, facilitating efficient information retrieval and …

A synset relation-enhanced framework with a try-again mechanism for word sense disambiguation

M Wang, Y Wang - Proceedings of the 2020 conference on …, 2020 - aclanthology.org
Contextual embeddings have proven overwhelmingly effective for the task of Word Sense
Disambiguation (WSD) compared with other sense representation techniques. However …

Improved word sense disambiguation with enhanced sense representations

Y Song, XC Ong, HT Ng, Q Lin - Findings of the Association for …, 2021 - aclanthology.org
Current state-of-the-art supervised word sense disambiguation (WSD) systems (such as
GlossBERT and the bi-encoder model) yield surprisingly good results by purely leveraging pre …

LMMS reloaded: Transformer-based sense embeddings for disambiguation and beyond

D Loureiro, AM Jorge, J Camacho-Collados - Artificial Intelligence, 2022 - Elsevier
Distributional semantics based on neural approaches is a cornerstone of Natural Language
Processing, with surprising connections to human meaning representation as well. Recent …

Sparsity makes sense: Word sense disambiguation using sparse contextualized word representations

G Berend - Proceedings of the 2020 Conference on Empirical …, 2020 - aclanthology.org
In this paper, we demonstrate that by utilizing sparse word representations, it becomes
possible to surpass the results of more complex task-specific models on the task of fine …

WiC-TSV: An evaluation benchmark for target sense verification of words in context

A Breit, A Revenko, K Rezaee, MT Pilehvar… - arXiv preprint arXiv …, 2020 - arxiv.org
We present WiC-TSV, a new multi-domain evaluation benchmark for Word Sense
Disambiguation. More specifically, we introduce a framework for Target Sense Verification of …