WiC: the word-in-context dataset for evaluating context-sensitive meaning representations
By design, word embeddings are unable to model the dynamic nature of words' semantics,
i.e., the property of words to correspond to potentially different meanings. To address this …
From word to sense embeddings: A survey on vector representations of meaning
Over the past years, distributed semantic representations have proved to be effective and
flexible keepers of prior knowledge to be integrated into downstream applications. This …
From word types to tokens and back: A survey of approaches to word meaning representation and interpretation
M Apidianaki - Computational Linguistics, 2023 - direct.mit.edu
Vector-based word representation paradigms situate lexical meaning at different levels of
abstraction. Distributional and static embedding models generate a single vector per word …
Natural language understanding: Instructions for (present and future) use
R Navigli - Proceedings of the 27th International Joint Conference …, 2018 - iris.uniroma1.it
In this paper I look at Natural Language Understanding, an area of Natural Language
Processing aimed at making sense of text, through the lens of a visionary future: what do we …
Bridge text and knowledge by learning multi-prototype entity mention embedding
Integrating text and knowledge into a unified semantic space has attracted significant
research interest recently. However, the ambiguity in the common space remains a …
Math-word embedding in math search and semantic extraction
Word embedding, which represents individual words with semantically fixed-length vectors,
has made it possible to successfully apply deep learning to natural language processing …
LMMS reloaded: Transformer-based sense embeddings for disambiguation and beyond
Distributional semantics based on neural approaches is a cornerstone of Natural Language
Processing, with surprising connections to human meaning representation as well. Recent …
What does this word mean? Explaining contextualized embeddings with natural language definition
Contextualized word embeddings have boosted many NLP tasks compared with traditional
static word embeddings. However, the word with a specific sense may have different …
LU-BZU at SemEval-2021 Task 2: Word2Vec and Lemma2Vec performance in Arabic Word-in-Context disambiguation
This paper presents a set of experiments to evaluate and compare the performance of
CBOW Word2Vec and Lemma2Vec models for Arabic Word-in-Context (WiC) …
Concept representation by learning explicit and implicit concept couplings
Generating the precise semantic representation of a word or concept is a fundamental task
in natural language processing. Recent studies which incorporate semantic knowledge into …