A reproducible survey on word embeddings and ontology-based methods for word similarity: Linear combinations outperform the state of the art
Human similarity and relatedness judgements between concepts underlie most of cognitive
capabilities, such as categorisation, memory, decision-making and reasoning. For this …
Null it out: Guarding protected attributes by iterative nullspace projection
The ability to control for the kinds of information encoded in neural representation has a
variety of use cases, especially in light of the challenge of interpreting these models. We …
Conceptnet 5.5: An open multilingual graph of general knowledge
Abstract Machine learning about language can be improved by supplying it with specific
knowledge and sources of external information. We present here a new version of the linked …
Evaluating word embedding models: Methods and experimental results
Extensive evaluation on a large number of word embedding models for language
processing applications is conducted in this work. First, we introduce popular word …
Using the output embedding to improve language models
We study the topmost weight matrix of neural network language models. We show that this
matrix constitutes a valid word embedding. When training language models, we recommend …
The “Small World of Words” English word association norms for over 12,000 cue words
Word associations have been used widely in psychology, but the validity of their application
strongly depends on the number of cues included in the study and the extent to which they …
Learning gender-neutral word embeddings
Word embedding models have become a fundamental component in a wide range of
Natural Language Processing (NLP) applications. However, embeddings trained on human …
Don't count, predict! a systematic comparison of context-counting vs. context-predicting semantic vectors
Context-predicting models (more commonly known as embeddings or neural language
models) are the new kids on the distributional semantics block. Despite the buzz …
On the dimensionality of word embedding
In this paper, we provide a theoretical understanding of word embedding and its
dimensionality. Motivated by the unitary-invariance of word embedding, we propose the …
Rethinking embedding coupling in pre-trained language models
We re-evaluate the standard practice of sharing weights between input and output
embeddings in state-of-the-art pre-trained language models. We show that decoupled …