Continual lifelong learning in natural language processing: A survey
Continual learning (CL) aims to enable information systems to learn from a continuous data
stream across time. However, it is difficult for existing deep learning architectures to learn a …
Bias in data‐driven artificial intelligence systems—An introductory survey
Artificial Intelligence (AI)‐based systems are widely employed nowadays to make decisions
that have far‐reaching impact on individuals and society. Their decisions might affect …
SemEval-2020 task 1: Unsupervised lexical semantic change detection
Lexical Semantic Change detection, i.e., the task of identifying words that change meaning
over time, is a very active research area, with applications in NLP, lexicography, and …
Distributional semantics and linguistic theory
G Boleda - Annual Review of Linguistics, 2020 - annualreviews.org
Distributional semantics provides multidimensional, graded, empirically induced word
representations that successfully capture many aspects of meaning in natural languages, as …
Time masking for temporal language models
Our world is constantly evolving, and so is the content on the web. Consequently, our
languages, often said to mirror the world, are dynamic in nature. However, most current …
Diachronic sense modeling with deep contextualized word embeddings: An ecological view
Diachronic word embeddings have been widely used in detecting temporal changes.
However, existing methods face the meaning conflation deficiency by representing a word …
Time-out: Temporal referencing for robust modeling of lexical semantic change
State-of-the-art models of lexical semantic change detection suffer from noise stemming from
vector space alignment. We have empirically tested the Temporal Referencing method for …
Temporal attention for language models
Pretrained language models based on the transformer architecture have shown great
success in NLP. Textual training data often comes from the web and is thus tagged with time …
Embedding regression: Models for context-specific description and inference
Social scientists commonly seek to make statements about how word use varies over
circumstances—including time, partisan identity, or some other document-level covariate …
Room to Glo: A systematic comparison of semantic change detection approaches with word embeddings
Word embeddings are increasingly used for the automatic detection of semantic change; yet,
a robust evaluation and systematic comparison of the choices involved has been lacking …