Continual lifelong learning in natural language processing: A survey

M Biesialska, K Biesialska, MR Costa-Jussa - arXiv preprint arXiv …, 2020 - arxiv.org
Continual learning (CL) aims to enable information systems to learn from a continuous data
stream across time. However, it is difficult for existing deep learning architectures to learn a …

Bias in data‐driven artificial intelligence systems—An introductory survey

E Ntoutsi, P Fafalios, U Gadiraju… - … : Data Mining and …, 2020 - Wiley Online Library
Artificial Intelligence (AI)‐based systems are widely employed nowadays to make decisions
that have far‐reaching impact on individuals and society. Their decisions might affect …

SemEval-2020 task 1: Unsupervised lexical semantic change detection

D Schlechtweg, B McGillivray, S Hengchen… - arXiv preprint arXiv …, 2020 - arxiv.org
Lexical Semantic Change detection, i.e., the task of identifying words that change meaning
over time, is a very active research area, with applications in NLP, lexicography, and …

Distributional semantics and linguistic theory

G Boleda - Annual Review of Linguistics, 2020 - annualreviews.org
Distributional semantics provides multidimensional, graded, empirically induced word
representations that successfully capture many aspects of meaning in natural languages, as …

Time masking for temporal language models

GD Rosin, I Guy, K Radinsky - … conference on Web search and data …, 2022 - dl.acm.org
Our world is constantly evolving, and so is the content on the web. Consequently, our
languages, often said to mirror the world, are dynamic in nature. However, most current …

Diachronic sense modeling with deep contextualized word embeddings: An ecological view

R Hu, S Li, S Liang - Proceedings of the 57th annual meeting of …, 2019 - aclanthology.org
Diachronic word embeddings have been widely used in detecting temporal changes.
However, existing methods face the meaning conflation deficiency by representing a word …

Time-out: Temporal referencing for robust modeling of lexical semantic change

H Dubossarsky, S Hengchen, N Tahmasebi… - arXiv preprint arXiv …, 2019 - arxiv.org
State-of-the-art models of lexical semantic change detection suffer from noise stemming from
vector space alignment. We have empirically tested the Temporal Referencing method for …

Temporal attention for language models

GD Rosin, K Radinsky - arXiv preprint arXiv:2202.02093, 2022 - arxiv.org
Pretrained language models based on the transformer architecture have shown great
success in NLP. Textual training data often come from the web and are thus tagged with time …

Embedding regression: Models for context-specific description and inference

PL Rodriguez, A Spirling, BM Stewart - American Political Science …, 2023 - cambridge.org
Social scientists commonly seek to make statements about how word use varies over
circumstances—including time, partisan identity, or some other document-level covariate …

Room to Glo: A systematic comparison of semantic change detection approaches with word embeddings

P Shoemark, FF Liza, D Nguyen, SA Hale… - 2019 - repository.cam.ac.uk
Word embeddings are increasingly used for the automatic detection of semantic change; yet,
a robust evaluation and systematic comparison of the choices involved has been lacking …