Integrating Present and Past in Unsupervised Continual Learning

Y Zhang, L Charlin, R Zemel, M Ren - arXiv preprint arXiv:2404.19132, 2024 - arxiv.org
We formulate a unifying framework for unsupervised continual learning (UCL), which
disentangles learning objectives that are specific to the present and the past data …
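The snippet only names the idea; as a purely illustrative sketch (the generic `loss_fn` and the weighting `alpha` are assumptions, not the paper's actual formulation), a present/past-disentangled objective might combine one term fit on current data with one fit on retained past data:

```python
def ucl_objective(loss_fn, current_batch, memory_batch, alpha=1.0):
    """Illustrative composite objective for unsupervised continual
    learning. The concrete loss is a placeholder callable; the paper
    derives its own decomposition of present- and past-specific terms."""
    # Plasticity term: learn from the present data stream.
    present = loss_fn(current_batch)
    # Stability term: keep fitting past data (e.g. drawn from a replay buffer).
    past = loss_fn(memory_batch)
    return present + alpha * past
```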

Watch Your Step: Optimal Retrieval for Continual Learning at Scale

T Hickok, D Kudithipudi - arXiv preprint arXiv:2404.10758, 2024 - arxiv.org
One of the most widely used approaches in continual learning is referred to as replay.
Replay methods support interleaved learning by storing past experiences in a replay buffer …
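The replay mechanism the snippet describes can be sketched in a few lines; this is a minimal, assumed implementation with reservoir sampling as the buffer policy (the paper studies which retrieval strategy is optimal at scale, which this sketch does not reproduce):

```python
import random

class ReplayBuffer:
    """Fixed-capacity store of past experiences. Reservoir sampling
    gives every example seen so far an equal chance of being retained,
    a common baseline policy."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = []
        self.n_seen = 0  # total examples observed in the stream

    def add(self, example):
        self.n_seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            # Replace a stored item with probability capacity / n_seen.
            j = random.randrange(self.n_seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k: int):
        # Retrieve past experiences to interleave with the current batch.
        return random.sample(self.data, min(k, len(self.data)))
```

During training, each incoming batch would be interleaved with `buffer.sample(k)` so that gradient updates mix present and past experiences, which is the interleaved learning the snippet refers to.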

Understanding the Causes of Forgetting in Continually Learned Neural Networks

A Soutif-Cormerais - 2024 - ddd.uab.cat
The use of deep learning has become increasingly popular in recent years in many
application fields, such as computer vision and natural language processing …

Stabilizing Zero-Shot Prediction: A Novel Antidote to Forgetting in Continual Vision-Language Tasks

Z Gao, X Zhang, K Xu, X Mao, H Wang - The Thirty-eighth Annual … - openreview.net
Continual learning (CL) empowers pre-trained vision-language (VL) models to efficiently
adapt to a sequence of downstream tasks. However, these models often encounter …

Optimal Retrieval for Continual Learning at Scale

T Hickok - 2024 - search.proquest.com
In recent years, deep neural networks have emerged as extremely scalable machine
learning models. These models are often trained on billions of samples; however, as gaps in …

Memory Head for Pre-Trained Backbones in Continual Learning

M Tiezzi, F Becattini, S Marullo, S Melacci - researchgate.net
This paper focuses on the role of classification heads for pre-trained backbones in the
context of continual learning. A novel neuron model is proposed as the basic constituent of what …