A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

A comprehensive survey of forgetting in deep learning beyond continual learning

Z Wang, E Yang, L Shen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Forgetting refers to the loss or deterioration of previously acquired knowledge. While
existing surveys on forgetting have primarily focused on continual learning, forgetting is a …
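As a concrete reference point for this line of work, here is a minimal sketch of how forgetting is commonly quantified in continual-learning evaluations: the drop from a task's best-ever accuracy to its accuracy after the final training stage, averaged over earlier tasks. The accuracy matrix below is hypothetical, for illustration only.

```python
import numpy as np

# acc[i][j] = accuracy on task j after finishing training on task i
# (hypothetical results matrix: rows = training stage, cols = task)
acc = np.array([
    [0.95, 0.00, 0.00],
    [0.80, 0.92, 0.00],
    [0.70, 0.85, 0.90],
])

T = acc.shape[0]
# Average forgetting: for each earlier task, the drop from its best
# accuracy at any prior stage to its accuracy after the last task.
forgetting = np.mean([acc[:T - 1, j].max() - acc[T - 1, j] for j in range(T - 1)])
print(f"average forgetting: {forgetting:.3f}")  # (0.25 + 0.07) / 2 = 0.160
```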

SLCA: Slow learner with classifier alignment for continual learning on a pre-trained model

G Zhang, L Wang, G Kang… - Proceedings of the …, 2023 - openaccess.thecvf.com
The goal of continual learning is to improve the performance of recognition models in
learning sequentially arrived data. Although most existing works are established on the …
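The "slow learner" in the title refers to updating the pre-trained backbone far more gently than the newly added classifier head, so the pre-trained representation is not overwritten by each task. A minimal PyTorch sketch of that idea, with torchvision's resnet18 standing in for the pre-trained model and illustrative learning rates (not the paper's hyperparameters):

```python
import torch
from torchvision.models import resnet18  # stand-in for a pre-trained backbone

model = resnet18(weights="IMAGENET1K_V1")

# Split parameters: everything except the final "fc" head is backbone.
backbone_params = [p for n, p in model.named_parameters() if not n.startswith("fc")]
head_params = [p for n, p in model.named_parameters() if n.startswith("fc")]

optimizer = torch.optim.SGD([
    {"params": backbone_params, "lr": 1e-4},  # slow: preserve pre-trained features
    {"params": head_params, "lr": 1e-2},      # fast: fit the new classes
], momentum=0.9)
```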

Hierarchical decomposition of prompt-based continual learning: Rethinking obscured sub-optimality

L Wang, J **e, X Zhang, M Huang… - Advances in Neural …, 2024 - proceedings.neurips.cc
Prompt-based continual learning is an emerging direction in leveraging pre-trained
knowledge for downstream continual learning, and has almost reached the performance …
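A minimal sketch of the general setup prompt-based methods share: a handful of learnable prompt tokens are prepended to the token sequence of a frozen pre-trained encoder, so only the prompts (and typically a classifier) are trained on the stream. The encoder here is a placeholder module, and the dimensions are assumptions:

```python
import torch
import torch.nn as nn

class PromptedEncoder(nn.Module):
    """Sketch: prepend learnable prompt tokens to a frozen encoder's input."""
    def __init__(self, encoder: nn.Module, embed_dim: int = 768, n_prompts: int = 10):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():
            p.requires_grad = False  # pre-trained weights stay frozen
        self.prompts = nn.Parameter(torch.randn(n_prompts, embed_dim) * 0.02)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, embed_dim) patch/token embeddings
        prompts = self.prompts.unsqueeze(0).expand(tokens.size(0), -1, -1)
        return self.encoder(torch.cat([prompts, tokens], dim=1))
```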

Self-supervised learning is more robust to dataset imbalance

H Liu, JZ HaoChen, A Gaidon, T Ma - arXiv preprint arXiv:2110.05025, 2021 - arxiv.org
Self-supervised learning (SSL) is a scalable way to learn general visual representations
since it learns without labels. However, large-scale unlabeled datasets in the wild often have …

Audio-visual class-incremental learning

W Pian, S Mo, Y Guo, Y Tian - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
In this paper, we introduce audio-visual class-incremental learning, a class-incremental
learning scenario for audio-visual video recognition. We demonstrate that joint audio-visual …

On the effectiveness of lipschitz-driven rehearsal in continual learning

L Bonicelli, M Boschini, A Porrello… - Advances in …, 2022 - proceedings.neurips.cc
Rehearsal approaches enjoy immense popularity with Continual Learning (CL)
practitioners. These methods collect samples from previously encountered data distributions …
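The basic rehearsal mechanism the abstract refers to can be sketched as a fixed-size memory filled by reservoir sampling, which keeps an approximately uniform sample of everything seen so far. Note this is the generic replay buffer such methods build on, not the Lipschitz-driven selection the paper itself studies:

```python
import random

class ReservoirBuffer:
    """Fixed-size rehearsal memory filled by reservoir sampling."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)      # buffer not full: always keep
        else:
            idx = random.randrange(self.seen)
            if idx < self.capacity:        # keep with prob capacity / seen
                self.data[idx] = example

    def sample(self, k: int):
        return random.sample(self.data, min(k, len(self.data)))
```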

Continual pre-training mitigates forgetting in language and vision

A Cossu, A Carta, L Passaro, V Lomonaco… - Neural Networks, 2024 - Elsevier
Pre-trained models are commonly used in Continual Learning to initialize the model before
training on the stream of non-stationary data. However, pre-training is rarely applied during …

Continual Pre-Training of Large Language Models: How to (re)warm your model?

K Gupta, B Thérien, A Ibrahim, ML Richter… - arXiv preprint arXiv …, 2023 - arxiv.org
Large language models (LLMs) are routinely pre-trained on billions of tokens, only to restart
the process over again once new data becomes available. A much cheaper and more …
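The "(re)warming" in the title concerns the learning-rate schedule when pre-training resumes on new data rather than restarting from scratch. A hand-rolled illustrative schedule (linear re-warmup followed by cosine decay; every constant below is an assumption, not the paper's setting):

```python
import math

def rewarmed_lr(step, total_steps, max_lr=3e-4, min_lr=3e-5, warmup=1000):
    """Illustrative schedule for resuming pre-training on new data:
    re-warm the learning rate linearly from near zero back to max_lr,
    then decay with a cosine down to min_lr."""
    if step < warmup:
        return max_lr * step / warmup
    progress = (step - warmup) / max(1, total_steps - warmup)
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * progress))
```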

A comprehensive empirical evaluation on online continual learning

A Soutif-Cormerais, A Carta, A Cossu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Online continual learning aims to get closer to a live learning experience by learning directly
on a stream of data with temporally shifting distribution and by storing a minimum amount of …
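A minimal sketch of the single-pass protocol such evaluations target: every stream batch is trained on exactly once, mixed with a few examples from a small rehearsal memory (reusing the ReservoirBuffer sketched above). All the stand-in components here are illustrative assumptions:

```python
import torch
import torch.nn as nn

model = nn.Linear(32, 10)                  # stand-in model
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
buffer = ReservoirBuffer(capacity=500)     # from the sketch above
stream = [(torch.randn(16, 32), torch.randint(0, 10, (16,)))
          for _ in range(100)]             # synthetic stand-in stream

for x, y in stream:                        # each batch is seen exactly once
    for i in range(x.size(0)):             # store incoming examples first
        buffer.add((x[i], y[i]))
    replay = buffer.sample(32)             # rehearse a few stored examples
    rx = torch.stack([ex[0] for ex in replay])
    ry = torch.stack([ex[1] for ex in replay])
    opt.zero_grad()
    loss = loss_fn(model(torch.cat([x, rx])), torch.cat([y, ry]))
    loss.backward()
    opt.step()
```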