Wide neural networks forget less catastrophically

SI Mirzadeh, A Chaudhry, D Yin, H Hu… - International …, 2022 - proceedings.mlr.press
A primary focus area in continual learning research is alleviating the "catastrophic forgetting"
problem in neural networks by designing new algorithms that are more robust to the …

Architecture matters in continual learning

SI Mirzadeh, A Chaudhry, D Yin, T Nguyen… - arXiv preprint arXiv …, 2022 - arxiv.org
A large body of research in continual learning is devoted to overcoming the catastrophic
forgetting of neural networks by designing new algorithms that are robust to the distribution …

Efficient parametric approximations of neural network function space distance

N Dhawan, S Huang, J Bae… - … Conference on Machine …, 2023 - proceedings.mlr.press
It is often useful to compactly summarize important properties of model parameters and
training data so that they can be used later without storing and/or iterating over the entire …

One Pass ImageNet

H Hu, A Li, D Calandriello, D Gorur - arXiv preprint arXiv:2111.01956, 2021 - arxiv.org
We present the One Pass ImageNet (OPIN) problem, which aims to study the effectiveness
of deep learning in a streaming setting. ImageNet is a widely known benchmark dataset that …

Continual learning beyond a single model

T Doan, SI Mirzadeh… - Conference on Lifelong …, 2023 - proceedings.mlr.press
A growing body of research in continual learning focuses on the catastrophic forgetting
problem. While many attempts have been made to alleviate this problem, the majority of the …

TAME: Task Agnostic Continual Learning using Multiple Experts

H Zhu, M Majzoubi, A Jain… - Proceedings of the …, 2024 - openaccess.thecvf.com
The goal of lifelong learning is to continuously learn from non-stationary distributions where
the non-stationarity is typically imposed by a sequence of distinct tasks. Prior works have …

Continual Learning with Neuromorphic Computing: Theories, Methods, and Applications

MF Minhas, RVW Putra, F Awwad, O Hasan… - arXiv preprint arXiv …, 2024 - arxiv.org
To adapt to real-world dynamics, intelligent systems need to assimilate new knowledge
without catastrophic forgetting, where learning new tasks leads to a degradation in …

Simulating task-free continual learning streams from existing datasets

A Chrysakis, MF Moens - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Task-free continual learning is the subfield of machine learning that focuses on learning
online from a stream whose distribution changes continuously over time. In contrast …

Continual learning on 3D point clouds with random compressed rehearsal

M Zamorski, M Stypułkowski, K Karanowski… - Computer Vision and …, 2023 - Elsevier
Contemporary deep neural networks offer state-of-the-art results when applied to visual
reasoning, e.g., in the context of 3D point cloud data. Point clouds are an important data type …

GMM-IL: Image classification using incrementally learnt, independent probabilistic models for small sample sizes

P Johnston, K Nogueira, K Swingler - IEEE Access, 2023 - ieeexplore.ieee.org
When deep-learning classifiers try to learn new classes through supervised learning, they
exhibit catastrophic forgetting issues. In this paper we propose the Gaussian Mixture Model …