Wide neural networks forget less catastrophically
A primary focus area in continual learning research is alleviating the "catastrophic forgetting"
problem in neural networks by designing new algorithms that are more robust to the …
Architecture matters in continual learning
A large body of research in continual learning is devoted to overcoming the catastrophic
forgetting of neural networks by designing new algorithms that are robust to the distribution …
Efficient parametric approximations of neural network function space distance
It is often useful to compactly summarize important properties of model parameters and
training data so that they can be used later without storing and/or iterating over the entire …
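The snippet is truncated, but "function space distance" is commonly measured as the average output discrepancy between two networks over a dataset. The Python sketch below illustrates that reading; the function name, the squared-error metric, and the data-loader interface are assumptions, not the paper's exact formulation.

import torch

def function_space_distance(model_a, model_b, data_loader, device="cpu"):
    # Estimate E_x[ ||f_a(x) - f_b(x)||^2 ] by averaging over the dataset.
    model_a.eval(); model_b.eval()
    total, count = 0.0, 0
    with torch.no_grad():
        for x, _ in data_loader:
            x = x.to(device)
            diff = model_a(x) - model_b(x)          # output-space gap per example
            total += diff.flatten(1).pow(2).sum().item()
            count += x.size(0)
    return total / count

The paper's "parametric approximation" presumably exists to avoid exactly this kind of full pass over the data; the exhaustive loop is shown only to pin down the quantity being approximated.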
One Pass ImageNet
We present the One Pass ImageNet (OPIN) problem, which aims to study the effectiveness
of deep learning in a streaming setting. ImageNet is a widely known benchmark dataset that …
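As a concrete reading of the streaming constraint: every example is visited exactly once (a single epoch over the data). The sketch below pairs that one-pass loop with a small reservoir-sampled replay buffer; the buffer, its sizes, and all names are illustrative assumptions, not the OPIN baseline.

import random

def train_one_pass(model, stream, optimizer, loss_fn, buffer_size=200, replay_k=2):
    # Single pass: each incoming mini-batch is trained on once, together with
    # a few mini-batches replayed from a small reservoir buffer.
    buffer, seen = [], 0
    for x, y in stream:                              # each mini-batch arrives exactly once
        replays = random.sample(buffer, min(replay_k, len(buffer)))
        for bx, by in [(x, y)] + replays:
            optimizer.zero_grad()
            loss_fn(model(bx), by).backward()
            optimizer.step()
        seen += 1
        if len(buffer) < buffer_size:                # reservoir sampling keeps a
            buffer.append((x, y))                    # uniform sample of the stream
        elif random.random() < buffer_size / seen:
            buffer[random.randrange(buffer_size)] = (x, y)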
Continual learning beyond a single model
A growing body of research in continual learning focuses on the catastrophic forgetting
problem. While many attempts have been made to alleviate this problem, the majority of the …
TAME: Task Agnostic Continual Learning using Multiple Experts
The goal of lifelong learning is to continuously learn from non-stationary distributions where
the non-stationarity is typically imposed by a sequence of distinct tasks. Prior works have …
Continual Learning with Neuromorphic Computing: Theories, Methods, and Applications
To adapt to real-world dynamics, intelligent systems need to assimilate new knowledge
without catastrophic forgetting, where learning new tasks leads to a degradation in …
Simulating task-free continual learning streams from existing datasets
A Chrysakis, MF Moens - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Task-free continual learning is the subfield of machine learning that focuses on learning
online from a stream whose distribution changes continuously over time. In contrast …
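One simple way to turn a static labeled dataset into such a task-free stream is to let class-sampling probabilities drift smoothly over time, so the distribution changes continuously rather than at discrete task boundaries. The Gaussian activation curves in the sketch below are an illustrative schedule, not necessarily the one proposed in the paper.

import numpy as np

def simulate_stream(data_by_class, length, width=0.2, rng=None):
    # Yield (sample, label) pairs whose class distribution drifts continuously:
    # each class's sampling weight follows a Gaussian bump centred at a
    # different point along the stream.
    rng = rng or np.random.default_rng()
    classes = sorted(data_by_class)
    centers = np.linspace(0.0, 1.0, len(classes))    # when each class peaks
    for t in np.linspace(0.0, 1.0, length):
        weights = np.exp(-((t - centers) ** 2) / (2 * width ** 2))
        c = classes[rng.choice(len(classes), p=weights / weights.sum())]
        yield data_by_class[c][rng.integers(len(data_by_class[c]))], c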
Continual learning on 3D point clouds with random compressed rehearsal
Contemporary deep neural networks offer state-of-the-art results when applied to visual
reasoning, e.g., in the context of 3D point cloud data. Point clouds are an important data type …
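Reading "random compressed rehearsal" literally: a point cloud can be compressed for the replay buffer simply by keeping a random subset of its points. That interpretation, along with the keep ratio and buffer policy below, is an assumption based on the truncated snippet.

import numpy as np

def compress_point_cloud(points, keep_ratio=0.1, rng=None):
    # Randomly subsample an (N, 3) point cloud, cutting rehearsal memory ~10x.
    rng = rng or np.random.default_rng()
    n_keep = max(1, int(len(points) * keep_ratio))
    idx = rng.choice(len(points), size=n_keep, replace=False)
    return points[idx]

class RehearsalBuffer:
    # Fixed-capacity store of compressed past samples for replay.
    def __init__(self, capacity=500):
        self.capacity, self.items = capacity, []
    def add(self, points, label, rng=None):
        if len(self.items) < self.capacity:
            self.items.append((compress_point_cloud(points, rng=rng), label))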
GMM-IL: Image classification using incrementally learnt, independent probabilistic models for small sample sizes
When deep-learning classifiers try to learn new classes through supervised learning, they
exhibit catastrophic forgetting issues. In this paper we propose the Gaussian Mixture Model …
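The snippet names the key idea: one independent Gaussian mixture per class, so a new class can be added by fitting one new model while existing classes stay untouched, sidestepping the forgetting that joint retraining causes. The sketch below illustrates that scheme on pre-extracted feature vectors; the component count, scoring rule, and class interface are assumptions.

import numpy as np
from sklearn.mixture import GaussianMixture

class GMMIncrementalClassifier:
    def __init__(self, n_components=3):
        self.n_components = n_components
        self.models = {}                             # class label -> fitted GMM

    def add_class(self, label, features):
        # Only the new class's mixture is fit; existing models are untouched.
        gmm = GaussianMixture(n_components=self.n_components)
        gmm.fit(features)
        self.models[label] = gmm

    def predict(self, features):
        # Assign each sample to the class whose mixture gives it the highest
        # log-likelihood.
        labels = list(self.models)
        scores = np.stack([self.models[c].score_samples(features) for c in labels])
        return [labels[i] for i in scores.argmax(axis=0)]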