A comprehensive survey of continual learning: theory, method and application
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …
A comprehensive survey of forgetting in deep learning beyond continual learning
Forgetting refers to the loss or deterioration of previously acquired knowledge. While
existing surveys on forgetting have primarily focused on continual learning, forgetting is a …
Plex: Towards reliability using pretrained large model extensions
A recent trend in artificial intelligence is the use of pretrained models for language and
vision tasks, which have achieved extraordinary performance but also puzzling failures …
Make continual learning stronger via C-flat
A Bian, W Li, H Yuan, M Wang, Z Zhao… - Advances in …, 2025 - proceedings.neurips.cc
How to balance the learning 'sensitivity-stability' trade-off between new task training and memory
preserving is critical in CL to resolve catastrophic forgetting. Improving model generalization …
Pre-train your loss: Easy Bayesian transfer learning with informative priors
Deep learning is increasingly moving towards a transfer learning paradigm whereby large
foundation models are fine-tuned on downstream tasks, starting from an initialization …
Benchmarking Bayesian deep learning on diabetic retinopathy detection tasks
Bayesian deep learning seeks to equip deep neural networks with the ability to precisely
quantify their predictive uncertainty, and has promised to make deep learning more reliable …
Function-space regularization in neural networks: A probabilistic perspective
Parameter-space regularization in neural network optimization is a fundamental tool for
improving generalization. However, standard parameter-space regularization methods …
Tractable function-space variational inference in Bayesian neural networks
Reliable predictive uncertainty estimation plays an important role in enabling the
deployment of neural networks to safety-critical settings. A popular approach for estimating …
Should we learn most likely functions or parameters?
Standard regularized training procedures correspond to maximizing a posterior distribution
over parameters, known as maximum a posteriori (MAP) estimation. However, model …
A study of Bayesian neural network surrogates for Bayesian optimization
Bayesian optimization is a highly efficient approach to optimizing objective functions which
are expensive to query. These objectives are typically represented by Gaussian process …