A review of uncertainty quantification in deep learning: Techniques, applications and challenges

M Abdar, F Pourpanah, S Hussain, D Rezazadegan… - Information Fusion, 2021 - Elsevier
Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of
uncertainties during both optimization and decision-making processes. They have been …

Priors in Bayesian deep learning: A review

V Fortuin - International Statistical Review, 2022 - Wiley Online Library
While the choice of prior is one of the most critical parts of the Bayesian inference workflow,
recent Bayesian deep learning models have often fallen back on vague priors, such as …

Efficient continual learning with modular networks and task-driven priors

T Veniat, L Denoyer, MA Ranzato - arXiv preprint arXiv:2012.12631, 2020 - arxiv.org
Existing literature in Continual Learning (CL) has focused on overcoming catastrophic
forgetting, the inability of the learner to recall how to perform tasks observed in the past …

Adaptive compositional continual meta-learning

B Wu, J Fang, X Zeng, S Liang… - … on Machine Learning, 2023 - proceedings.mlr.press
This paper focuses on continual meta-learning, where few-shot tasks are heterogeneous
and sequentially available. Recent works use a mixture model for meta-knowledge to deal …

Dangers of Bayesian model averaging under covariate shift

P Izmailov, P Nicholson, S Lotfi… - Advances in Neural …, 2021 - proceedings.neurips.cc
Approximate Bayesian inference for neural networks is considered a robust alternative to
standard training, often providing good performance on out-of-distribution data. However …

Same state, different task: Continual reinforcement learning without interference

S Kessler, J Parker-Holder, P Ball, S Zohren… - Proceedings of the …, 2022 - ojs.aaai.org
Continual Learning (CL) considers the problem of training an agent sequentially on a set of
tasks while seeking to retain performance on all previous tasks. A key challenge in CL is …

Continual learning using a Bayesian nonparametric dictionary of weight factors

N Mehta, K Liang, VK Verma… - … Conference on Artificial …, 2021 - proceedings.mlr.press
Naively trained neural networks tend to experience catastrophic forgetting in sequential task
settings, where data from previous tasks are unavailable. A number of methods, using …

Online continual learning through unsupervised mutual information maximization

H Hihn, DA Braun - Neurocomputing, 2024 - Elsevier
Catastrophic forgetting remains a challenge for artificial learning systems, especially in the
case of online learning, where task information is unavailable. This work proposes a novel …

A continual learning framework for uncertainty-aware interactive image segmentation

E Zheng, Q Yu, R Li, P Shi, A Haake - Proceedings of the AAAI …, 2021 - ojs.aaai.org
Deep learning models have achieved state-of-the-art performance in semantic image
segmentation, but the results provided by fully automatic algorithms are not always …

Hierarchically structured task-agnostic continual learning

H Hihn, DA Braun - Machine Learning, 2023 - Springer
One notable weakness of current machine learning algorithms is the poor ability of models
to solve new problems without forgetting previously acquired knowledge. The Continual …