Explainable AI in medical imaging: An overview for clinical practitioners – Beyond saliency-based XAI approaches
Driven by recent advances in Artificial Intelligence (AI) and Computer Vision (CV), the implementation of AI systems in the medical domain increased correspondingly. This is …
Priors in Bayesian deep learning: A review
V Fortuin - International Statistical Review, 2022 - Wiley Online Library
While the choice of prior is one of the most critical parts of the Bayesian inference workflow, recent Bayesian deep learning models have often fallen back on vague priors, such as …
Three types of incremental learning
Incrementally learning new information from a non-stationary stream of data, referred to as 'continual learning', is a key feature of natural intelligence, but a challenging problem for …
A survey of uncertainty in deep neural networks
Over the last decade, neural networks have reached almost every field of science and become a crucial part of various real world applications. Due to the increasing spread …
Laplace redux – effortless Bayesian deep learning
Bayesian formulations of deep learning have been shown to have compelling theoretical properties and offer practical functional benefits, such as improved predictive uncertainty …
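The Laplace approximation this entry refers to fits a Gaussian to the weight posterior after training: centred at the trained weights, with covariance given by an approximate inverse curvature. As a purely illustrative sketch of that idea (the helper name and the use of squared batch gradients as a crude diagonal curvature proxy are my assumptions; the paper itself builds on proper GGN/K-FAC curvature via its accompanying library), a post-hoc diagonal version in PyTorch might look like:

import torch

def diagonal_laplace_variances(model, loss_fn, data_loader, prior_precision=1.0):
    # Rough diagonal Laplace sketch: Gaussian posterior centred at the trained
    # weights, with per-weight precision = prior precision + accumulated squared
    # gradients (a crude stand-in for the diagonal GGN/Fisher curvature).
    params = [p for p in model.parameters() if p.requires_grad]
    precision = [torch.full_like(p, prior_precision) for p in params]
    for x, y in data_loader:
        loss = loss_fn(model(x), y)
        grads = torch.autograd.grad(loss, params)
        for h, g in zip(precision, grads):
            h += g.detach() ** 2
    return [1.0 / h for h in precision]  # diagonal posterior variances

Sampling weights from a Gaussian with these variances around the trained values and averaging the resulting predictions is one simple way to obtain the improved predictive uncertainty the snippet mentions.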
Simple and principled uncertainty estimation with deterministic deep learning via distance awareness
Bayesian neural networks (BNN) and deep ensembles are principled approaches to estimate the predictive uncertainty of a deep learning model. However, their practicality in …
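For context on the deep-ensemble baseline the snippet names (this is not the single-model, distance-aware method the paper itself proposes, and the helper name is mine), a minimal sketch of how an ensemble of independently trained regression networks yields a predictive mean and an uncertainty estimate:

import torch

def ensemble_mean_and_variance(models, x):
    # Plain deep ensemble: run every independently trained member on the same
    # inputs, average the outputs for the prediction, and use the spread across
    # members as the uncertainty estimate.
    with torch.no_grad():
        preds = torch.stack([m(x) for m in models])  # (num_members, batch, out_dim)
    return preds.mean(dim=0), preds.var(dim=0)

Keeping several trained networks around for this disagreement signal is exactly the practicality cost that motivates deterministic, single-pass alternatives such as the one in this entry.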
Improving predictions of Bayesian neural nets via local linearization
The generalized Gauss-Newton (GGN) approximation is often used to make practical Bayesian deep learning approaches scalable by replacing a second order …
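To make the truncated snippet concrete (the notation is my assumption): with MAP weights \theta^*, per-input Jacobian J_{\theta^*}(x) of the network outputs with respect to the parameters, and \Lambda(x) the Hessian of the loss with respect to those outputs, the GGN replaces the weight-space Hessian, and 'local linearization' means predicting with the first-order expansion of the network around \theta^*:

H \;\approx\; H_{\mathrm{GGN}} = \sum_{n} J_{\theta^*}(x_n)^{\top}\, \Lambda(x_n)\, J_{\theta^*}(x_n),
\qquad
f_{\mathrm{lin}}(x; \theta) = f(x; \theta^*) + J_{\theta^*}(x)\, (\theta - \theta^*).

The argument suggested by the title is that once the GGN has implicitly linearized the model in this way, predictions should be made with f_{\mathrm{lin}} rather than with the original network.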
Scalable marginal likelihood estimation for model selection in deep learning
Marginal-likelihood based model-selection, even though promising, is rarely used in deep learning due to estimation difficulties. Instead, most approaches rely on validation data …
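The quantity at stake is the log marginal likelihood (evidence) of a model M, together with its Laplace approximation around the MAP weights \theta^* (P is the number of parameters, H the Hessian of the negative log joint at \theta^*; the notation is my assumption, since the snippet is truncated):

\log p(\mathcal{D} \mid M) = \log \int p(\mathcal{D} \mid \theta, M)\, p(\theta \mid M)\, d\theta
\;\approx\; \log p(\mathcal{D} \mid \theta^*, M) + \log p(\theta^* \mid M) + \tfrac{P}{2}\log 2\pi - \tfrac{1}{2}\log\det H.

Model selection then compares this estimate across candidate models or hyperparameters instead of relying on held-out validation data, which is the alternative the snippet contrasts it with.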
Bayesian deep ensembles via the neural tangent kernel
B He, B Lakshminarayanan… - Advances in neural …, 2020 - proceedings.neurips.cc
We explore the link between deep ensembles and Gaussian processes (GPs) through the lens of the Neural Tangent Kernel (NTK): a recent development in understanding the …
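The GP side of the link is, for squared-error training, ordinary GP regression with the NTK \Theta as kernel. Writing X, y for the training data and x_* for a test input, its noiseless posterior is (standard GP formulas, stated only to make the 'link' concrete; as I understand it, the paper's contribution is to modify vanilla ensembles so that they match such an NTK-GP posterior, which plain ensembles do not):

\bar f(x_*) = \Theta(x_*, X)\, \Theta(X, X)^{-1} y,
\qquad
\operatorname{cov}(x_*, x_*') = \Theta(x_*, x_*') - \Theta(x_*, X)\, \Theta(X, X)^{-1}\, \Theta(X, x_*').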
Fast finite width neural tangent kernel
R Novak, J Sohl-Dickstein… - … on Machine Learning, 2022 - proceedings.mlr.press
The Neural Tangent Kernel (NTK), defined as the outer product of the neural network (NN) Jacobians, has emerged as a central object of study in deep learning. In the …
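Since the snippet defines the empirical NTK as the outer product of network Jacobians, here is a small, deliberately naive PyTorch sketch of that definition (scalar-output model assumed, helper name mine; computing this object much faster than such a loop is precisely the paper's topic):

import torch

def empirical_ntk(model, x1, x2):
    # Naive finite-width NTK: Theta[i, j] = <J f(x1_i), J f(x2_j)>, where J is
    # the Jacobian of the scalar network output w.r.t. all trainable parameters.
    params = [p for p in model.parameters() if p.requires_grad]
    def jacobian_rows(x):
        rows = []
        for xi in x:
            out = model(xi.unsqueeze(0)).squeeze()
            grads = torch.autograd.grad(out, params)
            rows.append(torch.cat([g.reshape(-1) for g in grads]))
        return torch.stack(rows)  # (num_inputs, num_params)
    return jacobian_rows(x1) @ jacobian_rows(x2).T

net = torch.nn.Sequential(torch.nn.Linear(3, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
x = torch.randn(5, 3)
theta = empirical_ntk(net, x, x)  # (5, 5) kernel matrix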