A review of uncertainty quantification in deep learning: Techniques, applications and challenges

M Abdar, F Pourpanah, S Hussain, D Rezazadegan… - Information Fusion, 2021 - Elsevier
Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of
uncertainties during both optimization and decision making processes. They have been …

Priors in Bayesian deep learning: A review

V Fortuin - International Statistical Review, 2022 - Wiley Online Library
While the choice of prior is one of the most critical parts of the Bayesian inference workflow,
recent Bayesian deep learning models have often fallen back on vague priors, such as …

Transformers can do Bayesian inference

S Müller, N Hollmann, SP Arango, J Grabocka… - arXiv preprint arXiv …, 2021 - arxiv.org
Currently, it is hard to reap the benefits of deep learning for Bayesian methods, which allow
the explicit specification of prior knowledge and accurately capture model uncertainty. We …

Transformer neural processes: Uncertainty-aware meta learning via sequence modeling

T Nguyen, A Grover - arXiv preprint arXiv:2207.04179, 2022 - arxiv.org
Neural Processes (NPs) are a popular class of approaches for meta-learning. Similar to
Gaussian Processes (GPs), NPs define distributions over functions and can estimate …

Neural diffusion processes

V Dutordoir, A Saul, Z Ghahramani… - … on Machine Learning, 2023 - proceedings.mlr.press
Neural network approaches for meta-learning distributions over functions have desirable
properties such as increased flexibility and a reduced complexity of inference. Building on …

Learning to defer to a population: A meta-learning approach

D Tailor, A Patra, R Verma… - International …, 2024 - proceedings.mlr.press
The learning to defer (L2D) framework allows autonomous systems to be safe and robust by
allocating difficult decisions to a human expert. All existing work on L2D assumes that each …

The neural process family: Survey, applications and perspectives

S Jha, D Gong, X Wang, RE Turner, L Yao - arXiv preprint arXiv …, 2022 - arxiv.org
The standard approaches to neural network implementation yield powerful function
approximation capabilities but are limited in their abilities to learn meta representations and …

Autoregressive conditional neural processes

WP Bruinsma, S Markou, J Requeima… - arXiv preprint arXiv …, 2023 - arxiv.org
Conditional neural processes (CNPs; Garnelo et al., 2018a) are attractive meta-learning
models which produce well-calibrated predictions and are trainable via a simple maximum …

Graph neural processes for spatio-temporal extrapolation

J Hu, Y Liang, Z Fan, H Chen, Y Zheng… - Proceedings of the 29th …, 2023 - dl.acm.org
We study the task of spatio-temporal extrapolation that generates data at target locations
from surrounding contexts in a graph. This task is crucial as sensors that collect data are …

A simple yet effective strategy to robustify the meta learning paradigm

Q Wang, Y Lv, Z **e, J Huang - Advances in Neural …, 2023 - proceedings.neurips.cc
Meta learning is a promising paradigm to enable skill transfer across tasks. Most previous
methods employ the empirical risk minimization principle in optimization. However, the …