A review of uncertainty quantification in deep learning: Techniques, applications and challenges

M Abdar, F Pourpanah, S Hussain, D Rezazadegan… - Information Fusion, 2021 - Elsevier
Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of
uncertainties during both optimization and decision making processes. They have been …

Priors in Bayesian deep learning: A review

V Fortuin - International Statistical Review, 2022 - Wiley Online Library
While the choice of prior is one of the most critical parts of the Bayesian inference workflow,
recent Bayesian deep learning models have often fallen back on vague priors, such as …

Transformer neural processes: Uncertainty-aware meta learning via sequence modeling

T Nguyen, A Grover - arXiv preprint arXiv:2207.04179, 2022 - arxiv.org
Neural Processes (NPs) are a popular class of approaches for meta-learning. Similar to
Gaussian Processes (GPs), NPs define distributions over functions and can estimate …

Neural diffusion processes

V Dutordoir, A Saul, Z Ghahramani… - … on Machine Learning, 2023 - proceedings.mlr.press
Neural network approaches for meta-learning distributions over functions have desirable
properties such as increased flexibility and a reduced complexity of inference. Building on …

Autoregressive conditional neural processes

WP Bruinsma, S Markou, J Requeima… - arXiv preprint arXiv …, 2023 - arxiv.org
Conditional neural processes (CNPs; Garnelo et al., 2018a) are attractive meta-learning
models which produce well-calibrated predictions and are trainable via a simple maximum …
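As a hedged illustration of the CNP idea described in this entry (a sketch, not the authors' implementation): context (x, y) pairs are embedded by an encoder, mean-pooled into a permutation-invariant representation, and decoded into a Gaussian predictive mean and standard deviation for each target input. All layer sizes and the randomly initialised weights below are placeholder assumptions standing in for trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    """Two-layer MLP with a tanh hidden activation."""
    return np.tanh(x @ w1 + b1) @ w2 + b2

# Placeholder weights; a real CNP would learn these by maximum likelihood.
d_in, d_hid, d_rep = 2, 16, 8  # (x, y) pair -> hidden -> representation
enc_w1 = rng.normal(size=(d_in, d_hid));      enc_b1 = np.zeros(d_hid)
enc_w2 = rng.normal(size=(d_hid, d_rep));     enc_b2 = np.zeros(d_rep)
dec_w1 = rng.normal(size=(d_rep + 1, d_hid)); dec_b1 = np.zeros(d_hid)
dec_w2 = rng.normal(size=(d_hid, 2));         dec_b2 = np.zeros(2)

def cnp_predict(xc, yc, xt):
    """Encode the context set, mean-pool, and decode per-target mean/std."""
    rep = mlp(np.stack([xc, yc], axis=-1), enc_w1, enc_b1, enc_w2, enc_b2)
    r = rep.mean(axis=0)  # permutation-invariant summary of the context
    inp = np.concatenate(
        [np.broadcast_to(r, (len(xt), d_rep)), xt[:, None]], axis=-1
    )
    out = mlp(inp, dec_w1, dec_b1, dec_w2, dec_b2)
    mean, log_sigma = out[:, 0], out[:, 1]
    return mean, np.exp(log_sigma)  # std is positive by construction

xc = np.linspace(-1, 1, 5)
yc = np.sin(np.pi * xc)  # toy context set drawn from one task
mean, std = cnp_predict(xc, yc, np.array([0.0, 0.5]))
```

Mean-pooling is what makes the prediction invariant to the ordering of the context points, which is the key structural property shared across the neural-process family surveyed above.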

The neural process family: Survey, applications and perspectives

S Jha, D Gong, X Wang, RE Turner, L Yao - arXiv preprint arXiv …, 2022 - arxiv.org
The standard approaches to neural network implementation yield powerful function
approximation capabilities but are limited in their abilities to learn meta representations and …

A simple yet effective strategy to robustify the meta learning paradigm

Q Wang, Y Lv, Z Xie, J Huang - Advances in Neural …, 2024 - proceedings.neurips.cc
Meta learning is a promising paradigm to enable skill transfer across tasks. Most previous
methods employ the empirical risk minimization principle in optimization. However, the …

Affective processes: stochastic modelling of temporal context for emotion and facial expression recognition

E Sanchez, MK Tellamekala… - Proceedings of the …, 2021 - openaccess.thecvf.com
Temporal context is key to the recognition of expressions of emotion. Existing methods, which
rely on recurrent or self-attention models to enforce temporal consistency, work on the …

Contrastive conditional neural processes

Z Ye, L Yao - Proceedings of the IEEE/CVF Conference on …, 2022 - openaccess.thecvf.com
Conditional Neural Processes (CNPs) bridge neural networks with probabilistic
inference to approximate functions of Stochastic Processes under meta-learning settings …

How tight can PAC-Bayes be in the small data regime?

A Foong, W Bruinsma, D Burt… - Advances in Neural …, 2021 - proceedings.neurips.cc
In this paper, we investigate the question: _Given a small number of datapoints, for example
$N = 30$, how tight can PAC-Bayes and test set bounds be made?_ For such small …
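To make the small-data regime in this entry concrete, here is a minimal sketch of a classical test-set bound (Hoeffding's inequality, a standard baseline against which PAC-Bayes bounds are compared; the specific numbers below are illustrative assumptions, not figures from the paper):

```python
import math

def hoeffding_test_set_bound(errors, n, delta=0.05):
    """Hoeffding test-set bound: with probability >= 1 - delta over the
    test sample, true error <= empirical error + sqrt(ln(1/delta) / (2n))."""
    emp = errors / n
    return emp + math.sqrt(math.log(1.0 / delta) / (2.0 * n))

# With N = 30 points and 3 observed mistakes, the slack term alone is
# about 0.22, so even a test-set bound is loose at this sample size.
bound = hoeffding_test_set_bound(errors=3, n=30)
```

The slack term shrinks only as $1/\sqrt{n}$, which is why tightness at $N = 30$ is a genuinely hard question and motivates the paper's comparison of PAC-Bayes and test-set bounds.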