Bayesian statistics and modelling

R Van de Schoot, S Depaoli, R King… - Nature Reviews …, 2021 - nature.com
Bayesian statistics is an approach to data analysis based on Bayes' theorem, where
available knowledge about parameters in a statistical model is updated with the information …
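
The update the snippet alludes to is Bayes' theorem; for a parameter θ and observed data y it can be written as follows (a standard statement, not a quotation from the paper):

```latex
% Posterior = likelihood x prior, normalized by the marginal likelihood (evidence).
p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{\int p(y \mid \theta')\, p(\theta')\, \mathrm{d}\theta'}
```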

Priors in Bayesian deep learning: A review

V Fortuin - International Statistical Review, 2022 - Wiley Online Library
While the choice of prior is one of the most critical parts of the Bayesian inference workflow,
recent Bayesian deep learning models have often fallen back on vague priors, such as …
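
As a concrete illustration of the "vague prior" default the review discusses, here is a minimal NumPy sketch of the log-density of an isotropic Gaussian prior over a flattened weight vector; the function name and defaults are mine, not from the paper.

```python
import numpy as np

def log_isotropic_gaussian_prior(weights, sigma=1.0):
    """Log-density of an isotropic Gaussian prior N(0, sigma^2 I) over a
    flattened parameter vector -- the kind of generic default prior often
    placed on deep network weights."""
    w = np.ravel(weights)
    d = w.size
    return -0.5 * (d * np.log(2.0 * np.pi * sigma**2) + np.dot(w, w) / sigma**2)

# With a large sigma the prior barely distinguishes between very different
# weight settings, which is what makes it "vague".
w = np.random.randn(1000) * 0.1
print(log_isotropic_gaussian_prior(w, sigma=1.0))
print(log_isotropic_gaussian_prior(w, sigma=10.0))
```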

Optimum-statistical collaboration towards general and efficient black-box optimization

W Li, C Wang, G Cheng, Q Song - Transactions on machine learning …, 2023 - par.nsf.gov
In this paper, we make the key delineation on the roles of resolution and statistical
uncertainty in hierarchical bandits-based black-box optimization algorithms, guiding a more …
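
The paper concerns hierarchical bandit-based black-box optimizers; as a much simpler stand-in (not the authors' algorithm), the sketch below runs flat UCB1 over a fixed grid, where the grid spacing plays the role of resolution error and the confidence radius plays the role of statistical uncertainty. All names are hypothetical.

```python
import numpy as np

def ucb1_blackbox(f, low, high, n_arms=20, budget=200, noise=0.1, seed=0):
    """Flat UCB1 over a fixed grid: a toy stand-in for hierarchical
    bandit-based black-box optimization."""
    rng = np.random.default_rng(seed)
    arms = np.linspace(low, high, n_arms)          # fixed-resolution discretization
    counts = np.zeros(n_arms)
    sums = np.zeros(n_arms)
    for t in range(1, budget + 1):
        means = np.where(counts > 0, sums / np.maximum(counts, 1), np.inf)
        bonus = np.sqrt(2.0 * np.log(t) / np.maximum(counts, 1))
        arm = int(np.argmax(means + bonus))        # optimism in the face of uncertainty
        reward = f(arms[arm]) + noise * rng.standard_normal()
        counts[arm] += 1
        sums[arm] += reward
    best = int(np.argmax(sums / np.maximum(counts, 1)))
    return arms[best]

# Toy objective with a maximum at x = 0.3
print(ucb1_blackbox(lambda x: -(x - 0.3) ** 2, low=0.0, high=1.0))
```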

What can transformers learn in-context? a case study of simple function classes

S Garg, D Tsipras, PS Liang… - Advances in Neural …, 2022 - proceedings.neurips.cc
In-context learning is the ability of a model to condition on a prompt sequence consisting of
in-context examples (input-output pairs corresponding to some task) along with a new query …
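
A minimal sketch of the setup described: build a prompt of (x, w·x) pairs drawn from a random linear function plus a held-out query, with ordinary least squares as the reference an ideal in-context learner should match on noiseless linear data. No transformer is trained here, and the helper names are mine.

```python
import numpy as np

def make_incontext_prompt(n_context=10, dim=5, seed=0):
    """Build one in-context learning instance for a random linear function:
    context pairs (x_i, w.x_i) followed by a query x_q whose label is held out."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(dim)                    # the hidden task
    xs = rng.standard_normal((n_context + 1, dim))  # last row is the query
    ys = xs @ w
    context = list(zip(xs[:-1], ys[:-1]))
    return context, xs[-1], ys[-1]

def least_squares_baseline(context, query_x):
    """What an ideal in-context learner should recover for noiseless linear data."""
    X = np.stack([x for x, _ in context])
    y = np.array([y for _, y in context])
    w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w_hat @ query_x

ctx, xq, yq = make_incontext_prompt()
print(least_squares_baseline(ctx, xq), "vs true", yq)
```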

Diffusion with forward models: Solving stochastic inverse problems without direct supervision

A Tewari, T Yin, G Cazenavette… - Advances in …, 2023 - proceedings.neurips.cc
Denoising diffusion models are a powerful class of generative models used to capture
complex distributions of real-world signals. However, their applicability is limited to …
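
For context, a minimal NumPy sketch of the standard DDPM forward (noising) process and its closed-form marginal q(x_t | x_0); the paper's specific construction, which addresses inverse problems without direct supervision, is not reproduced here. Names and the noise schedule are mine.

```python
import numpy as np

def diffuse(x0, t, betas, rng):
    """Sample x_t ~ q(x_t | x_0) for a standard DDPM forward process:
    x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps, eps ~ N(0, I)."""
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)[t]
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps, eps

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 1000)     # common linear noise schedule
x0 = rng.standard_normal(16)              # stand-in for a real-world signal
xt, eps = diffuse(x0, t=500, betas=betas, rng=rng)
print(xt[:4])
```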

Implicit neural representations with periodic activation functions

V Sitzmann, J Martel, A Bergman… - Advances in neural …, 2020 - proceedings.neurips.cc
Implicitly defined, continuous, differentiable signal representations parameterized by neural
networks have emerged as a powerful paradigm, offering many possible benefits over …
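
A minimal NumPy sketch of the idea: an MLP with sine activations, using the initialization scales commonly described for such networks, mapping coordinates to signal values. This is an untrained forward pass only, and the helper names are mine.

```python
import numpy as np

def init_siren(sizes, omega0=30.0, rng=None):
    """Sine-network initialization: first layer U(-1/fan_in, 1/fan_in),
    later layers U(-sqrt(6/fan_in)/omega0, sqrt(6/fan_in)/omega0)."""
    rng = rng or np.random.default_rng(0)
    layers = []
    for i, (fan_in, fan_out) in enumerate(zip(sizes[:-1], sizes[1:])):
        bound = 1.0 / fan_in if i == 0 else np.sqrt(6.0 / fan_in) / omega0
        W = rng.uniform(-bound, bound, size=(fan_in, fan_out))
        layers.append((W, np.zeros(fan_out)))
    return layers

def siren_forward(coords, layers, omega0=30.0):
    """Continuous signal representation: coordinates -> signal values through
    sine activations, so derivatives of the output are well defined."""
    h = coords
    for W, b in layers[:-1]:
        h = np.sin(omega0 * (h @ W + b))
    W, b = layers[-1]
    return h @ W + b                      # linear output layer

coords = np.linspace(-1, 1, 5).reshape(-1, 1)   # 1D coordinates in [-1, 1]
layers = init_siren([1, 64, 64, 1])
print(siren_forward(coords, layers).ravel())
```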

From data to functa: Your data point is a function and you can treat it like one

E Dupont, H Kim, SM Eslami, D Rezende… - arXiv preprint arXiv …, 2022 - arxiv.org
It is common practice in deep learning to represent a measurement of the world on a
discrete grid, e.g. a 2D grid of pixels. However, the underlying signal represented by these …
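
The paper fits a neural field per data point; as a lightweight stand-in (not the authors' method), the sketch below fits a linear model on random Fourier features of the coordinates, after which the "data point" can be queried as a continuous function off the original grid. All names are mine.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image": a 1D signal sampled on a discrete grid of 32 pixels.
grid = np.linspace(0.0, 1.0, 32).reshape(-1, 1)
pixels = np.sin(2 * np.pi * 3 * grid).ravel() + 0.05 * rng.standard_normal(32)

# Random Fourier features give a fixed feature map phi(x), so fitting weights
# w turns the discrete measurement into a continuous function f(x) = phi(x) @ w.
B = rng.standard_normal((1, 64)) * 10.0
phi = lambda x: np.concatenate([np.sin(x @ B), np.cos(x @ B)], axis=1)

w, *_ = np.linalg.lstsq(phi(grid), pixels, rcond=None)
f = lambda x: phi(np.atleast_2d(x)) @ w   # the data point, now usable as a function

# Query the signal *between* the original grid points.
print(f(np.array([[0.015], [0.493]])))
```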

Transformers can do Bayesian inference

S Müller, N Hollmann, SP Arango, J Grabocka… - arXiv preprint arXiv …, 2021 - arxiv.org
Currently, it is hard to reap the benefits of deep learning for Bayesian methods, which allow
the explicit specification of prior knowledge and accurately capture model uncertainty. We …
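
A sketch of the prior-data generation step such an approach relies on, for a toy 1D Gaussian-mean prior where the exact posterior predictive is available in closed form as the target a prior-fitted network would be trained to reproduce in-context. The transformer itself is omitted, and all names are mine.

```python
import numpy as np

def sample_prior_task(n_context=8, sigma=1.0, tau=2.0, rng=None):
    """Draw one synthetic task from the prior: mu ~ N(0, tau^2), observations
    y_i ~ N(mu, sigma^2). Returns the context plus the exact posterior
    predictive N(m, s^2) for a new observation."""
    rng = rng or np.random.default_rng()
    mu = tau * rng.standard_normal()
    y = mu + sigma * rng.standard_normal(n_context)
    # Conjugate Gaussian update for the mean, then the predictive variance.
    post_var = 1.0 / (1.0 / tau**2 + n_context / sigma**2)
    post_mean = post_var * y.sum() / sigma**2
    pred_var = post_var + sigma**2
    return y, post_mean, pred_var

rng = np.random.default_rng(0)
context, m, s2 = sample_prior_task(rng=rng)
print("context:", np.round(context, 2), "posterior predictive:", (round(m, 3), round(s2, 3)))
```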

Card: Classification and regression diffusion models

X Han, H Zheng, M Zhou - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Learning the distribution of a continuous or categorical response variable y given its
covariates x is a fundamental problem in statistics and machine learning. Deep neural …
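
As a hedged illustration of diffusion over a response variable (a sketch of my understanding, not necessarily the paper's exact parameterization): noise a scalar y toward a point prediction f(x), so that a learned reverse process would map noise back to samples from p(y | x). The schedule and names are mine.

```python
import numpy as np

def forward_diffuse_response(y, f_x, t, betas, rng):
    """Noise a scalar response toward a point prediction f(x) rather than zero:
    y_t = sqrt(ab_t)*y + (1 - sqrt(ab_t))*f_x + sqrt(1 - ab_t)*eps.
    A learned reverse process would denoise back to a draw from p(y | x)."""
    alpha_bar = np.cumprod(1.0 - betas)[t]
    eps = rng.standard_normal()
    return np.sqrt(alpha_bar) * y + (1.0 - np.sqrt(alpha_bar)) * f_x \
        + np.sqrt(1.0 - alpha_bar) * eps

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 1000)
x = 0.7
f_x = 2.0 * x                 # stand-in for a pre-trained conditional-mean estimator
y = 2.0 * x + 0.1 * rng.standard_normal()
print([forward_diffuse_response(y, f_x, t, betas, rng) for t in (0, 500, 999)])
```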

Set transformer: A framework for attention-based permutation-invariant neural networks

J Lee, Y Lee, J Kim, A Kosiorek… - … on machine learning, 2019 - proceedings.mlr.press
Many machine learning tasks such as multiple instance learning, 3D shape recognition, and
few-shot image classification are defined on sets of instances. Since solutions to such …
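
A minimal NumPy sketch of attention-based pooling over a set: a learned seed vector attends over the instances, giving a summary that is invariant to permuting the set. Names and shapes are mine, and this is a single-head, single-seed simplification of the blocks the paper proposes.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(X, seed, Wq, Wk, Wv):
    """Pool a set X (n x d) into one vector with a single attention head whose
    query is a learned seed; the output is identical under any permutation of
    the rows of X, the property set-level architectures need."""
    Q = seed @ Wq                                  # (1, d)
    K, V = X @ Wk, X @ Wv                          # (n, d)
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))    # (1, n) attention over elements
    return A @ V                                   # (1, d) permutation-invariant summary

rng = np.random.default_rng(0)
d = 8
X = rng.standard_normal((5, d))                    # a set of 5 instances
seed = rng.standard_normal((1, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

out = attention_pool(X, seed, Wq, Wk, Wv)
out_perm = attention_pool(X[::-1], seed, Wq, Wk, Wv)
print(np.allclose(out, out_perm))                  # True: set order does not matter
```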