Differentially Private Statistical Inference through β-Divergence One Posterior Sampling

JE Jewson, S Ghalebikesabi… - Advances in Neural …, 2023 - proceedings.neurips.cc
Differential privacy guarantees allow the results of a statistical analysis involving sensitive
data to be released without compromising the privacy of any individual taking part …

The Impact of Loss Estimation on Gibbs Measures

DT Frazier, J Knoblauch, C Drovandi - arXiv preprint arXiv:2404.15649, 2024 - arxiv.org
In recent years, the shortcomings of Bayes posteriors as inferential devices have received
increased attention. A popular strategy for fixing them has been to instead target a Gibbs …

On the meaning of uncertainty for ethical AI: philosophy and practice

C Bird, D Williamson, S Leonelli - arXiv preprint arXiv:2309.05529, 2023 - arxiv.org
Whether and how data scientists, statisticians and modellers should be accountable for the
AI systems they develop remains a controversial and highly debated topic, especially given …

Generalised Bayes Linear Inference

L Astfalck, C Bird, D Williamson - arXiv preprint arXiv:2405.14145, 2024 - arxiv.org
Motivated by big data and the vast parameter spaces in modern machine learning models,
optimisation approaches to Bayesian inference have seen a surge in popularity in recent …

Sampling from Density Power Divergence-Based Generalized Posterior Distribution via Stochastic Optimization

N Sonobe, T Momozaki, T Nakagawa - arXiv preprint arXiv:2501.07790, 2025 - arxiv.org
Robust Bayesian inference using density power divergence (DPD) has emerged as a
promising approach for handling outliers in statistical estimation. While the DPD-based …