Differentially Private Statistical Inference through β-Divergence One Posterior Sampling
Differential privacy guarantees allow the results of a statistical analysis involving sensitive
data to be released without compromising the privacy of any individual taking part …
The Impact of Loss Estimation on Gibbs Measures
In recent years, the shortcomings of Bayes posteriors as inferential devices have received
increased attention. A popular strategy for fixing them has been to instead target a Gibbs …
On the meaning of uncertainty for ethical AI: philosophy and practice
Whether and how data scientists, statisticians and modellers should be accountable for the
AI systems they develop remains a controversial and highly debated topic, especially given …
Generalised Bayes Linear Inference
Motivated by big data and the vast parameter spaces in modern machine learning models,
optimisation approaches to Bayesian inference have seen a surge in popularity in recent …
Sampling from Density power divergence-based Generalized posterior distribution via Stochastic optimization
Robust Bayesian inference using density power divergence (DPD) has emerged as a
promising approach for handling outliers in statistical estimation. While the DPD-based …