On a class of Gibbs sampling over networks
We consider the sampling problem from a composite distribution whose potential (negative log density) is $\sum_{i=1}^{n} f_i(x_i) + \sum_{j=1}^{m} g_j(y_j) + \sum_{i=1}^{n}\sum_{j…$
DP-Fast MH: Private, fast, and accurate Metropolis-Hastings for large-scale Bayesian inference
Bayesian inference provides a principled framework for learning from complex data and
reasoning under uncertainty. It has been widely applied in machine learning tasks such as …
Asymptotically optimal exact minibatch Metropolis-Hastings
Metropolis-Hastings (MH) is a commonly-used MCMC algorithm, but it can be intractable on
large datasets due to requiring computations over the whole dataset. In this paper, we …
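The accept/reject step that these minibatch variants approximate is the standard Metropolis-Hastings update. A minimal random-walk sketch for a 1D unnormalized target (all function names and parameters here are illustrative, not from the paper):

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=2.0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.

    log_target: unnormalized log density of the target distribution.
    Returns the list of visited states (including repeats on rejection).
    """
    x = x0
    samples = []
    for _ in range(n_steps):
        y = x + random.gauss(0.0, step)            # symmetric proposal
        log_alpha = log_target(y) - log_target(x)  # log acceptance ratio
        if math.log(random.random()) < log_alpha:
            x = y                                  # accept; otherwise keep x
        samples.append(x)
    return samples

# Example target: standard normal, log density up to an additive constant.
random.seed(0)
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(draws) / len(draws)
```

The exact version above evaluates `log_target` on the full dataset at every step; the minibatch methods surveyed here replace that evaluation with cheaper stochastic estimates.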
Markov chain Monte Carlo without evaluating the target: an auxiliary variable approach
W Yuan, G Wang - arxiv.org
…a DA extension that exploits …
Advances in approximate inference: combining VI and MCMC and improving on Stein discrepancy
W Gong - 2022 - repository.cam.ac.uk
In the modern world, machine learning, including deep learning, has become an
indispensable part of many intelligent systems, helping people automate the decision …
Where is the normative proof? Assumptions and contradictions in ML fairness research
AF Cooper - arXiv preprint arXiv:2010.10407, 2020 - arxiv.org
Across machine learning (ML) sub-disciplines researchers make mathematical assumptions
to facilitate proof-writing. While such assumptions are necessary for providing mathematical …
Improving sampling accuracy of stochastic gradient MCMC methods via non-uniform subsampling of gradients
Many Markov Chain Monte Carlo (MCMC) methods leverage gradient information of the
potential function of target distribution to explore sample space efficiently. However …
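The premise of non-uniform subsampling, that a well-chosen sampling distribution over data points reduces the variance of the one-sample stochastic gradient estimator, can be verified exactly in a toy setting. The Gaussian model and all names below are illustrative assumptions, not the paper's construction:

```python
import random

random.seed(1)
# Toy 1D setting: per-datum gradients of a Gaussian log-likelihood at theta.
data = [random.gauss(2.0, 1.0) for _ in range(100)]
theta = 0.0
grads = [y - theta for y in data]   # d/dtheta log N(y | theta, 1) = y - theta
n = len(grads)
full_grad = sum(grads)

def estimator_variance(probs):
    """Exact variance of the unbiased one-sample estimator g_i / p_i, i ~ probs."""
    return sum(p * (g / p - full_grad) ** 2 for g, p in zip(grads, probs))

# Uniform subsampling vs. importance probabilities proportional to |g_i|
# (the variance-minimizing choice when gradient magnitudes are available).
uniform = [1.0 / n] * n
total = sum(abs(g) for g in grads)
importance = [abs(g) / total for g in grads]

var_uniform = estimator_variance(uniform)
var_importance = estimator_variance(importance)
# By Cauchy-Schwarz, var_importance <= var_uniform for any gradient configuration.
```

In practice the exact per-datum gradient magnitudes are unknown, so methods of this kind substitute cheap proxies or bounds for `|g_i|`; the inequality above is the motivation, not the algorithm.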
Skip the Steps: Data-Free Consistency Distillation for Diffusion-based Samplers
PJ Dube, A Bera, R Zhang - openreview.net
Sampling from probability distributions is a fundamental task in machine learning and
statistics. However, most existing algorithms require numerous iterative steps to transform a …