Pseudo-Marginal Slice Sampling
Iain Murray and Matthew M. Graham.
Markov chain Monte Carlo (MCMC) methods asymptotically sample from complex probability distributions. The pseudo-marginal MCMC framework only requires an unbiased estimator of the target's unnormalized probability density to construct a Markov chain. However, the resulting chains are harder to tune to a target distribution than conventional MCMC, and the types of updates available are limited. We describe a general way to clamp and update the random numbers used in a pseudo-marginal method’s unbiased estimator. In this framework we can use slice sampling and other adaptive methods. We obtain more robust Markov chains, which often mix more quickly.
Appeared in The Proceedings of the 19th International Conference on Artificial Intelligence and Statistics (AISTATS), JMLR W&CP 51:911–919, 2016. [PDF, DjVu, GoogleViewer, arXiv, BibTeX]
Matt made a nice poster.
Code:
- A simple Python module — probably the easiest way to quickly combine pseudo-marginal slice sampling with other code. If random-number generation is a significant cost, you may want a faster, more custom implementation.
- Matt Graham’s broader Python code and GP demonstrations
- For completeness: Toy and Ising experiments code (Matlab and C). Less recommended!
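The clamp-and-update idea above can be sketched in a few lines. This is a hedged illustration, not the paper's code: the joint target is f(θ, u) N(u; 0, I), where f is an unbiased estimator driven by standard-normal random numbers u. We alternate a slice-sampling update of θ given u with an elliptical slice sampling update of u given θ. The toy estimator below (a log-normal-noise perturbation of a standard normal density) is my own choice for demonstration; its noise averages out, so the marginal over θ is exactly N(0, 1).

```python
import numpy as np

def ess_update_u(u, logf, rng):
    """Elliptical slice sampling move on the auxiliary draws u,
    which have a standard normal prior under the joint target."""
    v = rng.standard_normal(u.shape)
    log_y = logf(u) + np.log(rng.uniform())
    phi = rng.uniform(0.0, 2 * np.pi)
    phi_min, phi_max = phi - 2 * np.pi, phi
    while True:
        u_prop = u * np.cos(phi) + v * np.sin(phi)
        if logf(u_prop) > log_y:
            return u_prop
        # Shrink the angle bracket towards the current state.
        if phi > 0:
            phi_max = phi
        else:
            phi_min = phi
        phi = rng.uniform(phi_min, phi_max)

def slice_update_theta(theta, logf, rng, w=1.0):
    """Univariate stepping-out slice sampler on theta."""
    log_y = logf(theta) + np.log(rng.uniform())
    left = theta - w * rng.uniform()
    right = left + w
    while logf(left) > log_y:   # step out
        left -= w
    while logf(right) > log_y:
        right += w
    while True:                 # shrink until accepted
        prop = rng.uniform(left, right)
        if logf(prop) > log_y:
            return prop
        if prop < theta:
            left = prop
        else:
            right = prop

# Toy unbiased estimator of the N(0,1) density (illustrative choice):
# p_hat(theta) = exp(-theta^2/2) * exp(sigma*u - sigma^2/2), u ~ N(0,1).
sigma = 0.5
def log_estimator(theta, u):
    return -0.5 * theta**2 + sigma * u[0] - 0.5 * sigma**2

rng = np.random.default_rng(0)
theta, u = 0.0, rng.standard_normal(1)
samples = []
for _ in range(5000):
    theta = slice_update_theta(theta, lambda t: log_estimator(t, u), rng)
    u = ess_update_u(u, lambda uu: log_estimator(theta, uu), rng)
    samples.append(theta)
samples = np.asarray(samples[500:])  # discard burn-in
```

Because both moves are slice samplers, there are no rejection-rate step sizes to tune, which is the robustness advantage the abstract refers to.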
See also: This paper uses Elliptical Slice Sampling. Pseudo-marginal slice sampling could be used in MCMC-ABC; however, you might also want to check out an ABC alternative instead. If you're interested in work in this space, you should also check out Matt and Amos's work on constrained HMC applied to some ABC-like problems.