Elliptical slice sampling

Iain Murray, Ryan Prescott Adams and David J.C. MacKay.

Many probabilistic models introduce strong dependencies between variables using a latent multivariate Gaussian distribution or a Gaussian process. We present a new Markov chain Monte Carlo algorithm for performing inference in models with multivariate Gaussian priors. Its key properties are: 1) it has simple, generic code applicable to many models, 2) it has no free parameters, 3) it works well for a variety of Gaussian process based models. These properties make our method ideal for use while model building, removing the need to spend time deriving and tuning updates for more complex algorithms.
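The update itself is short. Below is a minimal Python/NumPy sketch of a single elliptical slice sampling transition, assuming a zero-mean Gaussian prior with covariance Sigma and a user-supplied log-likelihood function log_lik; the function name and signature are illustrative only and do not reproduce the released Matlab code.

import numpy as np

def elliptical_slice_step(f, Sigma, log_lik, rng=None):
    # One transition leaving  N(0, Sigma) * exp(log_lik(f))  invariant.
    rng = np.random.default_rng() if rng is None else rng
    nu = rng.multivariate_normal(np.zeros(len(f)), Sigma)   # auxiliary draw from the prior
    log_y = log_lik(f) + np.log(rng.uniform())               # log of the slice threshold

    theta = rng.uniform(0.0, 2.0 * np.pi)                    # initial proposal angle
    theta_min, theta_max = theta - 2.0 * np.pi, theta        # bracket containing theta = 0

    while True:
        f_new = f * np.cos(theta) + nu * np.sin(theta)       # point on the ellipse through f and nu
        if log_lik(f_new) > log_y:
            return f_new                                      # on the slice: accept
        # otherwise shrink the bracket towards theta = 0 and try again
        if theta < 0.0:
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)

Each shrinking step reduces the bracket, so the loop terminates (theta = 0 recovers the current state f), and no step-size parameters need tuning.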

Appeared in the Proceedings of the 13th International Conference on Artificial Intelligence and Statistics (AISTATS), JMLR W&CP 9:541–548, 2010.

Code

Complete code to reproduce all of the reported results is available: ess_code.tar.gz. However, users of the algorithm probably just want the single stand-alone Matlab/Octave file gppu_elliptical.m. Note: Michalis Titsias points out that the function add_call_counter.m in the tar-ball is inefficient; it would be better to count likelihood evaluations in another way.

Since the paper was published I have used a version with a slightly modified interface. The new version is available as elliptical_slice.m. This function can optionally be passed a prior sample instead of the Cholesky factorization of the prior covariance, which may be more efficient in some applications.
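The reason either input works is that the auxiliary variable used by the sampler is itself a draw from the prior, so it can be generated internally from a Cholesky factor or supplied directly by the caller. A hedged Python/NumPy illustration of the two options (not the actual Matlab interface) follows.

import numpy as np

rng = np.random.default_rng(0)
n = 5
Sigma = np.eye(n) + 0.5 * np.ones((n, n))        # example prior covariance (illustrative)

# Option 1: the sampler is given chol(Sigma) and draws the auxiliary variable itself.
L = np.linalg.cholesky(Sigma)
nu = L @ rng.standard_normal(n)

# Option 2: the caller supplies a prior sample directly, so no factorization is
# needed inside the sampler; this can be cheaper when prior draws are already
# produced elsewhere in the application.
nu = rng.multivariate_normal(np.zeros(n), Sigma)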

Elliptical slice sampling has been ported and used by other researchers. For example, there is a translation to Python/Numpy by Jo Bovy, and it is the default MCMC method within the GPstuff toolbox.