Gaussian Processes and Fast Matrix-Vector Multiplies

Iain Murray, 2009.

Gaussian processes (GPs) provide a flexible framework for probabilistic regression. The necessary computations involve standard matrix operations. There have been several attempts to accelerate these operations based on fast kernel matrix-vector multiplications. By focussing on the simplest GP computation, corresponding to test-time predictions in kernel ridge regression, we conclude that simple approximations based on clusterings in a kd-tree can never work well for simple regression problems. Analytical expansions can provide speedups, but current implementations are limited to the squared-exponential kernel and low-dimensional problems. We discuss future directions.
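The "simplest GP computation" referred to above is the test-time predictive mean, which reduces to a kernel matrix-vector multiply once the training-time linear solve has been done. A minimal NumPy sketch (toy data and a squared-exponential kernel; all names here are illustrative, not from the report):

```python
import numpy as np

def sq_exp_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))                   # toy training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
noise_var = 0.01

# Training-time solve (cubic in n, done once):
K = sq_exp_kernel(X, X) + noise_var * np.eye(len(X))
alpha = np.linalg.solve(K, y)

# Test-time prediction is a kernel matrix-vector multiply, K_* @ alpha --
# the operation that fast-MVM schemes (kd-trees, analytical expansions)
# aim to accelerate.
X_star = rng.standard_normal((5, 2))
mean = sq_exp_kernel(X_star, X) @ alpha
```

This is exactly kernel ridge regression at test time: fast-MVM methods try to approximate the `K_* @ alpha` product in sub-linear time per test point.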


The talk, available online, covered slightly different material.