

Finite-dimensional approximation of regression with Gaussian processes


G. Ferrari-Trecate

ETH Zurich, Zurich, Switzerland

The Bayesian analysis of neural networks is difficult because the prior over functions usually has a complex form, leading to implementations that either make approximations or use Monte Carlo integration techniques. A notable exception is the use of Gaussian process (GP) priors over functions, which permit the Bayesian analysis to be carried out exactly using matrix operations. However, GP prediction suffers from O(n^3) scaling with the data set size n. By using a finite-dimensional basis to approximate the GP predictor, the computational complexity can be reduced. We derive optimal finite-dimensional predictors under a number of assumptions and show their superiority over the Projected Bayes Regression method. We also show how to calculate the minimal model size for a given n. The calculations are backed up by numerical experiments.
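To illustrate the complexity reduction the abstract refers to, here is a minimal sketch (not the paper's actual predictor): exact GP regression requires solving an n-by-n linear system, costing O(n^3), while a finite-dimensional approximation in the style of the subset-of-regressors method with m basis points costs only O(nm^2 + m^3). The squared-exponential kernel, the noise level, and the choice of basis points below are illustrative assumptions.

```python
import numpy as np

def rbf(X1, X2, ell=0.5):
    # Squared-exponential (RBF) kernel matrix between two point sets
    d = X1[:, None, :] - X2[None, :, :]
    return np.exp(-0.5 * np.sum(d**2, axis=-1) / ell**2)

def gp_predict_exact(Xtr, ytr, Xte, noise=0.1):
    # Exact GP posterior mean: solves an n x n system, O(n^3) in n
    K = rbf(Xtr, Xtr) + noise**2 * np.eye(len(Xtr))
    alpha = np.linalg.solve(K, ytr)
    return rbf(Xte, Xtr) @ alpha

def gp_predict_finite(Xtr, ytr, Xte, Xm, noise=0.1):
    # Finite-dimensional approximation with m basis points Xm
    # (subset-of-regressors flavour): only an m x m system is solved,
    # so the cost is O(n m^2 + m^3) instead of O(n^3)
    Kmn = rbf(Xm, Xtr)
    Kmm = rbf(Xm, Xm)
    A = noise**2 * Kmm + Kmn @ Kmn.T
    w = np.linalg.solve(A, Kmn @ ytr)
    return rbf(Xte, Xm) @ w
```

When the basis points coincide with the full training set (m = n), this approximation reproduces the exact GP posterior mean; the interesting regime is m much smaller than n, where the abstract's question of the minimal adequate model size arises.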


Type of Publication: Talk

