Gaussian processes are a natural way of defining prior distributions over functions of one or more input variables. In a simple nonparametric regression problem, where such a function gives the mean of a Gaussian distribution for an observed response, a Gaussian process model can easily be implemented using matrix computations that are feasible for datasets of up to about a thousand cases. Hyperparameters that define the covariance function of the Gaussian process can be sampled using Markov chain methods. Regression models in which the noise has a t distribution, as well as logistic and probit models for classification, can also be implemented, by additionally sampling latent values underlying the observations. Software is now available that implements these methods using covariance functions with hierarchical parameterizations. Models defined in this way can discover high-level properties of the data, such as which inputs are relevant to predicting the response.
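To illustrate the basic matrix computations referred to above, here is a minimal sketch of Gaussian process regression prediction, assuming a squared-exponential covariance function with fixed, illustrative hyperparameters (scale, length, noise). It is not the report's software and does not show the hierarchical parameterization or the Markov chain sampling of hyperparameters; it only shows why the cost is dominated by an n-by-n matrix factorization, which is feasible for up to about a thousand cases.

    import numpy as np

    def sq_exp_cov(x1, x2, scale=1.0, length=1.0):
        # Squared-exponential covariance between two sets of 1-D inputs.
        d = x1[:, None] - x2[None, :]
        return scale**2 * np.exp(-0.5 * (d / length)**2)

    def gp_predict(x_train, y_train, x_test, noise=0.1, scale=1.0, length=1.0):
        # Posterior mean and variance of the latent function at x_test,
        # using the standard Cholesky-based matrix computations (O(n^3)).
        K = sq_exp_cov(x_train, x_train, scale, length) + noise**2 * np.eye(len(x_train))
        K_s = sq_exp_cov(x_train, x_test, scale, length)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
        mean = K_s.T @ alpha
        v = np.linalg.solve(L, K_s)
        var = sq_exp_cov(x_test, x_test, scale, length).diagonal() - np.sum(v**2, axis=0)
        return mean, var

    # Example: noisy observations of a smooth function.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 50)
    y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)
    mean, var = gp_predict(x, y, np.linspace(0, 1, 200))

In the models described in the report, hyperparameters such as the scale and length above would not be fixed, but given prior distributions and sampled by Markov chain methods; for t-distributed noise or classification, latent function values would be sampled as well.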
Technical Report No. 9702, Dept. of Statistics (January 1997), 24 pages. Available in postscript and pdf, with associated software.
Also available from arXiv.org.
Neal, R. M. (1998) "Regression and classification using Gaussian process priors" (with discussion), in J. M. Bernardo et al. (editors), Bayesian Statistics 6, Oxford University Press, pp. 475-501. Abstract, postscript (without discussion), pdf (without discussion), and associated software available.