Bayesian classification with Gaussian processes

Abstract

We consider the problem of assigning an input vector $\mathbf{x}$ to one of $m$ classes by predicting $P(c \mid \mathbf{x})$ for $c = 1, \ldots, m$. For a two-class problem, the probability of class 1 given $\mathbf{x}$ is estimated by $s(y(\mathbf{x}))$, where $s(y) = 1/(1 + e^{-y})$. A Gaussian process prior is placed on $y(\mathbf{x})$ and is combined with the training data to obtain predictions for new $\mathbf{x}$ points. We provide a Bayesian treatment, integrating over uncertainty in $y$ and in the parameters that control the Gaussian process prior; the necessary integration over $y$ is carried out using Laplace's approximation. The method is generalized to multi-class problems $(m > 2)$ using the softmax function. We demonstrate the effectiveness of the method on a number of datasets.
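To make the two-class construction concrete, the sketch below implements Gaussian process classification with the Laplace approximation for a fixed squared-exponential covariance. It is not taken from the report: the kernel hyperparameter values, function names, and the final squashing of the approximate latent predictive distribution through the logistic function are illustrative assumptions, and the report's integration over the parameters controlling the Gaussian process prior is not reproduced here.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of inputs (assumed kernel)."""
    d2 = (np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sigmoid(y):
    return 1.0 / (1.0 + np.exp(-y))

def laplace_gp_classifier(X, t, X_star, lengthscale=1.0, variance=1.0, n_iter=20):
    """Two-class GP classification with the Laplace approximation (illustrative sketch).

    X      : (n, d) training inputs
    t      : (n,) binary targets in {0, 1}
    X_star : (m, d) test inputs
    Returns the approximate predictive probabilities P(class 1 | x_star).
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, lengthscale, variance) + 1e-8 * np.eye(n)

    # Newton iterations to find the mode of the posterior over the latent values y.
    y = np.zeros(n)
    for _ in range(n_iter):
        p = sigmoid(y)
        W = p * (1.0 - p)                 # negative Hessian of the log-likelihood
        grad = t - p                      # gradient of the log-likelihood
        B = np.eye(n) + np.sqrt(W)[:, None] * K * np.sqrt(W)[None, :]
        b = W * y + grad
        a = b - np.sqrt(W) * np.linalg.solve(B, np.sqrt(W) * (K @ b))
        y = K @ a

    # Mean and variance of the latent function at the test points under the Laplace fit.
    K_star = rbf_kernel(X, X_star, lengthscale, variance)
    f_mean = K_star.T @ (t - sigmoid(y))
    p = sigmoid(y)
    W = p * (1.0 - p)
    B = np.eye(n) + np.sqrt(W)[:, None] * K * np.sqrt(W)[None, :]
    v = np.linalg.solve(B, np.sqrt(W)[:, None] * K_star)
    k_ss = np.diag(rbf_kernel(X_star, X_star, lengthscale, variance))
    f_var = k_ss - np.sum(v * (np.sqrt(W)[:, None] * K_star), axis=0)

    # Approximate average of the sigmoid over the latent Gaussian
    # (a standard closed-form approximation, assumed here rather than the report's exact scheme).
    kappa = 1.0 / np.sqrt(1.0 + np.pi * f_var / 8.0)
    return sigmoid(kappa * f_mean)
```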

Divisions: Aston University (General)
Uncontrolled Keywords: assigning, input vector, probability, Gaussian process, training data, predictions, Bayesian treatment prior, uncertainty, Laplace, approximation, multi-class problems, softmax function
ISBN: NCRG/7/015
Last Modified: 08 Jan 2024 10:00
Date Deposited: 11 Mar 2019 17:21
PURE Output Type: Technical report
Published Date: 1997-12-13
Authors: Williams, Christopher K. I.; Barber, David
