Student project

Bayesian methods for Support Vector machines and Gaussian processes

We present a common probabilistic framework for kernel or spline smoothing methods, including popular architectures such as Gaussian processes and Support Vector machines. We identify the problem of unnormalized loss functions and suggest a general technique to overcome this problem, at least approximately. We give an intuitive interpretation of the effect an unnormalized loss function can induce, by comparing Support Vector classification (SVC) with Gaussian process classification (GPC) as a nonparametric generalization of logistic regression. This interpretation relates SVC to boosting techniques. We propose a variational Bayesian model selection algorithm for general normalized loss functions. This algorithm has wider applicability than other previously suggested Bayesian techniques and exhibits comparable performance in cases where both techniques are applicable. We present and discuss results of a substantial number of experiments in which we applied the variational algorithm to common real-world classification tasks and compared it to a range of other known methods. The wider scope of this thesis is to provide a bridge between the fields of probabilistic Bayesian techniques and Statistical Learning Theory, and we present some material of a tutorial nature which we hope will be useful to researchers in both fields.
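The unnormalized-loss issue mentioned above can be made concrete with a small sketch (our own illustration, not code from the thesis; function names are ours). Under the usual probabilistic reading, exp(-loss(y, f)) is interpreted as the likelihood P(y | f) of label y given latent value f. For the logistic loss of GPC this mass sums to exactly 1 over the two labels, so it is a proper likelihood; for the SVC hinge loss it does not, which is what "unnormalized" means here:

```python
import math

def logistic_loss(y, f):
    # GPC / logistic-regression loss: ln(1 + exp(-y*f)), y in {-1, +1}
    return math.log1p(math.exp(-y * f))

def hinge_loss(y, f):
    # SVC hinge loss: max(0, 1 - y*f), y in {-1, +1}
    return max(0.0, 1.0 - y * f)

def label_mass(loss, f):
    # Total mass of the would-be likelihood exp(-loss(y, f))
    # summed over the two possible labels.
    return sum(math.exp(-loss(y, f)) for y in (-1, +1))

for f in (0.0, 0.5, 2.0):
    print(f"f={f}: logistic mass={label_mass(logistic_loss, f):.4f}, "
          f"hinge mass={label_mass(hinge_loss, f):.4f}")
```

The logistic mass is identically 1 (since exp(-ln(1 + e^{-yf})) is the sigmoid of yf, and the two sigmoids sum to 1), whereas the hinge mass varies with f, falling below 1 near the decision boundary and exceeding 1 for confident predictions. This is the sense in which exp(-hinge) cannot be used directly as a likelihood without the kind of correction the abstract alludes to.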