Worst-Case Bounds for Gaussian Process Models

We present a competitive analysis of several non-parametric Bayesian algorithms in a worst-case online learning setting, where no probabilistic assumptions are made about how the data are generated. We consider models that use a Gaussian process prior (over the space of all functions) and provide bounds on the regret (under the log loss) for commonly used non-parametric Bayesian algorithms, including Gaussian regression and logistic regression, which show how these algorithms can perform favorably under rather general conditions. These bounds handle the infinite dimensionality of these non-parametric classes explicitly and in a natural way. We also make formal connections to the minimax and minimum description length (MDL) frameworks; in particular, we show precisely how Bayesian Gaussian regression is a minimax strategy.
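The quantity the abstract's bounds control is the cumulative log loss of the sequential Bayesian predictor. As a minimal illustrative sketch (not the paper's algorithm or its bounds), the following numpy code computes the cumulative log loss of online Bayesian Gaussian process regression: at each step the model predicts the next observation from all previous ones and is charged the negative log predictive density. The squared-exponential kernel, lengthscale, and noise level are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    # Squared-exponential kernel on 1-D inputs (an illustrative choice).
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def cumulative_log_loss(x, y, noise=0.1, lengthscale=1.0):
    """Cumulative log loss of the online Bayesian GP regression predictor:
    at each step t, form the Gaussian predictive distribution for y[t]
    given (x[:t], y[:t]) and add -log of its density at the observed y[t]."""
    total = 0.0
    for t in range(len(x)):
        if t == 0:
            # Prior predictive: zero mean, k(x, x) = 1 plus observation noise.
            mu, var = 0.0, 1.0 + noise
        else:
            K = rbf_kernel(x[:t], x[:t], lengthscale) + noise * np.eye(t)
            k = rbf_kernel(x[:t], x[t:t + 1], lengthscale).ravel()
            mu = k @ np.linalg.solve(K, y[:t])
            var = 1.0 + noise - k @ np.linalg.solve(K, k)
        # Negative log Gaussian density of the observed y[t].
        total += 0.5 * (np.log(2 * np.pi * var) + (y[t] - mu) ** 2 / var)
    return total
```

The paper's regret bounds compare this cumulative loss to that of the best fixed function in hindsight, penalized by its norm in the reproducing kernel Hilbert space of the prior covariance.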


Published in:
Proceedings of the 19th Annual Conference on Neural Information Processing Systems
Presented at:
Neural Information Processing Systems 18
Year:
2006




