Abstract

We show how variational Bayesian inference can be implemented for generalized linear models on very large binary classification problems. Our relaxation is shown to be a convex problem for any log-concave model, and we provide an efficient double loop algorithm for solving it. Scalability is attained by decoupling the criterion, so that most of the work can be done by solving large linear systems. We employ our method for Bayesian active learning on large binary classification tasks and provide an algorithm to efficiently update our posterior representation when new observations are sequentially included.
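
The ingredients the abstract describes (a Gaussian variational posterior, local variational parameters, and inner steps that reduce to solving linear systems) can be illustrated on a small scale. The sketch below is not the paper's convex double loop algorithm; it is variational Bayesian logistic regression using the classical Jaakkola-Jordan local bound, shown only as an example of the same pattern, where an outer loop over variational parameters alternates with an inner step that solves a linear system for the Gaussian posterior. All names (fit_vb_logistic, prior_var, and so on) are illustrative, not from the paper.

```python
# Minimal sketch (not the paper's algorithm): variational Bayesian logistic
# regression with the Jaakkola-Jordan bound. Outer loop: update local
# variational parameters xi. Inner step: solve a linear system for the
# Gaussian posterior mean and form its covariance.
import numpy as np

def fit_vb_logistic(X, y, prior_var=1.0, n_outer=50, tol=1e-6):
    """X: (n, d) features; y: (n,) labels in {-1, +1}.
    Returns the Gaussian posterior mean and covariance over the weights."""
    n, d = X.shape
    prior_prec = np.eye(d) / prior_var
    xi = np.ones(n)                                # local variational parameters
    for _ in range(n_outer):
        lam = np.tanh(xi / 2.0) / (4.0 * xi)       # lambda(xi) of the JJ bound
        # Inner step: posterior precision/mean obtained from a linear system.
        A = prior_prec + 2.0 * (X.T * lam) @ X     # posterior precision
        b = 0.5 * X.T @ y                          # linear term (zero prior mean)
        m = np.linalg.solve(A, b)                  # posterior mean
        S = np.linalg.inv(A)                       # posterior covariance
        # Outer step: xi_i^2 = E[(x_i^T w)^2] under the current posterior.
        xi_new = np.sqrt(np.einsum('nd,de,ne->n', X, S + np.outer(m, m), X))
        if np.max(np.abs(xi_new - xi)) < tol:
            xi = xi_new
            break
        xi = xi_new
    return m, S

# Toy usage: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
m, S = fit_vb_logistic(X, y)
print("posterior mean:", m)
```

At scale, the explicit inverse above would be avoided; the point of the decoupling described in the abstract is precisely that the dominant cost can be pushed into solving large linear systems, for which iterative solvers can be used.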
