WASP: Scalable Bayes via barycenters of subset posteriors

The promise of Bayesian methods for big data sets has not been fully realized, due to the lack of scalable computational algorithms. For massive data, it is necessary to store and process subsets on different machines in a distributed manner. We propose a simple, general, and highly efficient approach, which first runs a posterior sampling algorithm in parallel on different machines for subsets of a large data set. To combine these subset posteriors, we calculate the Wasserstein barycenter via an efficient linear program. The resulting estimate, the Wasserstein posterior (WASP), has an atomic form, facilitating straightforward estimation of posterior summaries of functionals of interest. The WASP approach allows posterior sampling algorithms for smaller data sets to be trivially scaled to huge data. We provide theoretical justification in terms of posterior consistency and algorithm efficiency. Examples are provided in complex settings including Gaussian process regression and nonparametric Bayes mixture models.
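To make the combining step concrete, below is a minimal, illustrative sketch (not the authors' code) of how subset posterior draws could be merged into an atomic Wasserstein barycenter by solving the fixed-support 2-Wasserstein barycenter linear program with SciPy. The function name, the choice of pooled draws as the support, and the use of scipy.optimize.linprog are illustrative assumptions; the paper's own LP formulation and subset sampling details (e.g., how each subset posterior is obtained) are not reproduced here.

```python
# Illustrative sketch of the WASP combining step (assumptions noted above):
# given atomic subset posteriors, find barycenter weights over a fixed set of
# support atoms by solving the 2-Wasserstein barycenter LP.
import numpy as np
from scipy.optimize import linprog

def wasserstein_barycenter_lp(subset_samples, support):
    """Fixed-support 2-Wasserstein barycenter of atomic subset posteriors.

    subset_samples : list of (m_j, d) arrays of posterior draws, one per subset.
    support        : (N, d) array of candidate atoms for the barycenter
                     (e.g., the pooled subset draws).
    Returns the barycenter weights over `support`.
    """
    K, N = len(subset_samples), support.shape[0]
    sizes = [s.shape[0] for s in subset_samples]
    n_vars = N + sum(N * m for m in sizes)   # weights a, then transport plans T_1..T_K

    # Objective: average over subsets of <C_j, T_j>, with C_j the squared-distance costs.
    c = np.zeros(n_vars)
    offset = N
    for s, m in zip(subset_samples, sizes):
        C = ((support[:, None, :] - s[None, :, :]) ** 2).sum(-1)
        c[offset:offset + N * m] = C.ravel() / K
        offset += N * m

    A_eq, b_eq = [], []
    offset = N
    for m in sizes:
        # Row sums of T_j must equal the (unknown) barycenter weights a.
        for i in range(N):
            row = np.zeros(n_vars)
            row[i] = -1.0
            row[offset + i * m: offset + (i + 1) * m] = 1.0
            A_eq.append(row); b_eq.append(0.0)
        # Column sums of T_j must equal the subset posterior weights (uniform 1/m).
        for k in range(m):
            row = np.zeros(n_vars)
            row[offset + k: offset + N * m: m] = 1.0
            A_eq.append(row); b_eq.append(1.0 / m)
        offset += N * m

    res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(0, None)] * n_vars, method="highs")
    return res.x[:N]   # atomic weights of the combined (WASP-style) posterior
```

Because the result is an atomic measure over the pooled draws, posterior summaries of functionals (means, quantiles, credible intervals) can be read off as weighted averages of the atoms, in line with the abstract's description.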

Presented at:
The 18th International Conference on Artificial Intelligence and Statistics, San Diego, USA, May 9-12, 2015

Record created 2015-01-27, last modified 2018-03-17
