Fast Variational Bayesian Inference for Non-Conjugate Matrix Factorization Models
Probabilistic matrix factorization methods aim to extract meaningful correlation structure from an incomplete data matrix by postulating low-rank constraints. Recently, variational Bayesian (VB) inference techniques have successfully been applied to such large-scale bilinear models. However, current algorithms are of the alternate updating or stochastic gradient descent type, slow to converge and prone to getting stuck in shallow local minima. While singular value shrinkage algorithms have been proposed for MAP or maximum-margin estimation, and can far outperform alternate updating, this methodological avenue remains unexplored for Bayesian techniques. In this paper, we show how to combine a recent singular value shrinkage characterization of fully observed spherical Gaussian VB matrix factorization with local variational bounding in order to obtain efficient VB inference for general MF models with non-conjugate likelihood potentials. In particular, we show how to handle Poisson and Bernoulli potentials, which are far better suited to most MF applications than Gaussian likelihoods. Our algorithm can be run even for very large models and is easily implemented in {\em Matlab}. It exhibits significantly better prediction performance than MAP estimation on a range of real-world datasets.
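As a rough illustration of the two ingredients named above, the sketch below shows (i) generic singular value shrinkage of a fully observed matrix via soft-thresholding after an SVD, and (ii) the standard Jaakkola-Jordan coefficient used in quadratic bounds of the logistic log-likelihood, one common choice for local variational bounding of Bernoulli potentials. The function names, the threshold `tau`, and the rank cap `max_rank` are hypothetical, and soft-thresholding is a stand-in for, not the paper's exact, analytic VB shrinkage rule.

```python
import numpy as np

def shrink_svd(Y, tau, max_rank):
    """Illustrative singular value shrinkage of a fully observed matrix Y.

    Soft-thresholds the singular values at `tau` and keeps at most `max_rank`
    components; the paper's VB characterization prescribes a different,
    analytic per-singular-value shrinkage, so treat this as a sketch only.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)                 # soft-threshold
    k = min(max_rank, int(np.count_nonzero(s_shrunk)))  # effective rank
    # Low-rank factors: columns of U scaled by shrunk singular values, and V^T.
    return U[:, :k] * s_shrunk[:k], Vt[:k, :]

def jj_lambda(xi):
    """Jaakkola-Jordan coefficient lambda(xi) = tanh(xi/2) / (4*xi).

    Appears in the quadratic lower bound of the logistic log-likelihood used
    for Bernoulli potentials (lambda(xi) -> 1/8 as xi -> 0; assumes xi != 0).
    """
    return np.tanh(xi / 2.0) / (4.0 * xi)
```

In an inner loop of the kind the abstract describes, the local bounds would turn the non-conjugate likelihood into a Gaussian-like surrogate, after which a fully observed shrinkage step of the above form can be applied; the exact coupling of the two steps is given in the paper itself.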