Abstract

Variational methods have proved popular and effective for inference and learning in intractable graphical models. An attractive feature of approaches based on the Kullback-Leibler divergence is that they yield a rigorous lower bound on the normalization constant of undirected models. In this work we explore the idea of using auxiliary variables to improve on the lower bound of standard mean field methods. Our approach forms a more powerful class of approximations than any structured mean field technique. Furthermore, the existing lower bounds of variational mixture models can be seen as computationally expensive special cases of our method. A byproduct of our work is an efficient way to calculate, for any set of tractable distributions, a set of mixture coefficients that provably improves on a flat combination.
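
To make the setting concrete, the following sketch (our notation and assumptions, not reproduced from the paper) shows the standard KL-based lower bound and the kind of auxiliary-variable construction the abstract refers to:

```latex
% Sketch, assuming an undirected model p(x) = \tilde{p}(x)/Z
% with unnormalized potential \tilde{p}(x) = \exp(\Phi(x)).

% Standard bound: non-negativity of KL(q(x) \| p(x)) gives,
% for any tractable variational distribution q(x),
\log Z \;\ge\; \langle \Phi(x) \rangle_{q(x)} + H[q(x)] .

% Auxiliary-variable bound (sketch): augment q(x) to q(x, a)
% with an auxiliary variable a, and choose any conditional
% distribution r(a \mid x); non-negativity of
% KL(q(x,a) \| p(x)\, r(a \mid x)) then gives
\log Z \;\ge\; \big\langle \Phi(x) + \log r(a \mid x)
        - \log q(x, a) \big\rangle_{q(x,a)} .

% For discrete a, the marginal q(x) = \sum_a q(a)\, q(x \mid a)
% is a mixture of tractable components, so mixture-of-mean-field
% bounds appear as a special case. Optimizing over the mixture
% weights q(a) can only tighten the bound relative to flat
% (uniform) weights, since the flat choice remains feasible.
```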
