
Thesis

Subtree-based bounds and simulation-based estimations for the partition function

Many algorithms developed in statistical physics, coding theory, signal processing, and artificial intelligence can be expressed by graphical models and solved, either exactly or approximately, with iterative message-passing algorithms on the model. One quantity of interest in these algorithms is the partition function. In graphical models without cycles (trees), the partition function can be computed efficiently by message-passing algorithms such as the generalized distributive law (GDL) or the sum-product algorithm. In contrast, when graphical models contain cycles, computing the partition function is in general intractable. The contributions of this dissertation are twofold: deriving deterministic upper and lower bounds on the partition function, and developing Monte Carlo methods to approximate it. Specifically, we obtain subtree-based upper and lower bounds that yield theoretical results on optimality properties of the minimum-entropy subtree and ultimately lead to a greedy algorithm. Finally, we introduce and analyze a number of estimators that use Gibbs sampling to draw samples from different target distributions in order to estimate the partition function. In one estimator, we demonstrate a novel strategy that combines Gibbs sampling with message-passing algorithms on trees.
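The tree case mentioned in the abstract can be illustrated with a minimal sketch. The model below (a three-node chain of binary variables with arbitrary positive unary and pairwise potentials) is a hypothetical example, not taken from the thesis; it shows how a leaf-to-root sum-product pass recovers the same partition function as brute-force enumeration, in time linear in the number of nodes rather than exponential.

```python
import itertools

# Hypothetical tree: a 3-node chain 0 - 1 - 2 over binary variables.
# phi[i][x] are unary potentials; psi[(i, j)][xi][xj] are pairwise potentials.
# All values are arbitrary positive numbers chosen for illustration.
phi = {0: [1.0, 2.0], 1: [0.5, 1.5], 2: [2.0, 1.0]}
psi = {(0, 1): [[1.0, 0.2], [0.2, 1.0]],
       (1, 2): [[0.7, 1.3], [1.3, 0.7]]}

def brute_force_Z():
    # Sum the product of all factors over the 2^3 configurations.
    Z = 0.0
    for x in itertools.product([0, 1], repeat=3):
        w = phi[0][x[0]] * phi[1][x[1]] * phi[2][x[2]]
        w *= psi[(0, 1)][x[0]][x[1]] * psi[(1, 2)][x[1]][x[2]]
        Z += w
    return Z

def sum_product_Z():
    # Leaf-to-root pass with node 1 as root.
    # Message from leaf i to neighbour j: m(xj) = sum_xi phi_i(xi) psi(xi, xj).
    m01 = [sum(phi[0][xi] * psi[(0, 1)][xi][xj] for xi in (0, 1))
           for xj in (0, 1)]
    m21 = [sum(phi[2][xk] * psi[(1, 2)][xj][xk] for xk in (0, 1))
           for xj in (0, 1)]
    # Z is the sum at the root of its potential times all incoming messages.
    return sum(phi[1][xj] * m01[xj] * m21[xj] for xj in (0, 1))
```

By the distributive law, the two functions agree exactly on any tree-structured model; on graphs with cycles no such exact single-pass scheme exists, which is what motivates the bounds and sampling estimators developed in the thesis.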
