Learning for Structured Prediction Using Approximate Subgradient Descent with Working Sets

We propose a working-set-based approximate subgradient descent algorithm to minimize the margin-sensitive hinge loss arising from the soft constraints in max-margin learning frameworks such as the structured SVM. We focus on general graphical models, such as the loopy MRFs and CRFs commonly used in image segmentation, where exact inference is intractable and the most violated constraints can only be approximated. This voids the optimality guarantees of the structured SVM's cutting plane algorithm and reduces the robustness of existing subgradient-based methods. We show that the proposed method obtains better approximate subgradients through the use of working sets, leading to improved convergence properties and increased reliability. Furthermore, our method allows new constraints to be randomly sampled instead of computed with more expensive approximate inference techniques such as belief propagation and graph cuts, which reduces learning time at only a small cost in performance. We demonstrate the strength of our method empirically on the segmentation of a new publicly available electron microscopy dataset as well as the popular MSRC dataset, and show state-of-the-art results.
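The abstract's core idea, maintaining a per-example working set of candidate labelings and taking subgradient steps against the most violated one, can be illustrated with a small sketch. The snippet below is not the paper's algorithm; it is a simplified stand-in using a multiclass problem in place of a structured output space, with the structured (margin-rescaled) hinge loss, a 0/1 task loss, and random sampling of new candidates in place of approximate inference. All function and variable names are illustrative.

```python
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phi(x, y, n_classes):
    # Joint feature map: place x in the block corresponding to label y.
    f = [0.0] * (len(x) * n_classes)
    f[y * len(x):(y + 1) * len(x)] = x
    return f

def subgradient_descent_ws(data, n_classes, epochs=50, lr=0.1, seed=0):
    """Toy working-set subgradient descent on the structured hinge loss.

    data: list of (feature_vector, true_label) pairs.
    New candidate labelings are randomly sampled (standing in for
    approximate inference); subgradient steps use the most violated
    candidate found within the working set.
    """
    rng = random.Random(seed)
    n_feats = len(data[0][0])
    w = [0.0] * (n_feats * n_classes)
    # One working set per training example, seeded with the true label.
    working = {i: {y} for i, (_, y) in enumerate(data)}
    for _ in range(epochs):
        for i, (x, y) in enumerate(data):
            # Cheaply grow the working set by sampling a candidate labeling.
            working[i].add(rng.randrange(n_classes))
            # Most violated constraint *within the working set*:
            # argmax over candidates of task loss + score.
            def violation(yh):
                return (0.0 if yh == y else 1.0) + dot(w, phi(x, yh, n_classes))
            y_hat = max(working[i], key=violation)
            # Subgradient step only if the margin constraint is violated.
            if violation(y_hat) > dot(w, phi(x, y, n_classes)):
                g = [a - b for a, b in
                     zip(phi(x, y_hat, n_classes), phi(x, y, n_classes))]
                w = [wi - lr * gi for wi, gi in zip(w, g)]
    return w
```

In this sketch the working set only ever grows, and the per-step maximization is over that small set rather than the full (exponential, in the structured case) output space; the paper's contribution is showing that this yields better approximate subgradients when exact loss-augmented inference is intractable.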

Presented at:
Conference on Computer Vision and Pattern Recognition (CVPR), Portland, Oregon, USA, June 23-28, 2013

Record created 2013-03-18, last modified 2018-03-17

Supplementary material:
Download fulltext PDF