Wide-Baseline Foreground Object Interpolation Using Silhouette Shape Prior

We consider the synthesis of intermediate views of an object captured by two widely spaced and calibrated cameras. This problem is challenging because foreshortening effects and occlusions induce significant differences between the reference images when the cameras are far apart, which makes the association, appearance, and disappearance of pixels between views difficult to estimate. Our main contribution lies in disambiguating this ill-posed problem by making the interpolated views consistent with a plausible transformation of the object silhouette between the reference views. This plausible transformation is derived from an object-specific prior that consists of a nonlinear shape manifold learned from multiple previous observations of this object by the two reference cameras. The prior is used to estimate the evolution of the epipolar silhouette segments between the reference views. This information directly supports the definition of epipolar silhouette segments in the intermediate views, as well as the synthesis of textures in those segments. It permits reconstruction of the epipolar plane images (EPIs) and the continuum of views associated with the EPI volume, obtained by aggregating the EPIs. Experiments on synthetic and natural images show that our method preserves the object topology in intermediate views and deals effectively with the self-occluded regions and the severe foreshortening effect associated with wide-baseline camera configurations.
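To make the EPI terminology concrete, the following is a minimal NumPy sketch (not the authors' code) of how an epipolar plane image relates to a stack of rectified views: stacking the views along the camera-position axis forms an EPI volume, and fixing one scanline yields a 2-D EPI in which a scene point traces a straight line whose slope encodes its disparity. All sizes and the synthetic disparity value here are illustrative assumptions.

```python
import numpy as np

n_views, height, width = 8, 64, 96
# Hypothetical stack of rectified grayscale views, shape (views, height, width).
views = np.zeros((n_views, height, width))
d = 2  # assumed disparity: the point shifts d columns per view step
for v in range(n_views):
    # A point at constant depth moves linearly with the view index,
    # so it traces a straight line in the EPI.
    views[v, :, 10 + d * v] = 1.0

epi_volume = views          # aggregating views along axis 0 forms the EPI volume
y = 32
epi = epi_volume[:, y, :]   # one EPI: camera position versus image column

# Recover the column of the bright point in each view.
cols = epi.argmax(axis=1)
print(cols.tolist())  # → [10, 12, 14, 16, 18, 20, 22, 24]
```

View interpolation in this representation amounts to filling in the missing rows of each EPI, which is why a prior on how silhouette segments evolve along the baseline directly constrains the synthesized intermediate views.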

Published in:
IEEE Transactions on Image Processing, vol. 26, no. 11, pp. 5477-5490
Piscataway, NJ: IEEE-Inst Electrical Electronics Engineers Inc

Record created 2017-10-11, last modified 2018-12-03
