Abstract

This paper addresses the problem of dense disparity estimation between omnidirectional images in a spherical framework. Omnidirectional imaging offers important advantages for the representation and processing of the plenoptic function in 3D scenes, for applications such as localization or depth estimation. In this context, we propose to perform disparity estimation directly in the spherical domain, in order to avoid discrepancies caused by the inexact projection of omnidirectional images onto planes. We first rectify the omnidirectional images in the spherical domain. We then develop a global energy minimization algorithm based on graph cuts to perform disparity estimation on the sphere. Experimental results show that the proposed algorithm outperforms typical methods such as those based on block matching, on both a simple synthetic scene and complex natural scenes. The proposed method shows promising performance for dense disparity estimation and can be extended efficiently to networks of several camera sensors.
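
To illustrate the setting, the sketch below shows a minimal local (winner-take-all) disparity search on an equiangular spherical grid, assuming the two images have already been rectified so that correspondences differ only along the longitude axis; the function name, image sizes and absolute-difference cost are hypothetical choices for illustration. The paper's method instead minimizes a global data-plus-smoothness energy with graph cuts rather than taking this per-pixel minimum.

```python
# Illustrative sketch only, not the paper's algorithm: a local disparity
# search on an equiangular (theta, phi) spherical grid after rectification.
# The actual method replaces this per-pixel decision with a global energy
# (data + smoothness terms) minimized by graph cuts.
import numpy as np

def spherical_disparity_wta(left, right, max_disp=16):
    """left, right: rectified spherical images on a (theta, phi) grid,
    so that corresponding points differ only along phi (longitude)."""
    H, W = left.shape
    costs = np.full((max_disp, H, W), np.inf)
    for d in range(max_disp):
        # Shift along longitude; the spherical domain wraps around in phi.
        shifted = np.roll(right, -d, axis=1)
        costs[d] = np.abs(left - shifted)   # absolute-difference data cost
    return costs.argmin(axis=0)             # local (winner-take-all) minimum

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.random((64, 128))
    right = np.roll(left, 3, axis=1)        # synthetic 3-sample shift in phi
    disp = spherical_disparity_wta(left, right)
    print(disp.mean())                      # close to 3 on this toy example
```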
