Shape from Texture for Omnidirectional Images

In this paper, we describe a method to infer 3-D scene information from a single view captured by an omnidirectional camera. The proposed technique addresses the so-called "Shape from Texture" problem: if the textures carried by 3-D surfaces respect some a priori model, the deformation due to their projection in the image contains local information about both surface depth and orientation. To estimate this deformation, we adapt the work of Gårding and Lindeberg to the processing of spherical images. The planar multiscale procedure that allows the definition of precise texture descriptors is replaced here by a multiscale representation compatible with the compactness of the sphere. More precisely, this multiscale representation is obtained by filtering the data with dilated copies of a mother function. The spherical dilation we introduce is the gnomonic dilation, a simple variation of the stereographic dilation due to Antoine and Vandergheynst. This dilation has a simple interpretation in terms of projective geometry: it fits precisely the transformation that the apparent omnidirectional image of an object undergoes when the distance of this object to the sensor changes. A spherical texture descriptor, close to a deformation tensor, is then defined using simple filters that act as smoothed differential operators on the data. Results on a synthetic example illustrate the capabilities of the proposed method.
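The projective interpretation of a gnomonic-type dilation can be illustrated with a minimal sketch. This is our own illustration, not the paper's implementation: we assume the unit sphere, parametrize points by colatitude and longitude, and centre the dilation at the north pole, where the gnomonic (central) projection maps colatitude theta to the radius tan(theta) on the tangent plane; scaling that radius by a and projecting back gives the dilated point. The function name and parametrization are hypothetical.

```python
import numpy as np

def gnomonic_dilation(theta, phi, a):
    """Gnomonic dilation of a point on the unit sphere (illustrative sketch).

    The point (colatitude theta in [0, pi/2), longitude phi) is projected
    onto the tangent plane at the north pole through the sphere's centre
    (gnomonic projection, radius r = tan(theta)), the plane is scaled by
    the factor a, and the result is projected back onto the sphere:

        tan(theta') = a * tan(theta),   phi' = phi.

    This matches the intuition from the abstract: a scene point seen at
    angle arctan(h / d) from the optical centre is seen at angle
    arctan(a * h / d) when its distance d is divided by a.
    """
    theta_new = np.arctan(a * np.tan(theta))
    return theta_new, phi
```

For example, a unit dilation factor leaves every point fixed, while `a = 2` moves a point at colatitude pi/4 (where tan(theta) = 1) to colatitude arctan(2).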

Published in:
Proc. EUSIPCO'08
Presented at:
EUSIPCO 2008, Lausanne, Switzerland, August 25-29, 2008


Record created 2008-02-13, last modified 2020-07-30
