Omnidirectional views selection for scene representation

This paper proposes a new method for selecting sets of omnidirectional views that jointly contribute to an efficient representation of a 3D scene. When the 3D surface is modelled as a function on the unit sphere, the view selection problem is governed mostly by the accuracy of reconstructing the 3D surface from non-uniformly sampled data. A novel method is proposed for the reconstruction of signals on the sphere from scattered data, using a generalization of the Spherical Fourier Transform. Building on this reconstruction strategy, an algorithm is then proposed to select the best subset of $n$ views from a predefined set of viewpoints, so as to minimize the overall reconstruction error. Starting from initial viewpoints determined by the frequency distribution of the 3D scene, the algorithm iteratively refines the selection of each viewpoint in order to maximize the quality of the representation. Experiments show that the algorithm converges towards a minimal distortion, and demonstrate that the selected omnidirectional views are consistent with the frequency characteristics of the 3D scene.
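The abstract does not give the reconstruction details, so the following is only a rough illustration of the two ingredients it mentions: least-squared recovery of a band-limited spherical expansion from scattered samples (one common way to generalize the Spherical Fourier Transform to non-uniform sampling), and error-driven selection of a view subset. The degree-2 Cartesian basis, the greedy selection loop, and all function names below are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def basis(points):
    # points: (N, 3) array of unit vectors on the sphere.
    # Real spherical harmonics up to degree 2, written in Cartesian
    # form (unnormalized; least squares does not require an
    # orthonormal basis). This low degree is an assumption for the
    # sketch, not a value taken from the paper.
    x, y, z = points.T
    return np.stack([np.ones_like(x), x, y, z,
                     x * y, y * z, z * x,
                     x * x - y * y, 3.0 * z * z - 1.0], axis=1)

def fit(points, values):
    # Least-squares estimate of the expansion coefficients from
    # scattered (non-uniformly sampled) values on the sphere.
    coeffs, *_ = np.linalg.lstsq(basis(points), values, rcond=None)
    return coeffs

def select_views(view_pts, view_vals, eval_pts, eval_vals, n):
    # Greedy stand-in for the paper's iterative refinement: pick the
    # n candidate viewpoints whose pooled samples give the smallest
    # reconstruction error on a held-out evaluation set.
    chosen, remaining = [], list(range(len(view_pts)))
    for _ in range(n):
        best, best_err = None, np.inf
        for j in remaining:
            idx = chosen + [j]
            pts = np.vstack([view_pts[k] for k in idx])
            vals = np.concatenate([view_vals[k] for k in idx])
            c = fit(pts, vals)
            err = np.linalg.norm(basis(eval_pts) @ c - eval_vals)
            if err < best_err:
                best, best_err = j, err
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

In this sketch a candidate viewpoint is reduced to the set of surface samples it observes; the actual method additionally seeds the viewpoints from the frequency distribution of the scene, which is not modelled here.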

Published in:
Proceedings of the IEEE International Conference on Image Processing
Presented at:
IEEE International Conference on Image Processing, Atlanta, October, 2006



