Balanced Distributed Coding of Omnidirectional Images
This paper presents a distributed coding scheme for the representation of 3D scenes captured by a network of omnidirectional cameras. We consider a scenario where images captured at different viewpoints are encoded independently, with a balanced rate distribution among the different cameras. The distributed coding is built on a multiresolution representation and a partitioning of the visual information in each camera. The encoder then transmits one partition after entropy coding, as well as the syndrome bits resulting from the channel encoding of the other partition. The joint decoder exploits the intra-view correlation by predicting the missing source information with the help of the syndrome bits. At the same time, it exploits the inter-view correlation by using motion estimation between images from different cameras. Experiments demonstrate that the distributed coding solution performs better than a scheme where images are handled independently, while the coding rate advantageously stays balanced between the encoders.
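The syndrome mechanism described above can be illustrated with a toy sketch. Instead of transmitting a source block, the encoder sends only its syndrome under a channel code; the decoder recovers the block from the syndrome plus correlated side information (here, the prediction from another view). The example below is a minimal illustration using a Hamming(7,4) code and an assumed side-information block differing in at most one bit; the paper's actual channel code and correlation model are not specified here, so all names and parameters are illustrative.

```python
# Toy syndrome-based distributed coding with a Hamming(7,4) code.
# Column j (1-indexed) of the parity-check matrix H is the binary
# expansion of j, so a single-bit error's syndrome encodes its position.
H = [[(j >> b) & 1 for j in range(1, 8)] for b in range(3)]

def syndrome(x):
    # s_b = parity of the source bits selected by row b of H
    return tuple(sum(h * xi for h, xi in zip(row, x)) % 2 for row in H)

def encode(x):
    # The encoder transmits only 3 syndrome bits per 7-bit block,
    # instead of the block itself.
    return syndrome(x)

def decode(s, y):
    # y is the side information (e.g., a prediction from another view),
    # assumed to differ from the true block x in at most one position.
    e = tuple(a ^ b for a, b in zip(s, syndrome(y)))
    if e == (0, 0, 0):
        return list(y)          # side information already matches x
    pos = e[0] + 2 * e[1] + 4 * e[2]  # 1-indexed error position
    x_hat = list(y)
    x_hat[pos - 1] ^= 1         # correct the single mismatching bit
    return x_hat
```

The decoder computes the syndrome of the side information, XORs it with the received syndrome to obtain the syndrome of the mismatch pattern, and flips the indicated bit. Real systems replace the Hamming code with stronger channel codes and handle many-bit discrepancies, but the coset principle is the same.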