000214918 001__ 214918
000214918 005__ 20180913063514.0
000214918 020__ $$a978-1-4799-9988-0
000214918 02470 $$2ISI$$a000388373401059
000214918 037__ $$aCONF
000214918 245__ $$aGraph-Based Representation and Coding of Multiview Images
000214918 269__ $$a2016
000214918 260__ $$bIEEE$$c2016$$aNew York
000214918 300__ $$a5
000214918 336__ $$aConference Papers
000214918 490__ $$aInternational Conference on Acoustics Speech and Signal Processing ICASSP
000214918 520__ $$aInstead of lossily coding depth images, which results in undesirable geometric distortion, graph-based representation (GBR) describes disparity information as a graph with controllable accuracy. In this paper, we propose a more compact graphical representation, called GBR-plus, to code both the disparity and color information of a target view given a reference view. Specifically, we first differentiate between disocclusion holes (spatial regions occluded in the reference view) and rounding holes (regions insufficiently sampled in the reference view) in the synthesized target view, so that the decoder can optionally fill rounding holes via signal interpolation without any coding overhead. Second, we use a compact graphical representation, coded losslessly, to delimit the disparity-shifted object boundaries in the target view. Finally, color pixels in disocclusion holes are predicted from adjacent background pixels, and the prediction residuals in a local neighborhood are coded using the Graph Fourier Transform (GFT). Experimental results show that GBR-plus outperforms the previous GBR and achieves performance comparable to HEVC at mid-to-high bitrates, with lower encoder complexity.
000214918 6531_ $$a3D imaging
000214918 6531_ $$agraphical representation
000214918 6531_ $$agraph Fourier transform
000214918 700__ $$aMotz, Bénédicte
000214918 700__ $$aCheung, Gene
000214918 700__ $$g101475$$aFrossard, Pascal$$0241061
000214918 7112_ $$aIEEE ICASSP
000214918 773__ $$tProceedings of IEEE ICASSP
000214918 909C0 $$xU10851$$0252393$$pLTS4
000214918 909CO $$pconf$$pSTI$$ooai:infoscience.tind.io:214918
000214918 917Z8 $$x101475
000214918 917Z8 $$x101475
000214918 937__ $$aEPFL-CONF-214918
000214918 973__ $$rREVIEWED$$sPUBLISHED$$aEPFL
000214918 980__ $$aCONF