Automatic View Synthesis by Image-Domain-Warping
Nikolce Stefanoski (Disney Research Zurich)
Oliver Wang (Disney Research Zurich)
Manuel Lang (Disney Research, Zürich/ETH Joint PhD)
Simon Heinzle (Disney Research Zurich)
Aljoscha Smolic (Disney Research Zurich)
Today, stereoscopic 3D (S3D) cinema is already mainstream, and almost all new display devices for the home support S3D content. An S3D distribution infrastructure to the home is already partially established in the form of 3D Blu-ray discs, video-on-demand services, and television channels. However, the necessity to wear glasses is often considered an obstacle that hinders broader acceptance of this technology in the home. Multiview autostereoscopic displays enable a glasses-free perception of S3D content for several observers simultaneously, and support head-motion parallax within a limited range. In order to support multiview autostereoscopic displays within the already established S3D distribution infrastructure, new views must be synthesized from S3D video. In this paper, a view synthesis method based on Image-Domain-Warping (IDW) is presented which synthesizes new views directly from S3D video and operates fully automatically. IDW relies on an automatic and robust estimation of sparse disparities and image saliency information, and enforces target disparities in synthesized images using an image warping framework. Two configurations of the view synthesizer within a transmission and view synthesis framework are analyzed and evaluated. A transmission and view synthesis system that uses IDW was recently submitted to MPEG’s call for proposals on 3D Video Technology, where it was ranked among the four best performing proposals.
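The core idea of enforcing target disparities through saliency-weighted warping can be illustrated with a highly simplified sketch. The snippet below reduces the problem to one dimension: it solves a least-squares system for per-column displacements that meet sparse disparity constraints at feature positions, while a saliency-weighted smoothness term concentrates distortion in visually unimportant regions. All function names, the 1-D reduction, and the energy weights here are illustrative assumptions, not the paper's actual 2-D warp formulation.

```python
import numpy as np

def solve_warp_1d(grid_size, constraints, saliency, smooth_weight=0.1):
    """Illustrative 1-D analogue of image-domain warping.

    constraints: list of (column_index, target_disparity) pairs from
                 sparse feature correspondences.
    saliency:    per-column weights; high saliency makes neighboring
                 displacements similar, keeping salient regions rigid.
    """
    rows, b = [], []
    # Data term: enforce the target disparity at each sparse feature column.
    for col, disp in constraints:
        r = np.zeros(grid_size)
        r[col] = 1.0
        rows.append(r)
        b.append(disp)
    # Smoothness term: saliency-weighted finite differences hide the
    # warp's distortion in low-saliency areas.
    for k in range(grid_size - 1):
        r = np.zeros(grid_size)
        w = smooth_weight * saliency[k]
        r[k], r[k + 1] = -w, w
        rows.append(r)
        b.append(0.0)
    A = np.vstack(rows)
    disp_field, *_ = np.linalg.lstsq(A, np.array(b), rcond=None)
    return disp_field

# Toy usage: two sparse disparity constraints on a 10-column grid with
# uniform saliency; the solver interpolates displacements between them.
w = solve_warp_1d(10, [(2, 1.0), (7, 3.0)], np.ones(10))
```

The least-squares structure is what makes the approach robust to sparse, noisy correspondences: a few reliable disparities suffice to pin down a smooth warp over the whole image.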