Photogeometric Scene Flow for High-Detail Dynamic 3D Reconstruction

Authors

Paulo Gotardo (Disney Research Zurich)
Tomas Simon (Carnegie Mellon University)
Yaser Sheikh (Carnegie Mellon University)
Iain Matthews (Disney Research Pittsburgh)

International Conference on Computer Vision (ICCV) 2015

December 11, 2015


Photometric stereo (PS) is an established technique for high-detail reconstruction of 3D geometry and appearance. To correct for surface integration errors, PS is often combined with multiview stereo (MVS). With dynamic objects, PS reconstruction also faces the problem of computing optical flow (OF) for image alignment under rapid changes in illumination. Current PS methods typically compute optical flow and MVS as independent stages, each one with its own limitations and errors introduced by early regularization. In contrast, scene flow methods estimate geometry and motion but lack the fine detail from PS. This paper proposes photogeometric scene flow (PGSF) for high-quality dynamic 3D reconstruction. PGSF performs PS, OF, and MVS simultaneously. It is based on two key observations: (i) while image alignment improves PS, PS allows for surfaces to be relit to improve alignment; (ii) PS provides surface gradients that render the smoothness term in MVS unnecessary, leading to truly data-driven, continuous depth estimates. This synergy is demonstrated in the quality of the resulting RGB appearance, 3D geometry, and 3D motion.
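To make the PS component of the abstract concrete, the following is a minimal sketch of classical Lambertian photometric stereo, the building block that PGSF integrates with OF and MVS. It is not the paper's method: it assumes a static scene, known directional lights, and no shadows or specularities, and the function name and interface are illustrative. Under the Lambertian model, each pixel's stacked intensities satisfy I = L (ρ n), so albedo ρ and the unit normal n can be recovered per pixel by least squares.

```python
import numpy as np

def photometric_stereo(images, lights):
    """Recover per-pixel albedo and unit normals by least squares.

    images: (k, h, w) array of grayscale intensities under k lights
    lights: (k, 3) array of unit light directions (known)
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                        # (k, h*w) pixel stacks
    # Solve L @ G = I for G = rho * n at every pixel simultaneously
    G, *_ = np.linalg.lstsq(lights, I, rcond=None)   # (3, h*w)
    rho = np.linalg.norm(G, axis=0)                  # albedo = |G|
    n = G / np.maximum(rho, 1e-12)                   # unit surface normals
    return rho.reshape(h, w), n.T.reshape(h, w, 3)
```

The recovered normals give the surface gradients that, per the abstract, make an explicit smoothness term in MVS unnecessary; they also let the surface be relit under a new illumination for image alignment, which is the first synergy PGSF exploits.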

Download File "Photogeometric Scene Flow for High-Detail Dynamic 3D Reconstruction-Paper"
[pdf, 17.10 MB]

Additional files

Download additional file "Photogeometric Scene Flow for High-Detail Dynamic 3D Reconstruction-Supplementary Material"
[pdf, 7.77 MB]

Download additional file "ICCV 2015 Poster"
[pdf, 31.87 MB]

Copyright Notice

The documents contained in these directories are included by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.