Dense Wide-Baseline Scene Flow From Two Handheld Video Cameras
Christian Richardt1,2,3 Hyeongwoo Kim1 Levi Valgaerts1 Christian Theobalt1
1 MPI Informatik 2 Intel Visual Computing Institute 3 University of Bath
International Conference on 3D Vision (3DV) 2016
Abstract
We propose a new technique for computing dense scene flow from two handheld videos with wide camera baselines and different photometric properties due to different sensors or camera settings like exposure and white balance. Our technique innovates over existing methods in two ways: (1) it supports independently moving cameras, and (2) it computes dense scene flow for wide-baseline scenarios. We achieve this by combining state-of-the-art wide-baseline correspondence finding with a variational scene flow formulation. First, we compute dense, wide-baseline correspondences using DAISY descriptors for matching between cameras and over time. We then detect and replace occluded pixels in the correspondence fields using a novel edge-preserving Laplacian correspondence completion technique. Finally, we refine the computed correspondence fields in a variational scene flow formulation. We show dense scene flow results computed from challenging datasets captured with independently moving handheld cameras with varying camera settings.
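The Laplacian correspondence completion step can be illustrated with a minimal sketch: occluded pixels of a correspondence field are filled by solving a discrete Laplace equation, with the visible pixels acting as Dirichlet boundary data. The `complete_field` function below is a hypothetical illustration of this diffusion-based hole filling, not the paper's exact formulation; the optional per-pixel `weights` hint at how edge-preserving behavior could be obtained (small weights across image edges), but the precise weighting used in the paper differs.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def complete_field(field, occluded, weights=None):
    """Fill occluded pixels of one channel of a correspondence field.

    field    : (H, W) float array, e.g. horizontal displacement.
    occluded : (H, W) bool mask of pixels to replace.
    weights  : optional (H, W) smoothness weights; choosing them small
               across image edges would make the completion edge-preserving.
    """
    h, w = field.shape
    holes = np.argwhere(occluded)          # coordinates of unknown pixels
    n = len(holes)
    index = -np.ones((h, w), dtype=np.int64)
    index[occluded] = np.arange(n)         # unknown-pixel numbering

    A = sp.lil_matrix((n, n))
    b = np.zeros(n)
    for k, (y, x) in enumerate(holes):
        # 4-connected discrete Laplacian row for unknown pixel k.
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if not (0 <= ny < h and 0 <= nx < w):
                continue
            wgt = 1.0 if weights is None else 0.5 * (weights[y, x] + weights[ny, nx])
            A[k, k] += wgt
            j = index[ny, nx]
            if j >= 0:
                A[k, j] -= wgt             # neighbor is also unknown
            else:
                b[k] += wgt * field[ny, nx]  # known neighbor: Dirichlet data

    result = field.copy()
    result[occluded] = spsolve(A.tocsr(), b)
    return result
```

Because a linear displacement field is harmonic, filling a hole in such a field with this solver reproduces the original values exactly, which makes the behavior easy to sanity-check.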
Downloads
- Paper preprint (PDF, 9 MB)
- Supplemental document (PDF, 170 KB)
- Supplemental video (MP4, 198 MB)
- Presentation slides (PowerPoint, 163 MB)
- This paper on arXiv (arXiv:1609.05115)
Bibtex
@inproceedings{WideBaselineSceneFlow,
  author    = {Christian Richardt and Hyeongwoo Kim and Levi Valgaerts and Christian Theobalt},
  title     = {Dense Wide-Baseline Scene Flow From Two Handheld Video Cameras},
  booktitle = {International Conference on 3D Vision (3DV)},
  year      = {2016},
  month     = {October},
  pages     = {276--285},
  doi       = {10.1109/3DV.2016.36},
  url       = {http://richardt.name/publications/wide-baseline-scene-flow/},
}