Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax

Bicheng Luo1    Feng Xu1    Christian Richardt2    Jun-Hai Yong1

1 Tsinghua University       2 University of Bath

IEEE Transactions on Visualization and Computer Graphics (IEEE VR 2018)


Abstract

We propose a novel 360° scene representation for converting real scenes into stereoscopic 3D virtual reality content with head-motion parallax. Our image-based scene representation enables efficient synthesis of novel views with six degrees-of-freedom (6-DoF) by fusing motion fields at two scales: (1) disparity motion fields carry implicit depth information and are robustly estimated from multiple laterally displaced auxiliary viewpoints, and (2) pairwise motion fields enable real-time flow-based blending, which improves the visual fidelity of results by minimizing ghosting and view transition artifacts. Based on our scene representation, we present an end-to-end system that captures real scenes with a robotic camera arm, processes the recorded data, and finally renders the scene in a head-mounted display in real time (more than 40 Hz). Our approach is the first to support head-motion parallax when viewing real 360° scenes. We demonstrate compelling results that illustrate the enhanced visual experience – and hence sense of immersion – achieved with our approach compared to widely-used stereoscopic panoramas.
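To make the flow-based blending idea concrete, the sketch below shows the general technique in NumPy: given two nearby captured views and the pairwise motion (flow) field between them, each image is backward-warped toward an intermediate viewpoint by a scaled flow and the two warps are blended, which suppresses the ghosting that direct cross-fading would produce. This is a minimal illustration of the standard technique, not the paper's implementation: the function names, the grayscale-image simplification, and the linear flow-scaling approximation (flow from the target view ≈ −t · flow between the inputs) are assumptions for the example.

```python
import numpy as np

def backward_warp(img, flow):
    """Bilinearly sample a grayscale image (H, W) at grid positions
    displaced by a per-pixel flow field (H, W, 2) holding (dy, dx)."""
    h, w = img.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    sy = np.clip(yy + flow[..., 0], 0.0, h - 1.0)   # clamp samples to the image
    sx = np.clip(xx + flow[..., 1], 0.0, w - 1.0)
    y0, x0 = np.floor(sy).astype(int), np.floor(sx).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = sy - y0, sx - x0                       # bilinear weights
    top = (1 - wx) * img[y0, x0] + wx * img[y0, x1]
    bot = (1 - wx) * img[y1, x0] + wx * img[y1, x1]
    return (1 - wy) * top + wy * bot

def flow_blend(img_a, img_b, flow_ab, flow_ba, t):
    """Synthesize the view a fraction t in [0, 1] of the way from A to B.

    flow_ab maps pixels of A to B; flow_ba maps pixels of B to A.
    Assumed approximation: flow from the target view back to A is
    -t * flow_ab, and back to B is -(1 - t) * flow_ba."""
    warped_a = backward_warp(img_a, -t * flow_ab)
    warped_b = backward_warp(img_b, -(1.0 - t) * flow_ba)
    return (1.0 - t) * warped_a + t * warped_b
```

For a pure horizontal shift of 2 pixels between the two views, `flow_blend(a, b, flow_ab, flow_ba, 0.5)` reproduces the view shifted by 1 pixel, whereas naive averaging of the unwarped images would double-expose the content.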

Bibtex

@article{Parallax360,
  author    = {Bicheng Luo and Feng Xu and Christian Richardt and Jun-Hai Yong},
  title     = {Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax},
  journal   = {IEEE Transactions on Visualization and Computer Graphics},
  year      = {2018},
  volume    = {24},
  number    = {4},
  month     = apr,
  doi       = {10.1109/TVCG.2018.2794071},
  url       = {http://richardt.name/publications/parallax360/},
}