Real-time Multi-material Reflectance Reconstruction for Large-scale Scenes under Uncontrolled Illumination from RGB-D Image Sequences

In Proceedings of the International Conference on 3D Vision (3DV), IEEE, pages 709–718, 2019
 

Abstract

Real-time reflectance reconstruction under uncontrolled illumination conditions is well known to be a challenging task due to the complex interplay of scene geometry, surface reflectance and illumination. Nonetheless, recent works succeed in recovering both unknown reflectance and illumination in an uncontrolled setting. However, they are either limited with regard to scene complexity (single objects / homogeneous materials) or not suitable for real-time applications. Our proposed method enables the recovery of heterogeneous surface reflectance (multiple objects and spatially varying materials) in complex scenes at real-time frame rates. We achieve this goal in the following way: First, we perform a 3D scene reconstruction from an input RGB-D stream in real time. We then use a deep-learning-based method to estimate Ward BRDF parameters from observations gathered from individual segmented scene objects. Subsequently, we refine these reflectance parameters to allow for spatial variations across the object surfaces. We evaluate our method on synthetic scenes and successfully apply it to real-world data.
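The abstract names the Ward BRDF as the reflectance model whose parameters (diffuse albedo, specular albedo and roughness) are first estimated per segmented object and then refined spatially. For reference, the sketch below evaluates the standard isotropic Ward model with NumPy. It is a minimal background illustration of that reflectance model only, not code from the paper; the abstract does not state whether the isotropic or anisotropic variant is used, and the authors' exact parameterization may differ.

import numpy as np

def ward_isotropic(n, l, v, rho_d, rho_s, alpha):
    """Evaluate the standard isotropic Ward BRDF.

    n, l, v : unit normal, light direction and view direction (pointing away
              from the surface point)
    rho_d   : diffuse albedo (scalar or per-channel array)
    rho_s   : specular albedo
    alpha   : surface roughness
    """
    cos_i = max(float(np.dot(n, l)), 1e-6)   # clamp grazing angles to avoid division by zero
    cos_o = max(float(np.dot(n, v)), 1e-6)
    h = l + v
    h = h / np.linalg.norm(h)                # half-way vector between light and view
    cos_h = min(max(float(np.dot(n, h)), 1e-6), 1.0)
    tan2_h = (1.0 - cos_h * cos_h) / (cos_h * cos_h)

    diffuse = rho_d / np.pi
    specular = rho_s * np.exp(-tan2_h / (alpha * alpha)) / (
        4.0 * np.pi * alpha * alpha * np.sqrt(cos_i * cos_o))
    return diffuse + specular

# Hypothetical usage with made-up parameter values:
n = np.array([0.0, 0.0, 1.0])
l = np.array([0.0, 0.5, 1.0]); l /= np.linalg.norm(l)
v = np.array([0.3, 0.0, 1.0]); v /= np.linalg.norm(v)
f = ward_isotropic(n, l, v, rho_d=np.array([0.6, 0.4, 0.3]), rho_s=0.2, alpha=0.1)

In the pipeline described above, rho_d, rho_s and alpha would correspond to the quantities predicted per object by the network and subsequently refined per surface point to capture spatially varying materials.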

Images

Download paper

Supplementary material

BibTeX

@INPROCEEDINGS{bode2019reflectance,
        author = {Bode, Lukas and Merzbach, Sebastian and Stotko, Patrick and Weinmann, Michael and Klein, Reinhard},
         pages = {709--718},
         title = {Real-time Multi-material Reflectance Reconstruction for Large-scale Scenes under Uncontrolled
                  Illumination from RGB-D Image Sequences},
     booktitle = {International Conference on 3D Vision (3DV)},
          year = {2019},
  organization = {IEEE},
      abstract = {Real-time reflectance reconstruction under uncontrolled illumination conditions is well known to
                   be a challenging task due to the complex interplay of scene geometry, surface reflectance and
                   illumination. Nonetheless, recent works succeed in recovering both unknown reflectance and
                   illumination in an uncontrolled setting. However, they are either limited with regard to scene
                   complexity (single objects / homogeneous materials) or not suitable for real-time applications.
                   Our proposed method enables the recovery of heterogeneous surface reflectance (multiple objects
                   and spatially varying materials) in complex scenes at real-time frame rates. We achieve this goal
                   in the following way: First, we perform a 3D scene reconstruction from an input RGB-D stream in
                   real time. We then use a deep-learning-based method to estimate Ward BRDF parameters from
                   observations gathered from individual segmented scene objects. Subsequently, we refine these
                   reflectance parameters to allow for spatial variations across the object surfaces. We evaluate
                   our method on synthetic scenes and successfully apply it to real-world data.},
           doi = {10.1109/3DV.2019.00083}
}