4D Imaging through Spray-On Optics

Julian Iseringhausen, Bastian Goldlücke, Nina Pesheva, Stanimir Iliev, Alexander Wender, Martin Fuchs, and Matthias B. Hullin
In: ACM Trans. Graph. 36(4) (Proc. SIGGRAPH 2017), 2017

Abstract

Light fields are a powerful concept in computational imaging and a mainstay in image-based rendering; however, their acquisition has so far required either carefully designed and calibrated optical systems (micro-lens arrays) or multi-camera/multi-shot settings. Here, we show that fully calibrated light field data can be obtained from a single ordinary photograph taken through a partially wetted window. Each drop of water produces a distorted view of the scene, and recovering the unknown mapping from pixel coordinates to refracted rays in space is a severely underconstrained problem. The key idea behind our solution is to combine ray tracing and low-level image analysis techniques (extraction of 2D drop contours and locations of scene features seen through drops) with state-of-the-art drop shape simulation and an iterative refinement scheme that enforces photo-consistency across features seen in multiple views. This approach recovers not only a dense pixel-to-ray mapping, but also, to high accuracy, the refractive geometry through which the scene is observed. We therefore anticipate that our inherently self-calibrating scheme might also find applications in other fields, for instance in materials science, where the wetting properties of liquids on surfaces are investigated.
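The pixel-to-ray mapping described above rests on tracing camera rays as they refract at the surface of each water drop. As a toy illustration only (not the paper's implementation), the basic refraction step can be sketched with the vector form of Snell's law, assuming an air-to-water refractive index ratio of roughly 1/1.33:

```python
import math

def refract(d, n, eta):
    """Refract unit direction d at a surface with unit normal n.

    eta is the ratio n_incident / n_transmitted, e.g. 1.0 / 1.33 for a ray
    entering water from air. Returns the refracted unit direction, or None
    if total internal reflection occurs. Vector form of Snell's law.
    """
    cos_i = -sum(di * ni for di, ni in zip(d, n))   # cosine of incidence angle
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)      # squared sine of transmission angle
    if sin2_t > 1.0:
        return None                                 # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    k = eta * cos_i - cos_t
    return tuple(eta * di + k * ni for di, ni in zip(d, n))

# At normal incidence the ray passes through undeviated.
print(refract((0.0, 0.0, -1.0), (0.0, 0.0, 1.0), 1.0 / 1.33))
```

In the paper's setting, the surface normals come from the simulated drop shapes, and the refinement loop adjusts those shapes until rays traced through different drops agree on the same scene features.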

Download Paper

Bibtex

@ARTICLE{IseringhausenSIG2017,
    author = {Iseringhausen, Julian and Goldl{\"u}cke, Bastian and Pesheva, Nina and Iliev, Stanimir and Wender,
              Alexander and Fuchs, Martin and Hullin, Matthias B.},
     title = {4D Imaging through Spray-On Optics},
   journal = {ACM Trans. Graph. (Proc. SIGGRAPH 2017)},
    volume = {36},
    number = {4},
      year = {2017},
  abstract = {Light fields are a powerful concept in computational imaging and a mainstay in image-based
              rendering; however, their acquisition has so far required either carefully designed and
              calibrated optical systems (micro-lens arrays) or multi-camera/multi-shot settings. Here, we
              show that fully calibrated light field data can be obtained from a single ordinary photograph
              taken through a partially wetted window. Each drop of water produces a distorted view of the
              scene, and recovering the unknown mapping from pixel coordinates to refracted rays in space is
              a severely underconstrained problem. The key idea behind our solution is to combine ray tracing
              and low-level image analysis techniques (extraction of 2D drop contours and locations of scene
              features seen through drops) with state-of-the-art drop shape simulation and an iterative
              refinement scheme that enforces photo-consistency across features seen in multiple views. This
              approach recovers not only a dense pixel-to-ray mapping, but also, to high accuracy, the
              refractive geometry through which the scene is observed. We therefore anticipate that our
              inherently self-calibrating scheme might also find applications in other fields, for instance
              in materials science, where the wetting properties of liquids on surfaces are investigated.},
       doi = {10.1145/3072959.3073589}
}