A VR System for Immersive Teleoperation and Live Exploration with a Mobile Robot

Patrick Stotko, Stefan Krumpen, Max Schwarz, Christian Lenz, Sven Behnke, Reinhard Klein, and Michael Weinmann
In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 3630–3637, Nov. 2019
 

Abstract

Applications like disaster management and industrial inspection often require experts to enter contaminated places. To circumvent the need for physical presence, it is desirable to generate a fully immersive, individual live teleoperation experience. However, standard video-based approaches suffer from a limited degree of immersion and situation awareness due to the restriction to the camera view, which impairs navigation. In this paper, we present a novel, practical VR-based system for immersive robot teleoperation and scene exploration. While being operated through the scene, a robot captures RGB-D data that is streamed to a SLAM-based live multi-client telepresence system. There, a global 3D model of the already captured scene parts is reconstructed and streamed to the individual remote user clients, where the rendering for, e.g., head-mounted displays (HMDs) is performed. We introduce a novel lightweight robot client component that transmits robot-specific data and enables quick integration into existing robotic systems. In contrast to first-person exploration systems, operators can thus explore and navigate the remote site completely independently of the current position and view of the capturing robot, complementing traditional input devices for teleoperation. We provide a proof-of-concept implementation and demonstrate the capabilities and performance of our system regarding interactive object measurements and bandwidth-efficient data streaming and visualization. Furthermore, a user study shows its benefits over purely video-based teleoperation, revealing a higher degree of situation awareness and more precise navigation in challenging environments.
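
The pipeline described above hinges on a lightweight robot client that forwards RGB-D frames and robot-specific data (such as the current camera pose) to the SLAM-based telepresence server. This page does not spell out that interface, so the following is only a minimal, hypothetical sketch in Python: the class name RobotStreamClient, the length-prefixed TCP protocol, and the frame layout are illustrative assumptions, not the system's actual API.

    import socket
    import struct

    import numpy as np


    class RobotStreamClient:
        """Hypothetical lightweight robot client: packs RGB-D frames and the
        robot's camera pose into length-prefixed messages and streams them
        over TCP to a reconstruction/telepresence server."""

        def __init__(self, host: str, port: int):
            self.sock = socket.create_connection((host, port))

        def send_frame(self, color: np.ndarray, depth: np.ndarray,
                       pose: np.ndarray) -> None:
            """color: HxWx3 uint8, depth: HxW uint16 (millimeters),
            pose: 4x4 float32 camera-to-world transform."""
            assert color.dtype == np.uint8 and color.ndim == 3
            assert depth.dtype == np.uint16 and depth.ndim == 2
            assert pose.shape == (4, 4)

            # Per-frame header: color height/width, depth height/width.
            header = struct.pack("<HHHH", color.shape[0], color.shape[1],
                                 depth.shape[0], depth.shape[1])
            payload = (header
                       + pose.astype(np.float32).tobytes()
                       + color.tobytes()
                       + depth.tobytes())
            # The length prefix lets the server split the byte stream back into frames.
            self.sock.sendall(struct.pack("<I", len(payload)) + payload)

        def close(self) -> None:
            self.sock.close()


    if __name__ == "__main__":
        # Demo with synthetic data; a real robot would pass sensor output
        # and the SLAM-estimated pose instead.
        client = RobotStreamClient("localhost", 9000)
        color = np.zeros((480, 640, 3), dtype=np.uint8)
        depth = np.full((480, 640), 1500, dtype=np.uint16)  # 1.5 m everywhere
        pose = np.eye(4, dtype=np.float32)
        client.send_frame(color, depth, pose)
        client.close()

The real system additionally has to keep the stream bandwidth-efficient (e.g. by compressing color and depth data), which the uncompressed layout above deliberately ignores for clarity.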

Download Paper

Additional Material

  • Video (MPEG-4 video, 9.4 MB)

Bibtex

@INPROCEEDINGS{stotko2019teleoperation,
     author = {Stotko, Patrick and Krumpen, Stefan and Schwarz, Max and Lenz, Christian and Behnke, Sven and Klein,
               Reinhard and Weinmann, Michael},
      pages = {3630--3637},
      title = {A VR System for Immersive Teleoperation and Live Exploration with a Mobile Robot},
  booktitle = {IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
       year = {2019},
      month = nov,
   abstract = {Applications like disaster management and industrial inspection often require experts to enter
                contaminated places. To circumvent the need for physical presence, it is desirable to generate a
                fully immersive, individual live teleoperation experience. However, standard video-based approaches
                suffer from a limited degree of immersion and situation awareness due to the restriction to the
                camera view, which impairs navigation. In this paper, we present a novel, practical VR-based system
                for immersive robot teleoperation and scene exploration. While being operated through the scene, a
                robot captures RGB-D data that is streamed to a SLAM-based live multi-client telepresence system.
                There, a global 3D model of the already captured scene parts is reconstructed and streamed to the
                individual remote user clients, where the rendering for, e.g., head-mounted displays (HMDs) is
                performed. We introduce a novel lightweight robot client component that transmits robot-specific data
                and enables quick integration into existing robotic systems. In contrast to first-person exploration
                systems, operators can thus explore and navigate the remote site completely independently of the
                current position and view of the capturing robot, complementing traditional input devices for
                teleoperation. We provide a proof-of-concept implementation and demonstrate the capabilities and
                performance of our system regarding interactive object measurements and bandwidth-efficient data
                streaming and visualization. Furthermore, a user study shows its benefits over purely video-based
                teleoperation, revealing a higher degree of situation awareness and more precise navigation in
                challenging environments.},
        doi = {10.1109/IROS40897.2019.8968598}
}