Online Depth Calibration for RGB-D Cameras using Visual SLAM

  • Published 14 Nov 2024
  • Video attachment for the paper
    Jan Quenzel, Radu Alexandru Rosu, Sebastian Houben, and Sven Behnke:
    "Online Depth Calibration for RGB-D Cameras using Visual SLAM"
    IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, Canada, September 2017.
    www.ais.uni-bon...
    Modern consumer RGB-D cameras are affordable
    and provide dense depth estimates at high frame rates. Hence,
    they are popular for building dense environment representations.
    Yet, the sensors often do not provide accurate depth
    estimates since the factory calibration exhibits a static deformation.
    We present a novel approach to online depth calibration
    that uses a visual SLAM system as reference for the measured
    depth. A sparse map is generated, and the visual information
    is used to correct the static deformation of the measured depth,
    while missing data is extrapolated using a small number of thin
    plate splines (TPS). The corrected depth can then be used to
    improve the accuracy of the sparse RGB-D map and the 3D
    environment reconstruction. As more data becomes available,
    the depth calibration is updated on the fly. Our method does
    not rely on planar geometry such as walls or a one-to-one pixel
    correspondence between the color and depth cameras. Our approach
    is evaluated in real-world scenarios and against ground-truth
    data, and compared against two popular self-calibration methods.
    Furthermore, we show a clear visual improvement on aggregated
    point clouds with our method.
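
As a rough illustration of the correction step described in the abstract, the sketch below fits a single thin plate spline over image coordinates to sparse correction factors (reference depth from SLAM divided by measured depth) and applies it to a depth image. It relies on SciPy's RBFInterpolator with a thin-plate-spline kernel and synthetic data in place of reprojected SLAM landmarks; the single-spline setup, the multiplicative correction model, and all names are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a TPS-based depth correction, NOT the paper's implementation:
# it assumes one thin plate spline over image coordinates and a multiplicative
# correction model, whereas the paper fits a small number of TPS online.
import numpy as np
from scipy.interpolate import RBFInterpolator


def fit_depth_correction(pixels_uv, depth_measured, depth_reference, smoothing=1e-3):
    """Fit a thin plate spline mapping pixel coordinates (u, v) to the
    multiplicative correction factor z_reference / z_measured."""
    factors = depth_reference / depth_measured          # per-sample correction
    return RBFInterpolator(pixels_uv, factors,
                           kernel="thin_plate_spline",
                           smoothing=smoothing)


def correct_depth(depth_image, tps):
    """Apply the fitted correction factor to every pixel of a depth image."""
    h, w = depth_image.shape
    vv, uu = np.mgrid[0:h, 0:w]
    coords = np.column_stack([uu.ravel(), vv.ravel()]).astype(float)
    return (depth_image.ravel() * tps(coords)).reshape(h, w)


# Usage with synthetic data standing in for sparse SLAM reference depths.
rng = np.random.default_rng(0)
uv = rng.uniform([0.0, 0.0], [160.0, 120.0], size=(200, 2))    # sparse sample pixels
z_meas = rng.uniform(0.5, 4.0, size=200)                       # measured depth [m]
z_ref = z_meas * (1.0 + 0.02 * np.sin(uv[:, 0] / 40.0))        # simulated static deformation
tps = fit_depth_correction(uv, z_meas, z_ref)
depth_corrected = correct_depth(np.full((120, 160), 2.0), tps)  # small frame for the demo
```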

COMMENTS • 2

  • @OneisneO 7 years ago

    Fantastic results!