Autonomous Aerial Navigation Using Monocular Visual-Inertial Fusion

  • Published 21 Aug 2024
  • A monocular visual-inertial navigation system (VINS), consisting of only an inertial measurement unit (IMU) and a camera, is the most suitable sensor suite for weight-constrained aerial platforms thanks to its light weight and small footprint. It is, in fact, the minimum sensor suite that allows autonomous flight with sufficient environmental awareness. In this work, we show that reliable online autonomous navigation is achievable using monocular visual-inertial fusion. Our system is built on a customized quadrotor testbed equipped with a fisheye camera, a low-cost IMU, and heterogeneous onboard computing resources. The backbone of our system is a highly accurate optimization-based monocular visual-inertial state estimator with online initialization and self-calibration of the camera-IMU extrinsics. An onboard GPU-based monocular dense mapping module, conditioned on the estimated poses, provides wide-angle situational awareness. Finally, an online trajectory planner that operates directly on the incrementally built 3D map guarantees safe navigation through cluttered environments. Extensive experimental results validate the individual system modules as well as overall performance in both indoor and outdoor environments.
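    To give a feel for the inertial side of such a state estimator, below is a minimal sketch of single-step IMU state propagation (position, velocity, orientation) as used in visual-inertial odometry. This is standard strapdown kinematics, not the authors' implementation; the function name, gravity convention, and the omission of IMU biases and noise terms are simplifying assumptions.

    ```python
    import numpy as np

    def propagate_imu_state(p, v, R, accel, gyro, dt,
                            g=np.array([0.0, 0.0, -9.81])):
        """One Euler step of strapdown IMU propagation (biases/noise omitted).

        p, v : world-frame position and velocity (3,)
        R    : body-to-world rotation matrix (3, 3)
        accel, gyro : body-frame specific force and angular rate (3,)
        """
        # Rotate measured specific force into the world frame and add gravity.
        a_world = R @ accel + g
        p_new = p + v * dt + 0.5 * a_world * dt**2
        v_new = v + a_world * dt

        # Integrate angular rate via the Rodrigues formula (exact for constant rate).
        theta = gyro * dt
        angle = np.linalg.norm(theta)
        if angle > 1e-12:
            ax = theta / angle
            K = np.array([[0.0, -ax[2], ax[1]],
                          [ax[2], 0.0, -ax[0]],
                          [-ax[1], ax[0], 0.0]])
            dR = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
        else:
            dR = np.eye(3)
        return p_new, v_new, R @ dR
    ```

    In a full VINS pipeline these propagated states serve as the high-rate prediction between camera frames, and the optimization-based back end corrects them using visual feature constraints (along with the biases this sketch leaves out).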
    The monocular visual-inertial state estimator used on our aerial robot is the same one we demonstrated for mobile AR ( • [Open Source] VINS-Mob... ). Source code is available on GitHub: github.com/HKU...
