VGN: Real-Time 6 DOF Grasp Detection in Clutter

  • Published 4 Jan 2021
  • Supplementary video of our CoRL 2020 submission, "Volumetric Grasping Network: Real-Time 6 DOF Grasp Detection in Clutter".
    arXiv: arxiv.org/abs/2101.01132
    GitHub: github.com/ethz-asl/vgn
    Abstract - General robot grasping in clutter requires the ability to synthesize grasps that work for previously unseen objects and that are also robust to physical interactions, such as collisions with other objects in the scene. In this work, we design and train a network that predicts 6 DOF grasps from 3D scene information gathered from an on-board sensor such as a wrist-mounted depth camera. Our proposed Volumetric Grasping Network (VGN) accepts a Truncated Signed Distance Function (TSDF) representation of the scene and directly outputs the predicted grasp quality and the associated gripper orientation and opening width for each voxel in the queried 3D volume. We show that our approach can plan grasps in only 10 ms and is able to clear 92% of the objects in real-world clutter removal experiments without the need for explicit collision checking. The real-time capability opens up the possibility for closed-loop grasp planning, allowing robots to handle disturbances, recover from errors and provide increased robustness.
  • Science & Technology
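
The abstract above describes a fully-convolutional mapping from a TSDF volume to per-voxel grasp quality, gripper orientation and opening width. Below is a minimal PyTorch sketch of that idea; the layer sizes, the 40³ grid resolution and the `VoxelGraspNet` name are illustrative assumptions, not the authors' implementation (see the linked GitHub repository for the actual code).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class VoxelGraspNet(nn.Module):
    """Illustrative sketch of a VGN-style per-voxel grasp predictor.

    Input:  TSDF voxel grid of shape (B, 1, 40, 40, 40)  # resolution assumed
    Output: grasp quality, rotation (quaternion), and opening width per voxel.
    Layer sizes are placeholders, not the published architecture.
    """

    def __init__(self):
        super().__init__()
        # Padding keeps the spatial resolution so every voxel gets a prediction.
        self.backbone = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.quality_head = nn.Conv3d(32, 1, kernel_size=1)   # grasp quality per voxel
        self.rotation_head = nn.Conv3d(32, 4, kernel_size=1)  # quaternion per voxel
        self.width_head = nn.Conv3d(32, 1, kernel_size=1)     # opening width per voxel

    def forward(self, tsdf):
        feat = self.backbone(tsdf)
        quality = torch.sigmoid(self.quality_head(feat))
        rotation = F.normalize(self.rotation_head(feat), dim=1)  # unit quaternions
        width = self.width_head(feat)
        return quality, rotation, width


if __name__ == "__main__":
    net = VoxelGraspNet()
    tsdf = torch.rand(1, 1, 40, 40, 40)  # dummy TSDF volume
    quality, rotation, width = net(tsdf)
    print(quality.shape, rotation.shape, width.shape)
```

In this kind of design, a grasp pose is read out by taking voxels whose predicted quality exceeds a threshold and combining the voxel center with the predicted orientation and width, which is what makes single-pass, real-time planning over the whole workspace possible.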

COMMENTS • 2

  • 3 years ago

    Thank you for sharing.

  • @wizardOfRobots 3 years ago

    Very cool. Moving objects would likely need a different approach.