NanoSLAM: Enabling Fully Onboard SLAM for Tiny Robots

  • Published Dec 20, 2024

COMMENTS • 5

  • @krishnapranav9123 · 1 month ago

    This is very insightful, thank you.

  • @RC_Ira · 1 year ago · +1

    Very interesting video, awesome work!🤩👍

  • @Flare1107 · 10 months ago · +1

    Are the loop-closure scan and reference scan required because of a lack of precision in the odometry, or is this typical of all robotics? I know there are plenty of issues with real-world vs. simulated location/trajectory in quadrupeds. If a more sensitive ToF sensor were used, maybe an RGB-D camera, would you still lack precision? Or could you include redundant positioning with a secondary IMU?

  • @vigneshbalaji21 · 1 year ago · +1

    Very nice :) using a graph-based approach to take care of visual odometry drift. I have a doubt: can the onboard IMU be used to take care of this drift? The main issue is that for this graph to produce SLAM, it needs a closed graph path. Could an IMU make a difference, maybe with a Kalman filter?

  • @kunaldesarda1095 · 1 year ago

    Hey, compliments on the research work.
    Just one question: since you are using plain cardboard boxes and there apparently isn't much structure in the maze, how is the drone able to localize?
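
Two of the comments above ask why the loop-closure scan is needed given the odometry, and why graph-based SLAM needs a closed path. The short answer is that odometry error accumulates without bound, while a loop-closure constraint ties a late pose directly to an early one, letting the optimizer redistribute the drift across the whole trajectory. Below is a minimal 1D pose-graph sketch in plain NumPy that illustrates this; it is not the NanoSLAM implementation, and all measurements, weights, and the 10% drift figure are made up for illustration.

```python
# Minimal 1D pose-graph sketch (illustrative only, not the NanoSLAM code).
# Poses x0..x3 lie on a line; odometry edges drift by 10%, and one
# loop-closure edge ties x3 back to x0. Solving the weighted least-squares
# problem spreads the accumulated odometry error over the trajectory.
import numpy as np

# Each constraint: (i, j, measured displacement x_j - x_i, information weight)
edges = [
    (0, 1, 1.1, 1.0),    # odometry, true displacement is 1.0 (10% drift)
    (1, 2, 1.1, 1.0),    # odometry
    (2, 3, 1.1, 1.0),    # odometry
    (3, 0, -3.0, 10.0),  # loop closure: scan match says x3 is 3.0 ahead of x0
]

n = 4
# Linear least squares: minimize sum w * ((x_j - x_i) - z)^2, with x0 pinned.
A = np.zeros((len(edges) + 1, n))
b = np.zeros(len(edges) + 1)
for k, (i, j, z, w) in enumerate(edges):
    sw = np.sqrt(w)
    A[k, i], A[k, j], b[k] = -sw, sw, sw * z
A[-1, 0], b[-1] = 1e6, 0.0  # strong prior fixing x0 at the origin

x, *_ = np.linalg.lstsq(A, b, rcond=None)
print("dead-reckoned poses:", np.cumsum([0, 1.1, 1.1, 1.1]))  # [0, 1.1, 2.2, 3.3]
print("optimized poses:    ", x.round(3))                     # ~[0, 1.0, 2.0, 3.0]
```

Without the loop-closure edge the system is exactly the integrated odometry and the 10% drift persists; adding that single extra constraint pulls every pose back toward the truth. This is also why the graph needs a closed path: an IMU fused through a Kalman filter can slow the drift growth, but as a purely incremental sensor it cannot remove accumulated error the way a loop-closure constraint does.
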