YOLOv8 OBB Training and ROS Implementation

  • Published Jan 10, 2025

COMMENTS •

  • @MohamedAssanhaji 4 months ago +4

    God bless you, king! I'm going to implement this for my visual servoing application with a Universal Robots UR3. I have implemented most of your tutorials. Any advice for implementing this one (how can I send the extracted position to the robot arm via Ethernet, using Python and YOLOv8 OBB + ROS)?

    • @robotmania8896 4 months ago

      Hi Mohamed Assanhaji!
      Thanks for watching my video!
      If you would like to send information over Ethernet, you can use socket communication.
      Regarding the Ethernet communication, the code for this tutorial will help you.
      ua-cam.com/video/40p2avodFV0/v-deo.html
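      A minimal sketch of the socket approach mentioned above, assuming the UR3 side listens on a plain TCP port; the host, port, and JSON pose format are hypothetical placeholders, not part of the tutorial.

      import json
      import socket

      # Hypothetical address of the machine controlling the UR3; adjust to your network.
      ROBOT_HOST = "192.168.0.10"
      ROBOT_PORT = 50007

      def send_pose(x, y, z, yaw):
          """Send an extracted object pose to the robot side over TCP."""
          pose = {"x": x, "y": y, "z": z, "yaw": yaw}
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
              sock.connect((ROBOT_HOST, ROBOT_PORT))
              # One JSON object per line; the receiver splits on newlines.
              sock.sendall((json.dumps(pose) + "\n").encode("utf-8"))

      # Example: center and rotation of a detected oriented bounding box.
      send_pose(0.42, -0.13, 0.05, 1.57)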

  • @Allmost40 8 hours ago

    How do we know where the top part of the bolt is? If we don't know it, the robot will make a mistake when placing it.
    Can you please make a tutorial about it: finding the top part so the robot picks the bolt by that part?

  • @iamshakeelsindho 2 days ago

    Hi, can you please create a video using computer vision (implementing different models, like YOLOv8, etc.) via IsaacROS/ROS2 inside the IsaacSim environment?
    Thanks

    • @robotmania8896 1 day ago

      Thank you for the suggestion! I will consider it!

  • @petersobotta3601 4 months ago

    Awesome stuff! This is exactly what I need for my robot. Will it work on a Jetson Xavier using exactly the same steps?

    • @robotmania8896 4 months ago +2

      Hi Peter Sobotta!
      Thanks for watching my video!
      Yes, it should work.

  • @FrancyLlamado 4 months ago

    And may I ask why you did not use Roboflow for annotating your dataset? Is using labelImg for annotation faster?

    • @robotmania8896 4 months ago +1

      You may use whatever tool is convenient for you. I prefer tools that do not require a login or an Internet connection.

  • @ngkean9743 3 months ago

    On a serious note: after detecting a certain object (e.g., a door handle) with an RGBD camera, can you pinpoint the location of the handle, like X, Y, and Z? Maybe demonstrate it in a video.

    • @robotmania8896 3 months ago +1

      Yes, it is possible. I actually have several videos describing this technique. Here are a few of them (a sketch of the idea follows after this thread).
      ua-cam.com/video/--81OoXMvlw/v-deo.html
      ua-cam.com/video/oKaLyow7hWU/v-deo.html

    • @ngkean9743 3 months ago

      Wow... thank you, thank you, thank you!
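      A minimal sketch of the technique referenced above, assuming a pinhole RGBD camera with known intrinsics; the focal lengths and principal point below are hypothetical values that should come from your camera's calibration.

      import numpy as np

      # Hypothetical pinhole intrinsics (pixels); take these from the camera's calibration.
      FX, FY = 615.0, 615.0
      CX, CY = 320.0, 240.0

      def pixel_to_xyz(u, v, depth_m):
          """Back-project pixel (u, v) with its depth (meters) into camera-frame XYZ."""
          x = (u - CX) * depth_m / FX
          y = (v - CY) * depth_m / FY
          return np.array([x, y, depth_m])

      # Example: detection centered at pixel (350, 260), aligned depth reads 0.8 m.
      print(pixel_to_xyz(350, 260, 0.8))  # [X, Y, Z] in the camera's optical frame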

  • @Myfcollbodybuilding 4 months ago

    Is there a way to measure the speed of a tracked moving object by using DeepStream?

    • @robotmania8896 4 months ago

      Hi Yang Liu!
      Thanks for watching my video!
      If you know the real coordinates of the object you are tracking, it is easy to calculate the velocity (see the sketch after this thread). But I don't think there is a module in DeepStream to do that. Also, to do this, you have to use an RGBD camera.

    • @Myfcollbodybuilding 4 months ago

      @robotmania8896 Thanks
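      A minimal sketch of the velocity calculation described above, assuming you already have timestamped 3D positions of the tracked object (e.g., from an RGBD pipeline); finite differencing between consecutive positions gives the speed.

      import numpy as np

      def speed_mps(p_prev, p_curr, t_prev, t_curr):
          """Estimate speed (m/s) from two timestamped 3D positions in meters."""
          dt = t_curr - t_prev
          if dt <= 0:
              raise ValueError("timestamps must be strictly increasing")
          return float(np.linalg.norm(np.asarray(p_curr) - np.asarray(p_prev)) / dt)

      # Example: the object moved 10 cm along X in 0.1 s -> 1.0 m/s.
      print(speed_mps([0.40, 0.00, 0.80], [0.50, 0.00, 0.80], 0.0, 0.1))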

  • @FrancyLlamado 4 months ago

    Is there a way to generate synthetic data for YOLOv8 OBB?

    • @robotmania8896 4 months ago

      Hi Francy Llamado!
      Thanks for watching my video!
      I have never done it, but you may use DALL·E to generate a dataset.
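      Beyond generative models, a common way to synthesize OBB training data is copy-paste augmentation: paste a rotated object crop onto a background and emit the matching YOLOv8 OBB label (class index plus four normalized corner points). A minimal sketch, assuming a hypothetical RGBA object crop "bolt.png" and background "bg.jpg"; it assumes the crop fits inside the background, and a strongly rotated crop may clip at its canvas edges.

      import cv2
      import numpy as np

      def paste_rotated(bg_path, obj_path, cx, cy, angle_deg, out_img, out_lbl, cls=0):
          """Paste a rotated RGBA crop onto a background and write a YOLOv8 OBB label."""
          bg = cv2.imread(bg_path)
          obj = cv2.imread(obj_path, cv2.IMREAD_UNCHANGED)  # BGRA crop
          h, w = obj.shape[:2]

          # Rotate the crop (color and alpha together) around its center.
          M = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
          rot = cv2.warpAffine(obj, M, (w, h))

          # Alpha-blend the rotated crop onto the background centered at (cx, cy).
          x0, y0 = int(cx - w / 2), int(cy - h / 2)
          roi = bg[y0:y0 + h, x0:x0 + w]
          alpha = rot[:, :, 3:4] / 255.0
          roi[:] = (alpha * rot[:, :, :3] + (1 - alpha) * roi).astype(np.uint8)

          # Crop corners, transformed by the same rotation, shifted to the paste
          # position, then normalized to the background's size.
          corners = np.array([[0, 0], [w, 0], [w, h], [0, h]], dtype=np.float32)
          ones = np.ones((4, 1), dtype=np.float32)
          rot_c = (np.hstack([corners, ones]) @ M.T) + np.array([x0, y0])
          H, W = bg.shape[:2]
          norm = (rot_c / np.array([W, H])).flatten()

          cv2.imwrite(out_img, bg)
          with open(out_lbl, "w") as f:
              # YOLOv8 OBB label format: class x1 y1 x2 y2 x3 y3 x4 y4 (normalized)
              f.write(f"{cls} " + " ".join(f"{v:.6f}" for v in norm) + "\n")

      paste_rotated("bg.jpg", "bolt.png", cx=320, cy=240, angle_deg=35,
                    out_img="synthetic_000.jpg", out_lbl="synthetic_000.txt")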

  • @ngkean9743 3 months ago

    Are... are you... THE CHOSEN ONE!? I WILL SUBSCRIBE TO YOU, PLEASE SHOW MORE OF YOUR WISDOM

    • @robotmania8896 3 months ago +1

      Hi Ng Kean!
      Thanks for watching my video!
      Yeah, I will try not to fall to the dark side...