How to make simple object tracking to pick & place products with Delta X Software

Published 29 Oct 2024

COMMENTS • 17

  • @coxmichaels • 4 years ago

    Thank you for posting this!!! I was just working on trying to get this figured out last night.

  • @DeltaXRobot • 4 years ago

    I got the direction of the X-axis wrong at 0:44.
    Correct: declare the conveyor speed and direction to the camera (see the sketch after this thread).

    • @heartflame503 • 3 years ago

      Can I suggest a project for you that will make you guys go viral? If it does, I will ask you guys to make a product for me ;-)

    • @DeltaXRobot • 3 years ago

      @@heartflame503 Please email details of what you would like to discuss with me: deltaxrobot@gmail.com
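
    A minimal sketch of what declaring the conveyor speed and direction to the camera can look like on the host side, assuming a Python program; the speed value and direction vector are hypothetical and must match the real belt:

        import time

        # Hypothetical belt parameters: the conveyor moves along +X at 50 mm/s
        CONVEYOR_SPEED_MM_S = 50.0
        CONVEYOR_DIR = (1.0, 0.0)  # unit vector of belt travel in the robot frame

        def predict_pick_position(x_detected, y_detected, t_detected):
            """Shift a detected position by the belt travel since the detection time."""
            dt = time.time() - t_detected
            x = x_detected + CONVEYOR_DIR[0] * CONVEYOR_SPEED_MM_S * dt
            y = y_detected + CONVEYOR_DIR[1] * CONVEYOR_SPEED_MM_S * dt
            return x, y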

  • @romanlisiecki616 • 3 years ago

    Hi, nice video! Just one question: how does the software recognize the depth of the object? The risk is that it either crashes into the object (if the Z-axis is too low) or that the vacuum fails to catch it (if the Z-axis is too high). How do you solve this problem? Any depth detection?

  • @sebastianvalencia9834 • 8 months ago

    I want to select defective coffee beans from a sample of beans that come on a conveyor belt. Can I do it with this software?

  • @Abaddon3x7 • 3 years ago

    In the video it's sometimes failing to pick up the item. Does it know when it has failed? If not, and it was picking products, you would have boxes with missing items.

    • @DeltaXRobot • 3 years ago

      The picking failed because I still haven't calibrated the camera well. For real projects, choose a better camera, calibrate it carefully, and write a more robust program (a calibration sketch follows below).
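
      A minimal sketch of the kind of camera calibration meant here, assuming OpenCV and a printed chessboard; the 9x6 pattern size and image file names are illustrative, not part of the Delta X software:

          import glob
          import cv2
          import numpy as np

          PATTERN = (9, 6)  # inner corner count of the printed chessboard
          objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
          objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

          objpoints, imgpoints = [], []
          image_size = None
          for path in glob.glob("calib_*.jpg"):  # hypothetical calibration photos
              gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
              image_size = gray.shape[::-1]
              found, corners = cv2.findChessboardCorners(gray, PATTERN)
              if found:
                  objpoints.append(objp)
                  imgpoints.append(corners)

          # Intrinsics and distortion coefficients, later used to undistort detections
          ret, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(
              objpoints, imgpoints, image_size, None, None)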

  • @jeffinsvarghese8907 • 3 years ago

    We are doing a similar project, but instead of picking sweets we are trying to spray pesticide over weeds. For detection we are using YOLOv4; the detection is done and we are getting the output coordinates. How can we use this delta arm to spray over the detected weeds?

    • @DeltaXRobot • 3 years ago

      You can connect your embedded computer to the robot via the USB port and send G-code to control the robot.

    • @jeffinsvarghese8907 • 3 years ago

      @@DeltaXRobot Can you explain how the coordinates are converted to G-code, and how the camera calibration with respect to the ground is done?

    • @DeltaXRobot • 3 years ago

      @@jeffinsvarghese8907 For example, the camera is located at position x = 150, y = 20 relative to the robot's coordinates.
      An object is detected at x' = 20, y' = 50 relative to the camera.
      So the coordinates of the object relative to the robot are X = x + x' = 150 + 20 = 170, Y = y + y' = 20 + 50 = 70.
      The G-code command to ask the robot to go to position X = 170, Y = 70 is
      "G01 X170 Y70" (see the sketch after this reply).

  • @ericjing7532 • 4 years ago

    Hi, nice! Where did you buy the ball joint?

  • @MarcSallent • 4 years ago

    Interesting! Which robot are you using in this demo? The Delta X or Delta X 2?

    • @DeltaXRobot • 4 years ago

      It's Delta X 2

    • @NETBotic • 3 years ago

      @@DeltaXRobot Can the X1 do this if the camera is added?

  • @williamhuang5329 • 2 years ago

    Hanzhen harmonic drive gear, robot joint, over 30 years of experience