Autonomous Grape Harvesting Robot Development - BCIT Capstone Research Project

  • Published 15 Nov 2024

COMMENTS • 25

  • @andyduhamel1925 • 1 year ago

    Just happened across this short, intriguing robot for grape harvesting. I would like to know more about its development path and when it will be commercially available; hiring seasonal staff is becoming increasingly difficult.

  • @WasimAkram-xe1yi • 21 days ago

    I would like to know some more information about this project.

  • @KyranFindlater • 5 years ago +1

    Very cool, good work. Looks like YOLOv2 worked well for detecting grape clusters. What happens if the grape clusters are non-uniform in size or shape, or occluded by vine leaves?

    • @terrycalderbank7925 • 5 years ago +2

      We had so many things to juggle that we didn't get to the occluded clusters, but the stem should always be at the top, aligned with the center of mass, no matter the size and shape. That's what we're targeting (sketched below). I think the next plan was to start training to find specific parts of the vine rather than using just the cluster for indexing.
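
      A minimal Python sketch of that stem-targeting idea (assumptions: the detector yields a binary mask per cluster, and the function name and the "stem sits just above the topmost cluster pixel" shortcut are illustrative, not the team's code):

          import cv2
          import numpy as np

          def stem_target(cluster_mask):
              """Return an (x, y) pixel target for the stem of one
              detected grape cluster, given its binary mask."""
              m = cv2.moments(cluster_mask, binaryImage=True)
              if m["m00"] == 0:
                  raise ValueError("empty cluster mask")
              cx = int(m["m10"] / m["m00"])   # x of the center of mass
              ys, _ = np.nonzero(cluster_mask)
              top_y = int(ys.min())           # topmost cluster pixel
              # Stem assumed at the top of the cluster, horizontally
              # aligned with the center of mass.
              return cx, top_y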

  • @choon18 • 2 years ago

    Your robot is amazing; I am working on something similar. Is it possible to scale it up for fruit-tree harvesting?

  • @edisonaltamirano618 • 5 years ago

    Do you have a GitHub repo? It's very nice work.

  • @minhkhanhphantruong3408 • 3 years ago

    Sorry for bothering you. I also want to do a project similar to yours. Can you tell me what you used to make the robot move automatically? Thank you.

  • @druidelf3 • 5 years ago +1

    Holy cow, do you guys have an open repo?

    • @terrycalderbank7925 • 5 years ago +2

      Not yet; it was such a time crunch to get things finished that the repo is pretty ugly right now. We've been using some of our findings to help direct the path of a new robotics/computer vision elective for the EE program at BCIT, so I'll need to check with them before we officially share things. I'll let you know!

  • @adiG_AtdiG • 1 year ago

    What camera did you use?

  • @kirilchi • 5 years ago

    Very cool project!
    How did you solve the problem of having both the lidar and the arm on one chassis?
    I am also doing a similar project but cannot see how to mount the lidar so it does not interfere with the arm.

    • @terrycalderbank7925 • 5 years ago +3

      Great question! We actually programmed in a blind spot to account for the chassis and arm. SLAM can still build the map with a partial scan (a rough sketch follows below).

    • @kirilchi • 5 years ago

      @terrycalderbank7925 Thanks. That's an interesting and challenging algorithm to write!
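
      A minimal sketch of that blind-spot idea (the sector bounds and function name are assumptions; the point is to invalidate returns from the chassis/arm sector before the scan reaches SLAM):

          import numpy as np

          BLIND_START_DEG = 150.0   # assumed sector occluded by arm/chassis
          BLIND_END_DEG = 210.0

          def mask_blind_spot(angles_deg, ranges_m):
              """Invalidate lidar returns inside the blind sector.

              angles_deg, ranges_m: one revolution of RPLidar A2 data as
              parallel arrays. Blind-sector ranges become inf, which SLAM
              front ends typically skip, so the map is still built from
              the remaining partial scan.
              """
              out = np.asarray(ranges_m, dtype=float).copy()
              a = np.mod(angles_deg, 360.0)
              blind = (a >= BLIND_START_DEG) & (a <= BLIND_END_DEG)
              out[blind] = np.inf
              return out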

  • @dhruvagupta9063 • 5 years ago +1

    Please help me make this for my final-year project.

  • @New_World_Disorder • 3 years ago

    This is awesome!

  • @gem_femboy6761 • 2 years ago

    I need to find someone who can help me make an autonomous robot arm that can harvest grapes, knows which ones are good and bad, then does the juicing for me and makes the drink, so I don't have to do anything. Please, I want this.

  • @pauloesperon7697 • 5 years ago

    Amazing work, guys! Could you share any of the software packages and hardware you used? I'm very curious about how it works.

    • @terrycalderbank7925 • 5 years ago +8

      Sure! Everything was done in Python3. Image pre-processing and display used OpenCV, and video capture used a Python video4linux API. Training and inference on the machine learning side used Darknet and YOLOv2 respectively (specifically AlexeyAB's fork). The image processing ran on a Jetson TX2. The colour camera is a Logitech c922x; the depth camera was a CamBoard pico flexx. The lidar is an RPLidar A2, and the SLAM library was EKF SLAM. Mission planning and arm control ran on a Raspberry Pi 3B+. The arm is a uArm Swift Pro using their Python3 SDK 2.0. The vehicle was an Actobotics Nomad. Let me know if you would like anything more specific (a rough capture/preprocess sketch follows below).

    • @SoftNSweet98 • 4 years ago

      Terry Calderbank, can this work for cotton harvesting as well?

    • @lochuynh6734 • 1 year ago

      @terrycalderbank7925 Why did you use the Raspberry Pi 3B+ for "mission planning and arm control" instead of the Jetson TX2?
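
      A rough stand-in for the capture/preprocess stage of that stack (OpenCV's V4L2 backend substitutes here for the Python video4linux API mentioned above, the camera index is an assumption, and 416x416 is YOLOv2's default network input):

          import cv2

          # Logitech c922x, assumed to enumerate as /dev/video0.
          cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
          if not cap.isOpened():
              raise RuntimeError("camera not available")
          ok, frame = cap.read()
          cap.release()
          if not ok:
              raise RuntimeError("failed to grab a frame")

          # Darknet/YOLOv2 expects RGB input at the network resolution.
          rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
          net_input = cv2.resize(rgb, (416, 416))

          # net_input would then go to a Darknet (AlexeyAB fork) binding for
          # inference; the resulting cluster boxes feed mission planning and
          # arm control on the Raspberry Pi 3B+.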

  • @vaishnavic6396 • 4 years ago

    More info?

  • @devshin8039 • 4 years ago

    Can you share a repo for this project?

  • @mayurlondhe2083 • 5 years ago

    Hey bro, I want your help.

  • @outtakefilm6245 • 3 years ago

    Hello,
    We are a public smart-farm R&D project group located in Korea. While producing a promotional video to promote smart-farm technology, we sent you an e-mail to ask if we could use the video uploaded to your YouTube channel.
    A source link to the footage used will be inserted into the video.
    It's not for commercial use; it's for public purposes.
    Please reply.

  • @danieletcheverry1743 • 8 months ago

    It looks nice; too bad that in real life the grape clusters are tangled in the plant and the wire. And that's coming from someone who works in agriculture.