Just happened across this short, intriguing robot for grape harvesting. I'd like to know more about its development path and when it will be commercially available; hiring seasonal staff is becoming increasingly difficult.
I'd like to know more about this project.
Very cool, good work. Looks like YOLOv2 worked well for detecting grape clusters. What happens if the clusters are non-uniform in size or shape, or occluded by vine leaves?
We had so many things to juggle that we didn't get to the occluded clusters, but the stem should always be at the top, aligned with the center of mass, no matter the size and shape. That's what we're targeting. I think the next plan was to start training on specific parts of the vine rather than using just the cluster for indexing.
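That stem-targeting idea can be sketched in a few lines, assuming a YOLO-style detector that returns an axis-aligned bounding box for each cluster (the function name and box convention here are illustrative, not the project's actual code): take the top edge of the box, horizontally centred on the cluster.

```python
def stem_target(box):
    """Given a detected cluster box (x, y, w, h), with (x, y) the
    top-left corner in pixels, return the assumed stem point:
    the top edge of the box, horizontally centred on the cluster."""
    x, y, w, h = box
    return (x + w / 2.0, y)

# Example: a 40x60 px cluster detected with its top-left at (100, 50)
print(stem_target((100, 50, 40, 60)))  # (120.0, 50)
```

A segmentation mask would give a truer centre of mass than the box midpoint, but for roughly symmetric clusters the box centre is a reasonable stand-in.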
Your robot is amazing; I'm working on something similar. Would it be possible to scale it up for fruit tree harvesting?
Do you have a GitHub repo? It's very nice work!
Sorry to bother you. I'd also like to do a project similar to yours. Can you tell me what you used to make the robot move autonomously? Thank you.
holy cow, do you guys have an open repo ??
Not yet, it was such a time crunch to get things finished that the repo is pretty ugly right now. We've been using some of our findings to help direct the path of a new robotics/computer vision elective for the EE program at BCIT, so I'll need to check with them before we officially share things. I'll let you know!
What camera did you use?
Very cool project!
How did you solve the problem of having both lidar and arm on one chassis?
I'm also doing a similar project but can't see how to mount the lidar so it doesn't interfere with the arm.
Great question! We actually programmed in a blind spot to account for the chassis and arm. SLAM can still build the map with a partial scan.
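The blind-spot approach can be sketched like this, assuming the lidar delivers (angle, distance) pairs; the sector bounds and function name here are illustrative placeholders, not the project's actual code.

```python
# Assumed blind-spot sector (degrees, in the lidar's own frame)
# covering wherever the arm and chassis sit in the scan.
BLIND_START_DEG = 150.0
BLIND_END_DEG = 210.0

def mask_blind_spot(scan, start=BLIND_START_DEG, end=BLIND_END_DEG):
    """Drop lidar returns inside the blind-spot sector so the arm and
    chassis never appear as obstacles; SLAM can still build the map
    from the remaining partial scan."""
    return [(a, d) for (a, d) in scan if not (start <= a % 360.0 <= end)]

scan = [(0.0, 1.2), (90.0, 2.5), (180.0, 0.3), (270.0, 3.1)]
print(mask_blind_spot(scan))  # [(0.0, 1.2), (90.0, 2.5), (270.0, 3.1)]
```

The 180° return (the arm, 0.3 m away) is discarded before the scan reaches the SLAM front end.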
@@terrycalderbank7925 Thanks. That's an interesting and challenging algorithm to write!
Could you help me build this as my final-year project?
This is awesome!
I need to find someone who can help me make an autonomous robot arm that harvests grapes, knows which ones are good and which are bad, does the juicing for me, and then makes the drink so I don't have to do anything. Please, I want this.
Amazing work guys! Could you share any of the software packages and hardware you used? I'm very curious on how it works.
Sure! Everything was done in Python3. Image pre-processing and display used OpenCV, and video capture used a Python video4linux API. Training and inference on the machine-learning side used Darknet and YOLOv2 respectively (specifically AlexeyAB's fork). The image processing ran on a Jetson TX2. The colour camera is a Logitech C922x and the depth camera a CamBoard pico flexx. The lidar is an RPLidar A2, and mapping used EKF SLAM. Mission planning and arm control ran on a Raspberry Pi 3B+. The arm is a uArm Swift Pro using their Python3 SDK 2.0, and the vehicle is an Actobotics Nomad. Let me know if you would like anything more specific.
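For a flavour of the detection side of a stack like this: OpenCV's dnn module can load Darknet YOLOv2 weights, and each raw output row follows the convention [cx, cy, w, h, objectness, class scores...], with box values normalised to [0, 1]. Below is a minimal, self-contained sketch of decoding one such row into a pixel box; the function name and the sample row are illustrative, not the project's actual code.

```python
def decode_yolo_row(row, img_w, img_h, conf_thresh=0.5):
    """Decode one raw YOLO output row (the layout produced by e.g.
    cv2.dnn on Darknet weights): [cx, cy, w, h, objectness,
    class scores...], box values normalised to [0, 1]. Returns
    (class_id, score, (x, y, w, h)) in pixels, or None when the
    detection falls below the confidence threshold."""
    scores = row[5:]
    class_id = max(range(len(scores)), key=lambda i: scores[i])
    score = row[4] * scores[class_id]
    if score < conf_thresh:
        return None
    cx, cy = row[0] * img_w, row[1] * img_h
    w, h = row[2] * img_w, row[3] * img_h
    return class_id, score, (cx - w / 2, cy - h / 2, w, h)

# One fake row: a confident single-class "grape cluster" detection
# centred in a 640x480 frame, box 0.1 x 0.2 of the frame size.
print(decode_yolo_row([0.5, 0.5, 0.1, 0.2, 0.9, 0.95], 640, 480))
```

In a real pipeline the rows come from `net.forward()` after `cv2.dnn.readNetFromDarknet(cfg, weights)` and `cv2.dnn.blobFromImage(...)`, followed by non-maximum suppression across all surviving boxes.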
Terry Calderbank, can this work for cotton harvesting as well?
@@terrycalderbank7925 Why did you use the Raspberry Pi 3B+ for mission planning and arm control instead of the Jetson TX2?
More info?
Can you share a repo for this project?
Hey bro, I'd like your help.
Hello,
We are a public smart farm R&D project group located in Korea. While producing a promotional video to promote smart farm technology, we sent you an e-mail to ask whether we could use the video uploaded to your YouTube channel.
A source link to the footage used will be inserted into the video.
It's not for commercial use; it's for public purposes.
Please reply.
It's nice, but unfortunately in real life the grape clusters are tangled up in the plant and the wire. And that's coming from someone who works in agriculture.