Why I Built This Robot
- Published 30 May 2021
- AD For your chance to win a custom Tesla Model S and $20,000 and support a great cause, enter at omaze.com/jamesbruton.
It's time for an update on the Really Useful Robot and the progress I've been making. I've managed to write some hacky scripts that pull the depth detection out of the Intel RealSense D435i camera, so I can pinpoint items in 3D space that are detected with the NVIDIA Inference/TensorRT deep learning models. This will allow the robot to manipulate specific items in the environment, driving the inverse-kinematic model for the arm.
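For anyone curious, the "pinpoint items in 3D space" step boils down to back-projecting a detected pixel through the pinhole camera model. A minimal sketch, not James's actual script; the intrinsics used here are made-up illustrative values, not the real D435i calibration:

```python
# Hedged sketch: back-project a detected object's pixel into 3D camera
# coordinates using the pinhole model. fx, fy, cx, cy are illustrative
# placeholders, not the actual D435i calibration values.

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Turn a pixel (u, v) plus its depth reading into (X, Y, Z)
    metres in the camera's coordinate frame."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Centre pixel of a detection's bounding box, seen 1.5 m away
print(deproject(320, 240, 1.5, 615.0, 615.0, 320.0, 240.0))
```

The resulting (X, Y, Z) point is what can then be handed to the arm's inverse-kinematic model as a target.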
Walking Robot video: • My Walking Robotic His...
You can support me on Patreon or buy my Merchandise:
***************************
Patreon: / xrobots
Merchandise: teespring.com/stores/james-br...
***************************
Affiliate links - I will get some money if you use them to sign up or buy something:
***************************
Matterhackers 3D printing supplies: www.matterhackers.com?aff=7500
Music for your YouTube videos: share.epidemicsound.com/xrobots
***************************
Other socials:
***************************
Instagram: / xrobotsuk
Facebook: / xrobotsuk
Twitter: / xrobotsuk
***************************
CAD and Code for my projects: github.com/XRobots
Huge thanks to my Patrons, without whom my standard of living would drastically decline. Like, inside-out Farm Foods bag decline. Plus a very special shoutout to Lulzbot, Inc who keep me in LulzBot 3D printers and support me via Patreon.
HARDWARE/SOFTWARE
Below you can also find a lot of the typical tools, equipment and supplies used in my projects:
Filament from: www.3dfuel.com/
Lulzbot 3D Printers: bit.ly/2Sj6nil
Lincoln Electric Welder: bit.ly/2Rqhqos
CNC Router: bit.ly/2QdsNjt
Ryobi Tools: bit.ly/2RhArcD
Axminster Micro Lathe: bit.ly/2Sj6eeN
3D Printer Filament: bit.ly/2PdcdUu
Soldering Iron: bit.ly/2DrNWDR
Vectric CNC Software: bit.ly/2zxpZqv
Why not join my community, who are mostly made up of actual geniuses. There’s a Facebook group and everything: / 287089964833488
XROBOTS
Former toy designer, current YouTube maker and general robotics, electrical and mechanical engineer, I'm a fan of doing it yourself and innovation by trial and error. My channel is where I share some of my useful and not-so-useful inventions, designs and maker advice. Iron Man is my go-to cosplay, and 3D printing can solve most issues - broken bolts, missing parts, world hunger, you name it.
XRobots is the community around my content where you can get in touch, share tips and advice, and more. Build FAQs, schematics and designs are also available. - Science & Technology
5:30 Love, how the cup is briefly categorized as a toilet...
and then as a dining table
He was categorized as an umbrella, and he and the chair together were categorized as a couch at around the same time.
It got drunk for a sec
and here I'm wondering why James would put his printer on a suitcase...
He was also a dog 10:41
"And although I'm not a great software developer, I feel I can copy and paste well enough to get robots to do what I want."! 😆👍
A man after my own heart
Thank you! This sentence makes me feel more adequate.
Ah, yes. The most important software development skill.
I feel called out and I'm not even past college yet.
Of course it is a bit more than that!
I just realized I've been following you since 2013-2014 or so, before I even had a YT channel to subscribe. lol
Keep it going!
@James Bruton, the RealSense library supports converting from one sensor's pixel/space/point to another sensor's pixel/space/point via the sensors' intrinsics/extrinsics. This way you can completely remove the parallax error between the RGB sensor and the depth sensor when a depth is found for that pixel (the first IR camera is the centre of the depth sensor). The nice thing about these cameras is that they are pre-calibrated and you can request these intrinsics/extrinsics for each sensor.
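A rough sketch of the extrinsic step this comment describes. The RealSense SDK does the whole chain with rs2_deproject_pixel_to_point, rs2_transform_point_to_point and rs2_project_point_to_pixel; the middle step is just a rotation plus translation, shown here with made-up illustrative values rather than real calibration data:

```python
# Hedged sketch of the extrinsic transform between two RealSense
# sensors (what rs2_transform_point_to_point applies internally).
# The rotation matrix and translation are illustrative placeholders.

def transform_point(point, rotation, translation):
    """Rotate a 3D point (row-major 3x3 matrix) then translate it
    into the other sensor's coordinate frame."""
    x, y, z = point
    return tuple(
        rotation[i][0] * x + rotation[i][1] * y + rotation[i][2] * z
        + translation[i]
        for i in range(3)
    )

# Identity rotation plus a small baseline offset between the sensors
print(transform_point((1.0, 0.0, 2.0),
                      [[1, 0, 0], [0, 1, 0], [0, 0, 1]],
                      (0.5, 0.0, 0.0)))
```

Deproject in the depth frame, transform with the extrinsics, then re-project into the RGB frame, and the parallax between the two sensors disappears.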
It's really insane what this guy can make. Even if an entire company was making these robots I would be impressed. Hopefully you can inspire more young people like me to do more with robotics!
He completely read out 'Thanks to the DYNAMIXEL XC430-W240-T' lol. Inspiring stuff :P .
Feels good to be so early to a video.
@@kevinbissinger it's a rush!
James Bruton: fulfilling every man's dream of retrieving a beer from the fridge without leaving your recliner.
You should check out the MIT Kimera semantic SLAM stuff. I just got it running tonight and it’s pretty cool stuff. I would love to see you playing around with this powerful tool.
It’s a collection of programs which can do mapping, localization, in real time as well as label the map by what type of object it is ie floor, table, etc. There have since been improvements and more added functionality but I’m just learning about it now. I definitely think you would appreciate taking a look at it
Is there a ROS node/support?
@@jamesbruton Yup it’s all ros based!
I may not know much about robots, coding, or engineering but I enjoy your videos and seeing how you solve problems. I hope that by watching your videos I can learn even just a little bit about robots over time.
Wow! James, you are an inspiration. So humble, open and impressive. I’m very happy I found your channel. Awesome stuff!
You're building a robot to fetch you a beer, we all know it, and frankly it's what robots are for!!!
How do you know it's just to fetch beer? Have you ever seen that episode of Big Bang Theory with Howard and the robotic arm? 😭
2:55 Story of my life lmao!!!
Fascinating. The right channel to watch when you like to tinker with robots or develop something of actual use. Totally made me subscribe.
Any mileage in going full human mode - i.e. visually detect the gripper and, rather than using trig to theoretically put it where the object is, visually match the coords?
Thus any error in the kinematic model would null out.
Humans don't really 'visually match the coordinates' either; we run on both visual and kinesthetic/vestibular senses to error-correct one another constantly.
Amazing!!...
You can use moveit for the arm which can handle all the interpolation and obstacle avoidance.
Nothing but awesome videos from this channel! I love your stuff!
I just realized... James is a young Rick Sanchez, and this is the butter passing robot 😉 (j/k)
James, I have an idea for a project:. I'm a dj and music producer, and I love all the fun tech there is out there for electronic music.
So, you know going from a traditional piano keyboard to a drum machine is stepping up from 1d to 2d, and adding knobs to a drum machine makes it 3d editing, so to speak?
What if you used digital object recognition...
To create a music device that uses dancers' body to define and edit procedurally generated music?
So instead of an audience dancing to a song, the song creates ITSELF based on the people dancing?
I dunno... It's weird but I think you might get a kick out of the idea... And I don't know ANYTHING about object recognition. So I'm gifting the idea to you if you want it!
Also I'm getting kind of excited to see where your latest dog bot project is going!
Love this idea! I thought along the same lines during a daydream many years ago... Back then, the technology didn't exist to realize the concept.
Pretty sure this exists fella. Sorry.
@@twobob where?
Even if it did, that doesn't mean someone couldn't try to do it BETTER. James didn't invent ROBOTS, and yet here he is enjoying building his own.
@@twobob ditto on where. That sounds cool, I want to look it up. You can’t leave us hanging like that.
@@ThatGuy-fi9bm Well I have seen several projects along those lines. Some follow the dancers' shapes with a camera, some react to the relative movements of markers (or whatever fiducial); all you are saying here is convert object recognition into the "fiducial". This is not even remotely hard. Other than lacking a UI for noobs this already seems like a very done thing. Frankly it's just The Chemical Brothers' Star Guitar video but in reverse... (look it up). If you want a device that is probably "ready" to add a UI, look at the studio version of Ableton. If you want specific links to some of these aforementioned projects I can look them up. All the research into MIDI data gloves pretty much covers 90% of the code you need to get this up and running "as your vision". A couple of days' project with an Nvidia Jetson.
The correct way to map RGB data onto the depth frame is to find the rectification transforms between the two optical systems (RGB and D). OpenCV (required by ROS) makes this incredibly easy, as it has very turn-key method/functions for this. You'll need to print out some kind of target (checkerboard grid or dot grid), and then invoke the required OpenCV methods. It will then spit out a few matrices (including those for lens distortion) that you can use to call a different OpenCV method to rectify the RGB image based on the depth camera's optical properties. The RGB data will then be mapped pixel-to-pixel (or very close to it) with the depth camera.
Hacking static offsets without using any lens distortion coefficients (which can be significant in these RealSense cameras) is going to cause a lot of problems for you later when you're trying to track and target objects.
“And although I'm not a great software developer, I feel I can program well enough to get robots to do what I want”
That's not what he said
@@bigsteve6729 Alright, so the joke is I swapped the words "copy and paste" for "programming", because a lot of people say they program (myself included) but use copy and paste (me also)
Crazy inspiring!
Subscribed, our students ages 9-15 yr study these concepts and imagine an automated Starship factory; Mecanum wheels and various communication protocols are studied to better understand smart devices. Thank you for your practical research : mobility and automation.
What my family think of me: Robot Genius
Me: copy paste random codes from github.
This is amazing The title really undersells the video.
Does it piss beer into a cup?
He's coding the robodog to do that.
Love it, Amazing!
I like how you're using the weekly YouTube videos to drive your knowledge of robotics, so that you can make use of what you've learned for future projects - I do this too!
I'm counting on you guys to have robots ready to take care of me in my declining years. Keep up the good work.
You may want to look at *open3d* for pointcloud processing; it has built-in functions which can segment objects from a pointcloud and find the centroid from the camera reference frame. Also, why not use ROS TF info for coordinate transformation?
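For reference, as I understand it the centroid that open3d's PointCloud.get_center() reports is simply the mean of the points in the camera reference frame. A minimal sketch with illustrative points only:

```python
# Hedged sketch of a pointcloud centroid (what open3d's get_center()
# returns, as far as I know). Points below are illustrative only.

def centroid(points):
    """Mean of a list of (x, y, z) points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

# A few points segmented off a detected object
print(centroid([(0.0, 0.0, 1.0), (2.0, 0.0, 1.0), (1.0, 3.0, 1.0)]))
```

Segment the cup's points out of the cloud, take their centroid, and that single 3D point is the grasp target for the arm.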
I think its hilarious that the algorithm has a 60-70% confidence that a cup and a pair of jeans is a dog
i like how at 10:50 the robot randomly decides that a cup plus your legs equals a dog... might be helpful to give it some kinda "short term memory" by having it bias the detected object "certainty" in favour of ones it saw in the previous few frames at the same position? like instead of each frame taking the "most likely result", try keeping a record of all the things it thought it might be over the last few frames and the percentage it calculated for its certainty, then average those out and sort for the average certainty to get rid of temporary glitches?
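The "short term memory" idea above can be sketched in a few lines: keep each label's confidence over the last few frames and report the label with the best running average, so a one-frame "dog" glitch can't beat an object that has been seen steadily. Window size and confidence values here are illustrative:

```python
# Hedged sketch of per-label confidence smoothing across frames.
from collections import defaultdict, deque

class DetectionSmoother:
    def __init__(self, window=3):
        self.window = window
        self.history = defaultdict(lambda: deque(maxlen=window))

    def update(self, detections):
        """detections: {label: confidence} for the current frame.
        Labels missing this frame get 0.0 so stale ones decay."""
        for label in set(self.history) | set(detections):
            self.history[label].append(detections.get(label, 0.0))
        # Dividing by the full window also penalises brand-new labels
        return max(self.history,
                   key=lambda l: sum(self.history[l]) / self.window)

s = DetectionSmoother(window=3)
s.update({"cup": 0.8})
s.update({"cup": 0.7})
print(s.update({"dog": 0.9}))   # the one-frame glitch loses: prints "cup"
```

A longer window smooths harder but also makes the robot slower to accept that an object really did change or move.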
You are so awesome james!
This and the robodog projects are my favorite.
Your skills are awesome 👏 👍
This may well become your most successful project ever. Some ideas:
- Try to remote-control your robot while you look at the camera's video feed. If you can do useful things like that, then in principle a good A.I. should be able to do similar things.
- As you move the end-effector closer to the target, decrease the speed and detect both the target and the end-effector (you can put a special easy-detection marker on it) with the camera. This will allow you to control the last part of the movement much better.
- It might be interesting to have a Pybullet or Gazebo or other model for your robot and environment.
- Think about a battery-charging solution like robot vacuum cleaners have, so it can return by itself to the charging bay.
- It appears there is a small risk that your robot falls over. Don't make the top part too heavy and try to displace heavy elements to the base.
- Think about real applications of your robot, such as: help people who have difficulty walking, have the robot walk around the house or company when you are on vacation, clean, water the plants, check/feed pets, get a beer from the fridge, play games, etc.
Good luck
Can we get a playlist of your AI robot videos? And how can I start with a cheap machine-vision microcontroller?
That was very groovy.
The Purple walker you made looks like a half dismantled GONK power droid from star wars
Really really useful
You could train your model to recognise the robot hand and then find the vector it needs to move in to touch the cup; then you can make it move relative to the cup and not bother with the Cartesian coordinates
You’ll probably want to do this anyway when you have a grabbing hand and different sized cups to grab
Good idea. Maybe add some aruco markers on it. Then he will have location and orientation.
There is potential for eliminating the need for encoders on the motors with this method as well
Vectors necessarily consist of Cartesian coordinates (as far as deriving one from the other goes).
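The "move relative to the cup" idea in this thread is essentially image-based visual servoing: step the detected hand towards the detected cup in pixel space, no Cartesian model required. A minimal proportional-control sketch with made-up positions and gain:

```python
# Hedged sketch of proportional visual servoing in image space.
# Pixel positions and the gain are illustrative placeholders.

def servo_step(hand_px, cup_px, gain=0.5):
    """Pixel-space move command closing `gain` of the hand-to-cup
    error this control cycle."""
    return tuple(gain * (c - h) for h, c in zip(hand_px, cup_px))

hand, cup = (100.0, 200.0), (300.0, 240.0)
for _ in range(3):
    dx, dy = servo_step(hand, cup)
    hand = (hand[0] + dx, hand[1] + dy)
print(hand)   # after three cycles the hand has closed most of the gap
```

Because the loop only ever acts on the observed error, any bias in the kinematic model gets corrected automatically, which is exactly the advantage the commenters are pointing at.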
AWESOME 👌
Amazing idea
Caught it 42 minutes after upload what a wonderful number
Thank you very much for uploading!
Are you going to put the Omni directional drive system on this project? Seems perfect for it
Not for this one, but it's an option for other robots
Woah so cool
oh cool idea
@james any follow-up for this robot? No manipulator planned?
Sir can you please tell me which modules did you use in your omni wheel machine please? I didn't quite understand it
hey James I came back to like and comment this Great informative video 12 hrs after downloading.
Edit: I am really promoting ur Channel 🔥 Brother
Amazing
Did you try to detect the arm of the robot with the cam? Then you could just make a feedback mechanism for the motors to move the arm under the cup.
Question, could you make a program that is able to remember the layout of the house and can track where you are in the house to come to the area? or would it be too hard
Would it be beneficial to use the camera's view of the arm in a kind of feedback loop, to detect the delta between the two?
james you are the best
Awesome!
Silly Question (because I haven't finished the video yet) - Have you considered adding an identifier of some sort on the robot arm so you can object rec on the 'hand' to give you some relative information to play with too? A QR code or something as a sticker on the joints for registration like Mo-Cap suits
Yes potentially - it would help a lot with the left-right detection once the arm is in view. I need to make a gripper first though
Wow, which parts do you use for it? I want to make a kind of robot with an Arduino; it sounds really cool to me
Is James a genius or what? I'm blown away
Please never, ever stop using that B-roll of you and the Wolverine claws. Please. It should always be shown in its entirety, and with audio. Thank you.
well damn man. It's really moving along. Have you considered using binocular vision as opposed to these cameras? I have only seen one or two examples, but that was years ago. I like it because it's more like human vision and you just need a set of cheap webcams to make it work.
like how for a second the cup was called a toilet XD and later the person turns in to a dog for a second man i love tech stuff like this
small tip: RealSense cameras should publish a topic called "aligned_depth" or something similar which is aligned with the color image, so you don't have to do that yourself
Hi James, great videos, I am always amazed at what you get up to and build! I have a "Big Ask!" Have you ever thought of doing a series on how to build an easier robot / hexapod / or something else for us technical wannabes, or for kids? Yes, the kids!!?!! Do it for the kids!!!!! Most people have 3D printers and a basic knowledge of Arduino, so if you had to do a series based for "kids", or even the kid at heart, a project broken up into small chunks so there is a better understanding of how it works and how it's programmed, building up to a medium project when all put together. OK, yes, there are many tutorials on YouTube: on Arduino, on programming, on 3D printing, on designing, etc.... But there is only one of you; considering you started before YouTube, and all the robots you have made, you have the perfect experience and long-term working knowledge. You could even sell a series like that! Just have a think about it, mention this to those who know you and see what they think. Anyway, thanks for all the content and sharing your passion with us. Much appreciated.
Hi James, i have robot with ROS.
What should I do if I want to control my robot remotely over the internet?
This is good
Hi, I want to build something similar, but with more objects that I want to manipulate one by one. Is this possible?
Hi James, I was curious if you had any recommendations for adults with kids who want to get into robotics? I know my kids would love it but this is a request more for me instead of my kids. I want something that is pretty complicated but teaches the fundamentals really well. Any ideas? Thanks.
Awesome
Binocular vision!
Wait, I didn't subscribe??
How did I not?
I thought I subscribed cause you are always in recommendation, whaever imma sub
I'm also starting on ROS and isn't it supposed to be able to calculate the transform between the camera frame and the arm for the inverse kinematics?
Yes, if you published the end_effector
why not use the inverse-kinematic model to calibrate the position of the camera?
then the arm could be fitted with an easily recognized pattern.
then the arm could increase the precision of the movements by "hand-eye coordination"
Have you given any thought in looking into building a robot using RL? I'm still new to all this yet I really want to look into using RL as then the robot will seem to evolve more as it learns..I hope anyway.
Yes I have a project coming up like that
Robot... red cup...
for a moment I forgot which channel I was looking at and thought the robot would be pissing beer into the cup.
got this in recommended cool video you gained a subscribe
thanks!
@@jamesbruton yw
James, you should look at how RC airplane pilots strap their remotes to themselves. I feel like the way you're doing it could lead to some strain
Super
This man about to home brew a robot person
Great !!!!!
James, have you ever experimented with virtual biology, such as computer-simulated bodies, bio-inspired robotic functions, or other such things? I just have this hunch about creating instinct-driven AI in part through simulated bodily function.
Bring back the 914 PC Bot!
0:40 Gonk droid from star wars xd
10:44 robot is like dude that's a cup robot brain no is a dog well then is both
I want a robot that will cook me cuisine for breakfast, lunch, and dinner. Just biding my time and i cant wait! Haha
Does anyone know how to take what he did in this video and make the robot follow a person?
If(probability (person) >96){
Kill();
}
Oops! Syntax error, I meant to use 'Skill()' 😱
Wooooo ✨✨
Ugh i want that nvidia jetson!
Hi, I love your videos, keep up the Omaze work
Btw I used the name of the sponsor in my comment
You're a Brainiac. In many quarters (including those I inhabit) that's among the greatest of compliments.
10:45 when your legs are dog
Copy and Paste ... yessshhh.. you are not alone.
Why doesn't the software remember the object, instead of only recognizing it? For example the cup: sometimes it's less than 50% sure that it's a cup, even if 1 second before it was 80% sure. If the object didn't leave the field of view it should be the same as before.
That's interesting - I guess it's down to the pre-trained model that detects any cup; I could retrain it for this specific cup and it would be a higher % certainty. Ultimately it's doing the detection on every loop from scratch, so it will vary as I move the cup around.
Good job❤🇸🇳🇸🇳🇸🇳❤❤❤🇸🇳🇸🇳
The formerly $400 Xavier is now being sold MSRP-agnostic (aka enabled scalping) by Nvidia for over $1000, and the Jetson that used to be 150 CAD is now 400 to 800 depending where you buy it. I can't imagine how much the Ampere/Hopper ones will be, given they've EOL'd the Tegra X1 based on Maxwell, which the Jetson B01 runs on. Nvidia is catering to higher-end customers. It sucks trying to develop a program as a volunteer teacher.
💙
swag
Reminds me about Michael Reeves video 👀
I like how no one controls your robots. 🙂🙂🙂🙂🙂
He made the GONK droid