How any Robot can Navigate with this Module
- Published 20 Aug 2024
- Get your first 10 PCBs for free at www.pcbway.com/
This is part 2 of experimenting with an Intel RealSense T265 tracking camera to replace the TF and odom topics produced by wheel odometry, and then use Gmapping and the navigation stack in ROS. It seems to work pretty well, so this has a lot of potential to just drop into any robot and make it navigate autonomously. I'm running the wheels totally open loop, having removed the wheel encoders from the robot, so all positioning is carried out by the tracking camera.
Code, CAD and details: github.com/XRo...
You can support me on Patreon or buy my Merchandise:
***************************
Patreon: / xrobots
Merchandise: teespring.com/...
***************************
Affiliate links - I will get some money if you use them to sign up or buy something:
***************************
Matterhackers 3D printing supplies: www.matterhacke...?aff=7500
Music for your YouTube videos: share.epidemics...
***************************
Other socials:
***************************
Instagram: / xrobotsuk
Facebook: / xrobotsuk
Twitter: / xrobotsuk
***************************
CAD and Code for my projects: github.com/XRo...
Huge thanks to my Patrons, without whom my standard of living would drastically decline. Like, inside out-Farm Foods bag decline. Plus a very special shoutout to Lulzbot, Inc who keep me in LulzBot 3D printers and support me via Patreon.
HARDWARE/SOFTWARE
Below you can also find a lot of the typical tools, equipment and supplies used in my projects:
Filament from: www.3dfuel.com/
Lulzbot 3D Printers: bit.ly/2Sj6nil
Lincoln Electric Welder: bit.ly/2Rqhqos
CNC Router: bit.ly/2QdsNjt
Ryobi Tools: bit.ly/2RhArcD
Axminster Micro Lathe: bit.ly/2Sj6eeN
3D Printer Filament: bit.ly/2PdcdUu
Soldering Iron: bit.ly/2DrNWDR
Vectric CNC Software: bit.ly/2zxpZqv
Why not join my community, who are mostly made up of actual geniuses. There’s a Facebook group and everything: / 287089964833488
XROBOTS
Former toy designer, current YouTube maker and general robotics, electrical and mechanical engineer, I'm a fan of doing it yourself and innovation by trial and error. My channel is where I share some of my useful and not-so-useful inventions, designs and maker advice. Iron Man is my go-to cosplay, and 3D printing can solve most issues - broken bolts, missing parts, world hunger, you name it.
XRobots is the community around my content where you can get in touch, share tips and advice, and more. Build FAQs, schematics and designs are also available.
We need more people like this
@JestEr yep
@JestEr true dude.
demand education reform then. half these schools are still using common core material
This is why we will die to Skynet..
@@kingmasterlord yeah, finally someone who agrees with me. Everyone should be taught differently, with more efficiency. I told this to my teacher and got suspended lol. Government, come on, do something
cool, really useful to have a robot agnostic navigation system. I imagine if you ever revisit making a biped, this along with the face tracking nerf turret would lay the foundations for our first robot overlords.
This is so true I'm 100% sure anyone who saw this comment wants to be the one who said this like me.
I feel you but the Boston Dynamics may already have a few foundations in mind...
Very clever. I didn't think such accurate mapping was possible without wheel odometry.
as always, excellent video James! 👌😎 amazing how much work went into this
I didn't think I would see you here, but then again it makes sense
I'm so happy you picked up ROS along the way :)
He is smart, his brain must have so many wrinkles
4 at least
Nuh uh, Einstein only had 3 folds, this guy has 2
Really great job, Now my robot navigates without wheel odometry perfectly.
Great content, so valuable. Maybe we all own a robot in the future thanks to initiative like yours. If not, robots will only belong to the big corporations like today (yes, an ATM is already a robot). The implications are HUGE
Really!!!! James and Colin building an autonomous tank with a cannon! This is how we get robot world domination, people!!!
"The 3 Laws of robotics prevent a robot from harming a human."
*only programs 2 laws into robot*
Love it. FWIW, the reason most people keep the LIDAR low is for eye-safety
Was that a hint at another Colin Furze Collaboration?
Colin Furze controlling an autonomous robot that James will build in the future: At my signal, unleash hell !
If you'd make an online course about robotics or 3D design, I'd totally buy it!
I'd pay good money for it too
That is sooo cool! I'm betting he'll add it to a skateboard too. I'd love to see that in one of the Star Wars robots!
Hello James !
Thanks for sharing this
Suggestion: it may be better to place the sensors near the centre of rotation, particularly on OpenDog, where you could maybe remake the front panel to have a hole for them. This way there will be less noise, as they won't move as much as on the top of the robot. You will lose some of the rear FOV, but that can be fixed by adding more sensors, or by rotating the robot itself like a real dog would have to. Hope this helps!
Really good video. My FRC team is looking at implementing this exact Lidar unit into our robot this year.
the camera already looks like eyes and the lidar resembles a tophat ... maybe that could be a nice theme for the "module"
Fantastic engineering detail, Thanks James.
I suggested this change on the last video, good thing it worked
Nice progress! For robots without wheel encoders, I recommend the hector packages and robot_localization. You will get a better, steadier odom :)
James, if you are using the RealSense camera, that laser is overkill. You can get the distance to the walls/obstacles simply from the depth map produced by the camera, if you don't have a SLAM mapping module in ROS that can use the camera directly.
That will also save you quite a bit of current (those LIDARs are pretty power hungry).
I wish the Lidar module used in the new iPhone/iPad could be used independently for projects like this.
But where would we put the meter? /tesla dies broke
Nice build! The laser module looks kind of funny though, maybe having a rotating mirror would be better
Don't forget to write a nice urdf for your robots 😀 . Nice work
Yes, it's mostly proof of concept for now.
that Nerf turret is exactly the same technology as Iron Man's shoulder guns that shoot hostage-taking terrorists
I can't believe this information is free :D
You are a great engineer!
What about building the Lidar onto a platform at an angle, so as it moves you can scan the whole environment in 3D? Do you think you could get this to work? :)
i would be interested to see how a robot performs with a neural network for a brain, where you simply give it a mission (for example, go to this place), attach a laser to it and then let it learn to move around.
I never realized ROS is that useful before
Very good work
You should look into using some sort of graph SLAM with the LIDAR. I didn't realize that all of your position data came from the wheel encoders, as those lose tracking over time, especially at higher speeds/accelerations.
You should probably put the lidar on a servo or stepper motor so that you can keep the lidar facing one way the whole time when the robot turns. I noticed that when you turn, it messes with the whole map, because the lidar isn't turning the amount it's programmed to turn; it could be a little faster or slower. If you put it on a servo or motor, you could get rid of that problem.
we do like your kitchen. awesome video
I still think it needs to be on a gimbal that self-levels. If you can get it working without one, maybe add a head that can look around so the robot can get more data about the environment?
Can you put it on a drone please! Would be awesome to see what it's capable of like navigating stairs and if combining with other drone sensors like barometer, optical flow and ultrasound will make it a beast!
Thought this could be ideal for the little ROS project I'm building, but it looks like the T265 is discontinued and there's no new model coming out. Such a shame.
Very nicely done!
Very beautiful work!! I notice you don't use belts anymore for transmission... good choice... belts will break after a few years whether you use the robot or not... gears are stronger and will break after more time, and only if they were used... it would be great if you used gears (3D printed is better for makers XD) in all your future projects...
You should make a robot that both navigates your house and shoots everyone it sees so you just have an assassin robot trying to shoot everyone in the house
Can't you build a module under the dog that can act as legs the robot can rest on? I mean that the legs can extend or retract, via servo or geared motor, when the dog is supposed to stand still. Sorry for my terrible English, I'm from Germany.
Love that you open-source your code
hello from Qazaqstan
Excellent job, I'd take any opportunity to learn from you
I will be honest, if I hadn't used ROS myself before, I might have had trouble following what exactly you were doing in this video.
Flat head screws.....WHY!!!
Because I have some
Your amcl seems to have an uncertainty that is way too big. I think this is probably causing the ghosting, because the laser scans are recorded in the wrong positions by the local costmap. Try to tune amcl to rely more on the odometry by turning down the `odom_alpha` parameters, especially `odom_alpha1`.
And you should also set `odom_model_type` to `omni-corrected`.
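For anyone trying this tuning, it goes in the amcl parameter file. A minimal sketch along the lines of the suggestion above (the values here are illustrative, not taken from the video; amcl's `odom_alpha*` parameters default to 0.2, so anything well below that tells the filter to trust the odometry more):

```yaml
# amcl odometry-model tuning (illustrative values; tune for your own robot)
odom_model_type: omni-corrected  # corrected omni model, as suggested above
odom_alpha1: 0.005   # expected rotation noise from rotation
odom_alpha2: 0.005   # expected rotation noise from translation
odom_alpha3: 0.010   # expected translation noise from translation
odom_alpha4: 0.005   # expected translation noise from rotation
odom_alpha5: 0.003   # translation noise term used only by the omni models
```

With the T265 providing the odom frame, low alpha values make sense because its tracking drifts far less than open-loop wheels would.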
Can this scale up to my lawn tractor? I have a 1-acre lot with lots of bushes and trees to avoid.
Next thing you'll know he'll have his whole house operated by robots
James, I think you underestimate the quality of your leg odometry and the sensor fusion algorithm that comes with the T265. Each joint is directly coupled with an encoder you can read, and all the wobble will be picked up by the encoder. The only issue would be if the leg slips on the floor, but that issue is still present on wheeled robots.
Do you have a link to the joysticks that you used for the controller? It would really help me with my robot.
Please do a project using oak-D
Great video!
Good video
Your laser scan seems to distort when the robot is rotated rapidly. I wonder if that comes from a synchronization/latency issue between the tracking camera updates vs the laser scans. I'd be interested to know what's going on if you figure it out.
Also, why flat head screws?!
I probably drove too fast when I did the map; the laser is also the cheapest available. The screws are countersunk?
can you use a laser mouse setup for tracking on the big dog to calculate its position
That is pretty neat! Any idea where I can find code to run a ROS node on a 65C02? I have an old Maxx Steele robot from the 80's that would make good use of the platform!
Also, your "target" head looks like Ensign Ro from Star Trek The Next Generation.
Now build a hand and drop that on there. ThingBot.
Would using a gimbal (or 2) (DIY or purchased) help for stability for the camera & sensors, on the dog!?
That was just my immediate thought to maintain stability and to try and curb/minimise other issues like sensor drift? All theoretical, of course, but still worthy of trying if resources are available.
Hey, with a 3D camera like the Intel T265 that you have, you should be able to get your map data without the additional lidar
If the camera doesn't give you the depth info in ROS on its own, you should be able to get it from the raw footage of the camera with a little help from OpenCV
The T265 is tracking only. I have the depth camera, but the LIDAR is 360°.
So cool
you might try using ultrasound in conjunction with the lidar?
Ultrasonic is pretty bad in comparison to lidar.
Ultrasonic can't detect soft stuff like pillows, or walls at an angle, due to reflection of the echo; same with round objects like poles. The measuring is pretty slow due to the speed of sound, and using multiple sensors is difficult.
@@jonasstahl9826 it might give a useful second opinion on the lidar ghost images. Ultrasound is quite good at showing where things aren't.
Would using nylon fasteners be an option to reduce weight? Or are they too weak? Another option would be to machine fasteners out of aluminium
I wish Lidar was more accessible.
Very Good
In the year 2020, James robots achieved self awarene.....
Apologies for the Rowan Atkinson death joke.
Super!
Amazing work James !
From the video I guess you didn't install the Jetson Nano Developer Kit image on your Jetson Nano, and instead used a normal Ubuntu. Is that right? And is there a reason for that?
Best of luck with your upcoming projects :)
a cool implementation/toy using this would be an RC car that you can drive, which would sense if you were going to crash and prevent it, either by steering away from walls or stopping the car. would make a fun Tesla auto-driving RC car :P
Hello, nice video. What version of ROS do you use?
With this project, is it possible to make the T265 follow a human? If so, how?
Can you make it smaller?
Awesome 👌
Working backwards - could you use the movement data from the RealSense to help stabilise the dog?
One question: in a video you published 5 weeks ago you said you just use code from prebuilt examples. Isn't that boring af, to just copy and paste?
I mean, the spirit of tinkering is building it yourself and exploring new areas. That's what makes it fun. I know you can't build everything yourself, but I think just relying on already-finished projects with well-defined documentation is like letting the robot be built by someone else. There is no part of yourself in it.
Don't understand me wrong, I know you probably put many hours of work into it. But not trying things, and maybe failing at new things nobody has done before, isn't a bad thing, because if nobody had ever failed at a new topic in the first place there would be no evolution, because everyone would just be relying on things that were made in the past by someone else.
Amazing😱😱😱❤️
What lidar sensor did you use?
James, so can you make the same map with a D415?
Why not do Real Steel Atom robots??
Amazing project. I'm new to ROS and I have a quick question: how do you visualize rviz remotely?
you just need to set your ROS_MASTER_URI, ROS_IP and ROS_HOSTNAME environment variables to point at the instance that's running roscore - check out the networking section of the ROS docs for more details.
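In practice that looks something like the following on the machine that will run rviz (the IP addresses here are placeholders for your own network, not values from the video):

```shell
# Point this machine at the roscore running on the robot (e.g. the Jetson).
export ROS_MASTER_URI=http://192.168.1.10:11311  # robot's IP, roscore default port
export ROS_IP=192.168.1.20                       # this machine's own IP on the LAN
export ROS_HOSTNAME=192.168.1.20                 # a name/IP other nodes can resolve
# then launch the viewer as usual:
# rosrun rviz rviz
```

Both machines need to resolve each other (hence ROS_IP/ROS_HOSTNAME on each side), otherwise topics appear in `rostopic list` but no data flows.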
i sure hope your mannequin isn't named Sarah Connor
I don't really understand the need for the Arduino; the Jetson nano has enough GPIO pins to control the motors and read the encoders. Is there some latency issue that makes it better to use the Arduino or are you just more comfortable having it control the locomotion?
I know how to do it this way, plus it can be bolted onto any Arduino robot with differential drive.
@@jamesbruton I see, makes sense. I'm planning my own autonomous robot build with the Jetson Nano right now, so I was curious. I suppose there could be an added benefit of not having motors directly controlled by scheduled processes too.
what program do you use to make the data visible on the PC, and can I use this data on a Raspberry Pi to navigate??
Cool!
How does one get into robotics? Any online courses you'd recommend?
11/10, not mad, just disappointed
self-excited auto-mapping, "getting to know"
you could do automatic, or scheduled on detected objects, like (waterproof) watering some plants, if found and state recorded
why does that sound like an old ribbon printer
spray watering might be the best
show !!!
Nice, first 350 xD
I did the 101 like
Attach it to that mantis robot :)
It would probably work, although it would need to be somewhere with a 360 view without the legs in the way.
@@jamesbruton i don't think I'm ready for that kind of scary haha
Imagine just seeing that patrolling the streets.
What books do you read to understand Arduino?
The Adafruit Arduino tutorials are pretty good.
Thanks James
road to 100 trillion subs 😀😀😉😉😉
Does he ever sleep?
SENSE I'm Study pleease
intel L515 = no moving parts Lidar.
Hello
👍✨