Patrons and YouTube Channel Members already have next week's video - it's part 3 of openDog V2 - building the entire mechanical assembly!
Gun dog, or the turret from Rust.
The lights even turn green to let everyone know when it runs out of ammo, _just like a real sentry robot!_ 😜
Send this to me pls
My only problem with this bot is that I think it isn't able to differentiate between a moving (usually organic) and an unmoving (usually a wall, for example) target.
So if you just boot it up close enough to a wall, it will just shoot at it.
Would be interesting to see the robot use trigonometry to converge the guns on a fixed distance
A small offset between points of aim is probably a good thing in case you miss.
let me introduce you to gravity and the boiz
That would be very interesting, essentially making it a sentry turret.
Adjusting the angle of elevation depending on distance would be more useful.
Just needs more dakka instead.
It should say “Are you there?” when a target disappears
Or the Portal turret's "Are you still there?"
@@crumblesilkskin It would need laser sights also ;)
"must have been my imagination"
alternative: "Must have been the wind"
"I'm different"
What crazy timing! Just yesterday I was thinking about a Lidar project and I found exactly this module!
My project idea was a "sixth sense" accessory. The Lidar would be mounted on a backpack (the one I use is rigid), with some buzzers and an IMU. The lidar data would be analyzed for moving objects, using the IMU to account for the wearer's motion and pick out objects (people) that are approaching the wearer from behind. If they got too close or approached too quickly, the buzzers would trigger with proportional power to indicate the approacher's direction and speed.
I play a lot of HvZ, a nerf game where enemies can approach from anywhere, and this was my best idea to have a gadget to watch my back.
Sounds fun but very very hard to get working practically if you are not perfectly still.
How about for social distancing
Why don't you just use an ultrasonic sensor? They don't have much range, but it could face upwards a little so your movement wouldn't be as critical, and you could use a vibrating motor taped to your arm, which would be sneakier than a buzzer.
@@timotholen7255 A belt of vibration motors seems to work well.
It's been used in some situations to help helicopter pilots 'feel' obstacles low to the ground, since a helicopter can often kick up a lot of dust that blocks visibility...
@@timotholen7255 Ultrasonic sensors are horrible when moving because they don't pick stuff up in all directions.
You are like Michael Reeves, but well organized.
lawful neutral michael reeves
This is the funniest thing I've read here lol.
Michael Reeves, but not a sadist
I wrote code for lidar mapping like this before, except that the sensor I used did more than a single plane of detection, having a vertical array of lasers that spun around to create a "fan" of detection. It cost like 10k though, so I'm really happy to see cheaper sensors on the market.
He’s literally teaching them how to seek and destroy. We’re doomed.
You're a bit late there, multiple military forces have already been experimenting on autonomous seek and destroy robots for years, and they're making rapid progress.
@@PunakiviAddikti well then.. we're fucked
Imagine being a kid having James as your uncle. :D
I love this. Can you add a hit box that players could shoot to temporarily "stun" the robot during games?
Good idea
This is great! Now if you can get the lidar room mapping working you'll get two advantages:
1. You can correlate the room map with the robot position to give it better pointing and positioning control.
2. After waiting for a room map to emerge, you can then detect changes as potential targets.
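A rough sketch of point 2, assuming the readings have already been bucketed into one distance per degree (all the array and threshold names here are made up):
// Hypothetical sketch: compare the live scan against a stored baseline map.
// baselineMm[] would be filled by averaging several full rotations at startup.
const int NUM_ANGLES = 360;
float baselineMm[NUM_ANGLES];              // distances recorded while the room is empty
float currentMm[NUM_ANGLES];               // latest full rotation
const float CHANGE_THRESHOLD_MM = 300.0;   // how much closer a reading must be to count as a target

// Returns the angle of the biggest "new" object, or -1 if nothing changed.
int findTarget() {
  int targetAngle = -1;
  float biggestChange = 0;
  for (int a = 0; a < NUM_ANGLES; a++) {
    if (currentMm[a] <= 0 || baselineMm[a] <= 0) continue;  // skip invalid readings
    float change = baselineMm[a] - currentMm[a];            // positive = something closer than the wall
    if (change > CHANGE_THRESHOLD_MM && change > biggestChange) {
      biggestChange = change;
      targetAngle = a;
    }
  }
  return targetAngle;
}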
I knew what was coming, but the robot turning around to shoot something behind it was still unsettling! This is awesome. Might have to look at building one of these for use at my robotics team's outreach events!
You have a Xenomorph costume, you made an automated sentry gun... you made a Ghostbusters reference.
Well played.
The light is green, the trap is clean.
You can tell he's excited about his project by the tone of his voice and the expression on his face.
I just found you on YouTube and let me say you are incredibly smart, great at 3D printing, and fun. I am going to check out your other videos. It would be nice if you could think about a new project like a robotic lamp arm.
Also, the PWM pin is there because the lidar sends its rotation speed over UART, so you can do closed-loop speed control of the lidar rotation, which stabilizes the speed and makes the data a lot more reliable.
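Something like this would be a minimal sketch of that closed loop, assuming the speed has already been parsed out of the UART packet (the pin, target RPM and gain are all made-up values):
// Rough proportional controller for the lidar spin speed (untested sketch).
const int LIDAR_PWM_PIN = 5;       // assumed PWM pin driving the lidar motor
const float TARGET_RPM = 300.0;    // whatever speed the datasheet recommends
float motorPwm = 150;              // starting duty cycle (0-255)

void updateLidarSpeed(float reportedRpm) {   // reportedRpm parsed from the UART packet
  float error = TARGET_RPM - reportedRpm;
  motorPwm += error * 0.05;                  // small proportional gain, tune to taste
  motorPwm = constrain(motorPwm, 0, 255);
  analogWrite(LIDAR_PWM_PIN, (int)motorPwm);
}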
I had seen a video from Andreas Spiess titled "#227 The ugly truth about LIDARs for obstacle avoidance" where he explains a bit. This is a nice addition and update to it for us folks working on obstacle avoidance. A robot as proof of concept for sensors is always the most fun!!! Thanks
This was a cool build. Now I feel like I need a 360 lidar to mess around with too.
thanks, hoping to do more with it in the future
Having that thing spin around and face you like that is downright frightening! Good home security!
Imagine walking into his room, and everything is quiet, except for the gentle whirring of a strange orange device, which appears to be turned on. You walk closer to examine it and suddenly, it turns around.
Very awesome and scary.
I have taken about a year off your channel but glad I came back at this point. I would have these all around the house but it really does need the ED-209 voice from Robocop
The robot is definitely fantastic, but what I loved the most was the amazing suspension mechanism.
The perfect solution for defending your home from mannequins with long hair.
This bad boy works on _all_ mannequins. 😎
James that was awesome. Super fast rotation.
FYI the quality value is the reflectivity of the surface, not the accuracy of the reading. Quality is a term in optics that refers to the ratio of reflected light to the incident light.
Cannot like this enough. I have an A1 in my shopping cart and I'm about to pull the trigger on buying it!
Can't wait to see this unit on your dog! 🤖
Ooh, I know this can be a real fun project! James, you should make it closed-loop feedback and have the front of the robot track the target, using the lidar as the sensor to determine how much further the robot needs to turn to lock onto the target. That would make it perform way better than 36 different hard-coded encoder count set points. Then you can include a PID to make the rotation happen sharply and swiftly! Or you can even have it estimate the “time till target acquired” and prepare the arming sequence en route as it turns to face the target, so that it can basically quick-draw, releasing the bullets as soon as it has faced the target.
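For anyone curious, a minimal PD sketch of that lock-on idea, assuming the lidar code hands you the target's bearing relative to the robot's front (setMotors() and fireBlasters() are placeholders for your own drive and trigger code):
// PD turn-to-target sketch (untested). Positive turnSpeed spins the robot clockwise.
float Kp = 2.0, Kd = 0.5;
float lastError = 0;

void setMotors(float left, float right) { /* your drive code here */ }
void fireBlasters()                     { /* your spin-up + trigger code here */ }

void trackTarget(float targetBearingDeg) {          // 0 = dead ahead, +/-180 = behind
  float error = targetBearingDeg;
  float derivative = error - lastError;
  lastError = error;
  float turnSpeed = Kp * error + Kd * derivative;   // PD is usually enough for a turret
  turnSpeed = constrain(turnSpeed, -255, 255);
  setMotors(turnSpeed, -turnSpeed);                 // spin on the spot
  if (fabs(error) < 5.0) fireBlasters();            // within a +/-5 degree window
}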
I thought the delay in shooting was because the robot is waiting for the results of averaging. But it's very fast to move and then slow to fire. What's causing the delay in shooting?
There is a delay set so the blaster motors power up properly before it pulls the trigger, and then the trigger needs to move. If the motors ran continuously then it could be quicker.
It's revving up.
If a Nerf gun isn't given enough time to rev up the motors, it can cause the gun to jam.
@@jamesbruton or just rev up the motors while it's moving??
This is a very cool and fun project! Great implementation. You should just have the gun motors running continuously, so the reaction time would be way lower. Thanks for sharing this cool project.
One way of working out if something is further or closer based on "0" is to record the highest value that isn't 0; then when you check for 0, if that was a high number you've gone over 7, and if it was a low number you're too close to something. Not that it matters, building the robot so it can't ever be too close is a great idea :D
Interestingly, the sensor doesn't use time of flight; instead it has the laser at an angle and a camera to detect the position of the laser spot, then uses trig to find the distance, reducing the complexity and cost dramatically, since no extremely fast and accurate timing is needed.
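For reference, the triangulation maths is just similar triangles. A toy version (the baseline and focal length values are made up, not from this sensor's datasheet):
// Laser triangulation distance estimate (illustrative only, values made up).
// The laser and camera are separated by a fixed baseline; the closer the object,
// the further the laser dot appears from the camera's optical centre.
float baselineMm = 25.0;     // laser-to-camera separation (made-up value)
float focalPx    = 700.0;    // camera focal length in pixels (made-up value)

float distanceFromPixelOffset(float dotOffsetPx) {
  if (dotOffsetPx <= 0) return -1;            // dot not found / out of range
  return baselineMm * focalPx / dotOffsetPx;  // similar triangles: d = b * f / x
}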
This is brilliant, soon robots will be pinning whole cities down. Caught the Ghostbusters reference.
Amazing build James.
Love this Nerf robot.
If you gave it tank tracks and let it loose outside you could have a blast with it.
Give it a target that you can shoot at for a Human VS Robot Battle!
On today's episode of "finishing projects that normally take months of work in a single week"
Good project, friends. Greetings from LATIN AMERICA!!!!
You should do a circle of about 8-10 guns, with the same sensor at the top, and have the gun nearest whichever target the sensor picks up be the one that shoots :) Great robots!! Never miss your videos!!
I was thinking "cute robot with Nerf guns...", then the first time it scanned and turned I prayed for nobody making this with real guns :P
This is a good start, but having only 12 shots between manual reloads is quite the limitation. Have you considered using a hopper-fed blaster, such as an Adventure Force Commandfire or a Dart Zone Destructor? This could dramatically increase ammo capacity and would have the incidental advantage of making the blaster easier to control, as these blasters come with a fully motorized firing mechanism.
Just a thought for V2. You could perhaps only use 1 blaster at a time, but leave it running for 10-30 second blocks and then swap to the other blaster. Leaving it running in cycles will hopefully be enough to not cause issues with the blaster motors whilst also allowing the detection-to-shoot time to be largely reduced. Can't wait to see more of this project, I think it might be the push I need to finally take back up coding again👍
Definite shades of ED-209. "You have 20 seconds to comply"
I cannot wait for James to start using ROS!! It would be so cool to see his robots integrate SLAM and other frameworks like MoveIt!
It's coming
@@jamesbruton Awesome! Really huge fan btw, you have been an incredible inspiration for me; started building robots and implementing better design techniques/practices thanks to you!
Love this music for the 3D-printing!
James shooting Onision's head - priceless!!!
The company I work for makes industrial LIDAR scanners, fun to play with. Usually they are used for collision avoidance, although we have a measurement version also.
Love it, love it all. Both educational and entertaining. Keep up the awesome channel.
There are bad nerds, and there are good nerds but you are the best nerd
You should try using an electronic compass to directly match the LIDAR's angle.
Hm, no need for a compass to do that... The wheels can simply turn the robot to the angle where the LIDAR reports the target as being. The LIDAR isn't using Earth's magnetic field for reference; it's using its own chassis.
This is the most advanced nerf gun robot I have ever seen, but you made it sound understandable 😁👌
I think your printing of the distance values slowed down the measurements significantly; that's why the measurements looked so random. If you just keep the closest measured angle and only print that once every rotation, things may look less messy?
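Something along these lines might work, assuming you can tell when a new rotation starts (e.g. from the packet's start flag); the function and variable names are just illustrative:
// Keep only the closest reading per rotation and print once per revolution (sketch).
float closestMm  = 1e9;
int   closestDeg = -1;

void handleReading(int angleDeg, float distMm, bool newRotation) {
  if (newRotation) {
    if (closestDeg >= 0) {                  // print one summary line per full rotation
      Serial.print("closest: ");
      Serial.print(closestDeg);
      Serial.print(" deg at ");
      Serial.print(closestMm);
      Serial.println(" mm");
    }
    closestMm  = 1e9;                       // reset for the next revolution
    closestDeg = -1;
  }
  if (distMm > 0 && distMm < closestMm) {   // ignore zero/invalid readings
    closestMm  = distMm;
    closestDeg = angleDeg;
  }
}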
I think James really enjoyed this one; I swear you could see him fighting to keep the grins away hahahah. Good stuff, great content, been following since R2-D2.
Actually, this is the first robot itself that I see on this channel
Great ingenuity ! Well done !
From a software engineer's perspective: you can just store the initial time and measure the elapsed time every loop iteration, so you don't have to block/delay the code. Also, you've kind of built a state machine, where the flag just keeps track of the state.
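In Arduino terms that's the classic millis() pattern instead of delay(). A rough sketch, with startBlasterMotors() and pullTrigger() standing in for whatever the real code does:
// Non-blocking spin-up delay using elapsed time instead of delay() (sketch).
enum State { IDLE, SPINNING_UP, FIRING };
State state = IDLE;
unsigned long spinUpStart = 0;
const unsigned long SPIN_UP_MS = 500;   // made-up spin-up time

void startBlasterMotors() { /* your motor code */ }
void pullTrigger()        { /* your trigger servo code */ }

void loopStep(bool targetAcquired) {    // call this from loop() every iteration
  switch (state) {
    case IDLE:
      if (targetAcquired) { startBlasterMotors(); spinUpStart = millis(); state = SPINNING_UP; }
      break;
    case SPINNING_UP:
      if (millis() - spinUpStart >= SPIN_UP_MS) { pullTrigger(); state = FIRING; }
      break;
    case FIRING:
      // release the trigger and fall back to IDLE when done...
      break;
  }
}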
He likes to explain "for dummies" because he can't guarantee that every new viewer is a software engineer.
You are the best and the fastest in the entire world as of today when it comes to building new robotics projects. This definitely could be sold commercially; imagine killing all the pests in your fields, such as locusts taking over large villages and cities. It is like the advanced slingshots used by farmers and defence systems. There is a lot covered in this project which takes years of experience and hard work.
Awesome project, looking forward to more like this.
I presume the start bit indicates whether the scanned results belong to a new rotation. I guess they want you to move the data into an array, using the start bit to determine whether you should create a new object for the array...
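Yes, and in code that usually ends up looking something like this (field and function names made up, check the datasheet for the real packet layout):
// Collect one full rotation of readings into arrays, using the start flag (sketch).
const int MAX_POINTS = 400;
float scanAngle[MAX_POINTS];
float scanDist[MAX_POINTS];
int   scanCount = 0;

void processScan(float* angles, float* dists, int count) { /* use the finished rotation */ }

void onMeasurement(bool startFlag, float angleDeg, float distMm) {
  if (startFlag) {                                  // a new rotation has begun
    processScan(scanAngle, scanDist, scanCount);    // hand the previous rotation off
    scanCount = 0;                                  // start filling a fresh scan
  }
  if (scanCount < MAX_POINTS) {
    scanAngle[scanCount] = angleDeg;
    scanDist[scanCount]  = distMm;
    scanCount++;
  }
}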
I was hoping that it would actually roam around with a map of the house built in and, if it detects a large change in the map (i.e. someone gets in the robot's path), it would turn towards the target and fire. Maybe you can implement something like that one day, though the lidar can't really see that far. Maybe you can figure it out.
If you rev up the Nerf guns and do the turning after that, the Nerf shots will go off faster; it has the time it takes to turn. :) Nice project!!
Very good video. I have an old defective robot vacuum cleaner and I didn't know what to do with it; I'm going to use the included lidar to create a project! Now I need to see which model is in my Ecovac R95 :)
I really need this, love your work
Did almost the same at university last year (exactly the same Nerf gun)... but we used 3 ultrasonic sensors to locate targets in 3D.
So you have essentially made a sentry droid? Mate, you need to make it look like something out of Star Wars, that thing is frigging cool.
You could also use a Maixduino to recognize objects; you can buy it for about US$28 with a camera and display. It is an Arduino-compatible board.
You could make the thing load itself with ammo rather easily if you had an oscillating turret: basically the same horizontal slewing mechanism on the bottom, with the rest of the thing encased in a turret with no moving parts. The turret itself aims vertically through gears to shoot down or up. This allows a simple autoloader to be attached to the guns, without the complication of having to move the gun to the autoloader position every time it has to reload.
@James Bruton. Fantastic project. I'm doing a simplified version of the project to just control the chassis with encoder position. Why not use the LIDAR data to rotate until the detected object is within the ±5 degree window about the zero mark? Why use the encoder, when the LIDAR data could be a more reliable closed-loop system? Does my question make sense?
That's cool, respect for your skills, seriously! The only thing is, your robot is only moving around its own axis. It's basically a turret. When it drives, it would shoot at anything coming in range, wouldn't it? I don't think lidar is the right choice for the intended purpose of your robot. The idea at the end with face recognition would make more sense; then the robot could freely move without being stationary and shoot at things in proximity. Still, awesome project!
How about some facial recognition so it knows friend from intruder? Edit, doh, and that is why you should watch the video before commenting.
Learnt the hard way
Please can you explain how you manage to design, develop, print, build and refine such large projects every week? I'm dumbfounded that you can manage it. You must have a large team and multiple 3D printers.
So simple, yet so elegant :)
You are amazing my guy, wanna make robots like you
When you have it roaming around seeking to destroy, add some targets to it which temporarily disable it. Then when the pandemic is over, take it to that warehouse, set up some obstacles (cardboard boxes etc.) and invite people to play capture the flag against it. They can try to disable it with their own nerf shots whilst it seeks them out. I'd pay to play that 😂
I was wondering if you can use it to detect movement... But a movement detector is just a pretty basic version of this.
Can you set how hard it fires, depending on the distance to the target?
Also, does it need another Lidar for its movement, avoiding obstacles, and detecting moving targets? Or can it be done with just one, but with better software?
Depending on all of that, it can be a very cool toy! Add a controller, a camera with VR support, maybe a speaker + mic, and you can sell it as a toy. I'd buy some! :) ;)
On another note: Thank God it doesn't have an AI, or we would have been doomed!!!!! 🤣🤣🤣
Amazed how fast it rotates. It’s a little scary
He should put a small round disk below the li-dar unit to make it look like a top hat.
I badly need this for the mother in law....
Great design
Thanks for sharing👍😀
To improve precision you might need to consider SLAM algorithms. Plane detection and intersection, associated with an IMU that corrects your odometry, can help. Otherwise, for less than 180 euros (around the price of this simple lidar) you can get a quite precise T265 from Intel. It provides you with a good translation vector and rotation quaternion for every move (200 fps). Very handy. It works with a Jetson Nano.
It wouldn't be too much more work to code a target lock function. I think you'd have better luck with thermal vision over a regular camera though - simpler input and easier to quantify shapes.
Does the LIDAR work only on one level, or can you also tilt it? I think you should use some powerful software to make full use of the LIDAR, to be able to build a map of the surroundings that is continually updated; otherwise the robot is quite dumb and the power of the LIDAR is not used to the full. Right now you are using the LIDAR as if it were a directional ultrasonic sensor. Also, an independently rotating platform for the guns would be much cooler. Nevertheless, an amazing project and also inspirational!
This one is one laser line, you can get '3D' versions
Amazing project. Keep up the good work!
If you use the Jetson board it's not just ROS you'll be interested in; you might do some investigation into TensorFlow and Keras. ROS and the machine learning software support Python, which is great for prototyping, but you'll want to move these over to C++ ultimately to get the best response times and accuracy.
Love this project, but Jody Williams is coming for you James !
Ever had weird glitches? Tip: turn off interrupts if you do anything with interrupt variables. R/W is not atomic, aka they can (and Murphy tells us they will) change halfway through a R/W. And yeah, arrays xD
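On an 8-bit AVR that usually means wrapping the copy like this (sketch only, variable name invented):
// Safe copy of a multi-byte variable shared with an ISR (AVR-style sketch).
volatile uint16_t latestDistance;   // written inside the interrupt handler

uint16_t readLatestDistance() {
  noInterrupts();                   // a 16-bit read isn't atomic on an 8-bit AVR
  uint16_t copy = latestDistance;
  interrupts();
  return copy;
}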
It's not a clip, it's a magazine...
This thing is awesome though!
A small improvement could be to start the guns spinning up on detection, not after aiming. Might fire a little faster that way.
7-year old me would love this, current me also loves this!
Me as 7 year old..."what's your name".. the current me..."what's your name"..🤗
Nice job, inspiring as always.
Hi James, you could turn it into a small Dalek from Dr Who; that would be cool to see.
You should add some N-Strike ELITE round ammo drums to it. No design changes needed and you’d have 50 shots 😉
My mans is now literally the TF2 engineer.
He built a sentry
Hey James, would it be possible to use the lidar to do 3D scans of objects, rather than mapping a room? I'm envisioning a vertical axis that the lidar would move up and down on, coupled with a rotating platform to put the object on. So the scanner would start at the top or bottom of the vertical axis, the object would do a full 360 rotation while the lidar is scanning, then the scanner would move 1 increment and start the cycle again.
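Should be doable. Each reading would just be a point in cylindrical coordinates that you convert to XYZ; something like this, where r is the radius of the surface point from the turntable axis (i.e. the known lidar-to-axis distance minus the measured distance):
// Convert one reading from a turntable scanner to a 3D point (illustrative sketch).
// thetaDeg = turntable angle, zMm = current height of the lidar on the vertical axis,
// rMm = radius of the surface point from the turntable axis.
struct Point3D { float x, y, z; };

Point3D toCartesian(float rMm, float thetaDeg, float zMm) {
  float t = thetaDeg * PI / 180.0;
  Point3D p = { rMm * cos(t), rMm * sin(t), zMm };
  return p;
}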
Great video, love your channel. Would love to see parts lists in your comment section, so for example a link to buy the motors or the lidar. Thanks heaps.
James,
What is your preferred infill and filament material for your 3D prints? I'd like to try prototyping parts like you do for the most structurally sound framework.
Probably 4 perimeters and about 20-30%, that's with a 0.5mm nozzle though
This guy is gonna take down a whole military with Nerf darts
another great video! would love to see you improve it by making it roam around a room or something!
How long did it take to 3D print the parts? Very interesting; it could be used to keep the cat out of the vegetable patch.
You should attempt to make the closest equivalent to the “remote control Iron Man armor” in Iron Man 3. Make two suits: one that's almost an exoskeleton that can “sense your movement”, and a second suit that's a full robot with the full range of motion of a human. So the “robot suit” moves how you move when you wear the “exoskeleton”. That'll be awesome dude!!
James, do you realize that the 'Serial.print' command is very slow and shouldn't be used when high-speed timing is required? Take all serial commands out of the loop that reads or uses the Lidar. That's why you got such messy output in your original test script.
Another good example of what NOT to do.
Are you able to poll the lidar sensor in a timer interrupt or does the library support interrupts? It would likely help you get more consistent values if the code reading your sensor was deterministic.
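If the board happens to be a Teensy (just an assumption), IntervalTimer would give you deterministic polling; a minimal sketch:
// If the board is a Teensy (assumption), IntervalTimer gives deterministic polling.
IntervalTimer lidarTimer;
volatile bool samplePending = false;

void lidarTick() {            // keep the ISR tiny: just set a flag
  samplePending = true;
}

void setup() {
  Serial.begin(115200);
  lidarTimer.begin(lidarTick, 1000);   // fire every 1000 microseconds
}

void loop() {
  if (samplePending) {
    samplePending = false;
    // read and parse the lidar's serial buffer here, outside the ISR
  }
}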
That might be what the start bit is for?
I could do with one of these in my garden to keep the neighbours' cats off my flower and vegetable beds! :-)
if your goal is to scare off cats, replace the nerf guns with water pistols. XD