On a scale of “horribly underwhelming” to “ehh it’s alright,” how did you like this project?
This is honestly amazing. Your projects are really inspiring! I’ve been working on a little flight controller using an Arduino Nano for a fixed-wing configuration; nothing anywhere near close to what you have done, but your videos and tutorials have served as inspiration and guidance. Speaking of fixed wing, do you think this could be integrated into that type of configuration to do something like auto landing? I also noticed that you only have an IMU on the Teensy flight controller, but would it help the controller if you had, say, something like a BMP280 and an HMC5883 for both altitude and heading? (I could be wrong, but I didn’t see any other sensors in the footage.) Thank you for sharing your projects! Not only are they amazing, they’re also inspiring. Keep them coming!
Wish you could do it without markers, so it’d be able to say “oh, it’s a person, tree, truck, etc.”, and then test it on real-world objects in motion
@@spaceshuttle8332 Thanks so much, and best of luck on your project! I love the idea of auto landing a plane…
@@whizadree Replacing my ‘cheaty’ AprilTag tracking with some sort of machine learning model would absolutely be the next step, and would be a straightforward drop-in replacement
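For anyone curious what that swap would look like: the tag tracker just hands back a pixel-space target, so a learned detector can return the same thing. A minimal sketch, assuming a Python tracking node and the pupil_apriltags binding (not necessarily what's used in the video):

```python
import cv2
from pupil_apriltags import Detector  # assumed AprilTag binding

detector = Detector(families="tag36h11")

def find_target(frame_bgr):
    """Return the (u, v) pixel center of the tracked target, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    detections = detector.detect(gray)
    if not detections:
        return None
    # A machine learning detector would slot in right here instead:
    # run inference, pick the best bounding box, return its center.
    u, v = detections[0].center
    return (u, v)
```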
I wonder how difficult it would be to package an obstacle avoidance and planning system like this to work directly with ArduPilot. If it could somehow update mission waypoints on the fly like this, it could be used with multiple different platforms as an easy and accessible way to retrofit mission planning and optimization onto your DIY project, not to mention taking full advantage of ArduPilot's capabilities on top. Awesome vid man
Great stuff! The forced 'long way around' test was awesome!!
I was thinking of you when watching this video. Where have you been? I hope everything is ok 👍
Fun Fact: ArduPilot runs Dijkstra's algorithm and another path planner in real time. It uses a variety of inputs like multiple lidars/sonars, pre-programmed polygons (like trees and buildings), or other dynamically moving aircraft, such as ADS-B traffic and/or other drones on the same network.
Interesting, I knew they used Dijkstra's for planning around a pre-defined weirdly-shaped fence, but I have not seen any of the capabilities for dynamic/injected obstacles. Will have to look into it more. Thanks!
@@NicholasRehm Back in 2018, dynamic path planning and avoidance of moving obstacles were already in use and demonstrated. I remembered this demo from @Tom Pittenger:
ua-cam.com/video/VHeDL85iWdI/v-deo.html
Apart from path planning (we call our algorithm BendyRuler; it does dynamic obstacle avoidance), we also support 3D obstacle avoidance in manual modes: ua-cam.com/video/-6PWB52J3ho/v-deo.html
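For anyone wondering what grid Dijkstra actually looks like, here is a minimal sketch in plain Python (an illustration of the algorithm, not ArduPilot's implementation):

```python
import heapq

def dijkstra(neighbors, start, goal):
    """Shortest path on a weighted graph.
    neighbors(node) must yield (next_node, edge_cost) pairs."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale entry left over from an earlier relaxation
        for nxt, cost in neighbors(node):
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(pq, (nd, nxt))
    if goal != start and goal not in prev:
        return None  # no route found
    # Walk back from goal to start to recover the path
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```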
Autopilot drone races would be an interesting competition; maybe you can host an event? Maybe a course that has constantly changing obstacles, with a flight ceiling that would disqualify those that flew above it more than a set number of times (1-3). It would bring nerdy back into drone racing, while encouraging the improvement/evolution of flight controllers.
5:55 That's what Blue Origin said! Great video!
I was planning on doing something like this in my quest to build a smarter robotic mower, but you just blew my mind doing it with a drone! Lol!
This Channel will grow massively
Engineers really love to mess with stuff that can be revolutionized during their free time
Very well planned and executed, thanks for sharing. Greetings from Chile... ...and the MythBusters poster at the end... Subscribed!
Watching this made me remember the "Kalman filter" I heard about some time ago, which allows combining accelerometer and GPS data to improve the accuracy of the "current position estimate". You could probably do something similar by having some of these tags placed at fixed known positions, so the drone can "realign" its position without relying on GPS data.
Yup, factory robots use these a lot in warehouses to correct for drift in the Kalman state estimate, since the tags are always in known positions
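The fusion idea in a nutshell: integrate the accelerometer to predict, then correct whenever an absolute position fix arrives (GPS or a tag at a known location). A toy 1D constant-velocity sketch, with made-up noise values:

```python
import numpy as np

# Toy 1D constant-velocity Kalman filter: state = [position, velocity].
# The accelerometer drives the prediction; a GPS fix or a sighting of a
# tag at a known location provides the absolute position correction.
dt = 0.01                               # prediction timestep (s)
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
B = np.array([[0.5 * dt**2], [dt]])     # how acceleration enters the state
H = np.array([[1.0, 0.0]])              # we only measure position
Q = np.eye(2) * 1e-4                    # process noise (tune me)
R = np.array([[0.5]])                   # position measurement noise

x = np.zeros((2, 1))   # state estimate [pos; vel]
P = np.eye(2)          # estimate covariance

def predict(accel):
    """Propagate the state forward one step using the accelerometer."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def correct(pos_fix):
    """Fuse an absolute position fix (GPS or known-position tag)."""
    global x, P
    y = pos_fix - H @ x                 # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
```

Between tag sightings the covariance P grows and the estimate drifts; each correct() call snaps it back, which is exactly the "realign" behavior described above.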
I loved the cyclo drone
I caught a few of the Extremely subtle hints to subscribe, and did so. Thanks for the good content.
Glad my subliminal messaging worked lol
Great development. Thanks for sharing.
You are a genius, greetings from Spain
This blew my mind, it's truly DIY legendary. I'm new to drones, ROS, ... I'm starting to learn this stuff, and all of your projects have inspired me a lot
2:30 "the drone knows where it is at all times 🧐🧐"
By knowing where it isn’t
I love that you set up a full desktop outside rather than just dinking around on a laptop.
Very nice explanation of how it works. Just getting into this topic area and will go back through this video to get more details on your implementation. I want to do something similar with my R/C lawn mower and am considering ArduRover, but this approach combined with path planning/obstacle avoidance would be a very good approach.
I’d recommend a hybrid approach where you keep ArduRover for basically everything, and supplement it with a companion computer that sends it new waypoints/desired state data over MAVLink in something like Guided mode
This is so deeply nerdy and even with your excellent assistance remained over my head in a GREAT way.
Being over my head gives me something to reach for.
Great video, even more detail in the coding process would be cool. I love it.
Really appreciate it, I’ll try to find better ways in the future to make the coding bits less boring and include more of that
@@NicholasRehm You do you man. Don't worry about the algorithm or us. Make what makes you happy. We're happy to get to share it
2:15 The missile always knows where it is...
This is so cool. Very well explained
Andrew Barry (Boston Dynamics, MIT) created Pushbroom Stereo like 7/8 years ago; now it should be quite feasible without 120 FPS cameras or a stereo pair, since we have decent monocular depth estimation sorted!
Great video, and here I was, happy to have a quad running iNav RTH haha
I appreciate all the details!!
Immensely complex and high risk :)
Damn dude. Really interesting and mindblowing stuff. Subscribed! :)
Love it
WOW, keep up the great work!
This is all very interesting. I'm very grateful to have come across this channel.
Please keep up the great work and content. Thanks for sharing.
Thumbs up!! 👍
It's pretty darn cool, really, really.
Amazing project and explanation! Keep the good work coming. 👍🏻 You earned yourself a new subscriber right there! 😉
Lmao “immensely complex and high risk”
The drone knows where it is at all times. It knows this because it knows where it isn't. By subtracting where it is from where it isn't, or where it isn't from where it is, whichever is greater, it obtains a difference, or deviation. The guidance subsystem uses deviations to generate corrective commands to drive the drone from a position where it is to a position where it isn't and, arriving at a position where it wasn't, it now is. Consequently, the position where it is, is now the position that it wasn't, and it follows that the position that it was is now the position that it isn't. In the event that the position that it is in is not the position that it wasn't, the system has acquired a variation; the variation being the difference between where the drone is and where it isn't. If variation is considered to be a significant factor, it, too, may be corrected by the GPS. However, the drone must also know where it was. The drone guidance computer scenario works as follows: because a variation has modified some of the information the drone has obtained, it is not sure just where it is, however it is sure where it isn't, within reason, and it knows where it was. It now subtracts where it should be from where it wasn't, or vice versa. And by differentiating this from the algebraic sum of where it shouldn't be and where it was, it is able to obtain the deviation and its variation, which is called error.
AMEN
The drone knows where it is at all times. It knows this because it knows where it isn't. By subtracting where it is from where it isn't, or where it isn't from where it is (whichever is greater), it obtains a difference, or deviation. The guidance subsystem uses deviations to generate corrective commands to drive the drone from a position where it is to a position where it isn't, and arriving at a position where it wasn't, it now is. Consequently, the position where it is, is now the position that it wasn't, and it follows that the position that it was, is now the position that it isn't.
In the event that the position that it is in is not the position that it wasn't, the system has acquired a variation, the variation being the difference between where the drone is, and where it wasn't. If variation is considered to be a significant factor, it too may be corrected by the GEA. However, the drone must also know where it was.
The drone guidance computer scenario works as follows. Because a variation has modified some of the information the drone has obtained, it is not sure just where it is. However, it is sure where it isn't, within reason, and it knows where it was. It now subtracts where it should be from where it wasn't, or vice-versa, and by differentiating this from the algebraic sum of where it shouldn't be, and where it was, it is able to obtain the deviation and its variation, which is called error.
This is exactly what my partner and I are working on for our final year project.
Wow! Great video. Are you using machine learning / computer vision to recognize an object or a person?
You can make a legit drone home security system.
All you need is a wireless landing pad haha.
Can't wait for your other videos.
Thanks again!
Great work!
That is an awesome project..... well done..
Wow, what a great video with simple explanations. Great work buddy, you deserve millions of subscribers. Highly recommending your channel, and I actually enjoy watching your clips with fruitful info... Thanks, appreciated 👍👍👍
Thanks so much for the kind words. You're exactly the type of person I try to make videos for
Not an engineer, but it is awesome and really well explained. I definitely learned something interesting 😀
Glad to hear it!
Brilliant.
Good smarts shown in your project, but I have a basic question. Why are you using the optical sensor instead of, say, a front-facing lidar sensor, so that you could essentially maneuver in dark spaces? It looks like you can only fly this in daylight.
Very beautiful, but it needs faster hardware like an NVIDIA Jetson or Orin...
The drone knows where it is at all times. It knows this because it knows where it isn't.
I find it insane how you made your own FC ... code and all.
Thank you for demonstrating yet again that engineers can do anything. Also, could you point to resources and courses (if any) for learning more about ROS, computer vision, or basically how to make drones smart?
This is the class I took; all lectures and course material are free and open source: pear.wpi.edu/teaching/rbe595/fall2023.html I even gave a guest lecture last semester :)
Nice job! I am working on something very similar. May I ask a couple of questions? Are you sending position setpoints, velocity setpoints, or acceleration setpoints? Or a combination thereof? Are you using a downward facing optical flow connected directly to the px4? Are you using the px4 EKF2 estimator? Are you generating a map of the space that is stored for later, maybe using something like rtab_map?
The quad is stabilized at desired angle setpoints using my open-source Teensy-based flight controller, dRehmFlight VTOL. The Pi is sending angle setpoint commands based on position feedback from the RealSense camera and a cascaded PID controller I wrote. The PX4Flow sensor you saw on the underside is unused for this project; I was experimenting with my own visual-inertial odometry implementation before just settling on the RealSense camera for odometry. No PX4 or other pre-existing packages were used for this project, other than the AprilTag tracker, which was a ROS package I borrowed. Best of luck with your project!
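For a rough idea of what a cascaded loop like that can look like (an illustrative sketch with invented gains and limits, not the actual controller from the video): the outer position loop commands a velocity, the velocity loop commands a tilt angle, and the flight controller's inner loop stabilizes to that angle.

```python
# Cascaded position -> velocity -> angle loop. All gains are placeholders.
KP_POS = 1.0            # (m/s) of commanded velocity per m of position error
KP_VEL, KI_VEL = 0.2, 0.05
MAX_ANGLE = 15.0        # deg; don't ask the inner loop for more than this
vel_integral = 0.0

def position_controller(pos_setpoint, pos, vel, dt):
    """Return a roll/pitch angle setpoint (deg) for the flight controller."""
    global vel_integral
    vel_setpoint = KP_POS * (pos_setpoint - pos)    # outer P loop
    vel_error = vel_setpoint - vel
    vel_integral += vel_error * dt
    angle = KP_VEL * vel_error + KI_VEL * vel_integral  # velocity PI loop
    return max(-MAX_ANGLE, min(MAX_ANGLE, angle))   # saturate the command
```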
Very cool work. I've been thinking about doing something like this for fun/hobby. How are you controlling the quad from your computer? I've been trying to figure out this simple first step. Do you add Wi-Fi or something to your Teensy? Or are you broadcasting a signal directly to your quad receiver, like your computer is a transmitter? The computer transmitting directly to the quad receiver is what I've been trying to figure out.
nice bro
That was amazing, thanks!
The drone knows where it is at all times, by knowing where it isn't.
Excellent project, I love seeing people using ROS. You should give ROS 2 a try; the architecture is substantially more robust.
On to the questions:
What was the load like on the pi?
Why two cameras?
You mentioned moving away from the AprilTags to optical object detection. Have you considered using the depth data from the RealSense instead of optical?
What is the battery connector on post at the back for?
1. About 60% CPU load, mostly from the USB camera driver/AprilTag tracking
2. The RealSense camera is super wide-angle and relatively low resolution, so it's hard to remove image distortion for tracking algorithms to use. So it ended up just being easier to throw on a cheap HD USB camera for the target tracking
3. I don't know what a more robust obstacle detection scheme would look like at the moment, but it definitely could leverage the RealSense data!
4. This quad was initially built a while back for a competition which required a disarming key X inches away from the propellers. That connector just completes the circuit between the ESCs and the propulsion battery with a shorted XT60 connector
Make sure your collar’s straight the next time you do anything like this again. 😉 Great video with a clear explanation of technical data, sweet video. I’m eager to know where this goes. 😊
If possible, can you make another follow-up video where you go more into the algorithms?
The NASA team that wrote the motion planning software for the autonomous Mars rover used the A* algorithm.
This is so cool! I've been trying to do stuff like this but I've yet to gather all the software-related knowledge to achieve things like this.
It's an awesome project! Great job! Thanks for sharing with us.
Great video! Thanks a lot for sharing this with us :-) Are you still using your custom flight software, dRehmFlight? I am using it as well in a VTOL project. How do you plan to integrate with currently available path planning software? Will you, for example, implement MAVLink communication, or will you eventually switch to PX4/ArduPilot? Your firmware is great to get started with as it is intuitive, but I realized that it has some limitations when it comes to interfacing with some of the open-source software out there. Do you plan to switch firmware, re-implement your own path planning/ground station software, or aim to make your firmware compatible with current software? What is your take on that? I find it very difficult to decide which direction to go. Reimplementing stuff is great if you want to learn how things work; on the other hand, using available software as much as possible accelerates the development. Thanks a ton for your answer.
Yea, dRehmFlight for the inner loop stabilization but none of the autonomy. All of that was done on the pi in ROS and was also all custom. I think dRehmFlight is going to stay just about what it is in its current state so it can stay more of a teaching/learning tool and less of a competitor in the heavily developed autopilot market. The beauty of it in my opinion is just how fast you can splice in some simple code to get custom functionality, but of course the drawback is the lack of heavily developed features like autonomy or immediate compatibility with other systems
2:16
The missile knows where it is at all times
May I know where I can find the circuit schematics, code, etc. for this project? It would be really helpful for me
It's a good level of weeds for me. Well done video.
I beg you, build Justin Capra's YR-VVI; that should mean you no longer need a propeller. I'd love to see a thing like that fly, even if in model version.
If there's one person who can do it, it's you. Although it won't stop me from trying it myself in the future with my basic RC plane and drone building experience XD
I love it! The Civ Div channel just did a video where he talks about bringing a DJI Mavic with him into Ukraine, and how they use jammers so their cheap drones just fall out of the sky; so for the money, he'd rather invest in better boots. But with some hardware and some hacking on iNav, these cheap drones could just return to home instead.
Eh it's alright!
Just recently found this channel and going to give some of this stuff a try!
Amazing! I wonder if the integration with MAVROS, or the ROS wrapper used, is available for the dRehmFlight software?
dRehmFlight doesn’t support MAVLink
Where are you from? I’ve got some motors and frames I can send you.
Lovely lovely
Awesome ❣️🔥
I'm attracted to the idea of having a quad flying through my woods doing my patrol for me. ;)
Amazing project and video! Very clear explanation of how your drone roughly works
I do have a question, though, that you may have answered in the video, but it seems like I missed it.
At 11:45, why did the drone backtrack and take the longer path to the right of the obstacles rather than going left from the leftmost obstacle? Is it because we predefined the area in which nodes and edges can exist?
Correct, the grid overlay showed viable nodes/edges that it could plan over, which was just an arbitrary rectangle within my backyard (with margin so I wouldn’t hit the tree!)
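A sketch of how a planning grid like that overlay could be generated: lattice nodes inside a rectangle, minus anything within an inflated obstacle radius. All numbers and names here are invented for illustration:

```python
GRID_PITCH = 1.0   # m between lattice nodes
MARGIN = 1.0       # obstacle inflation, so the planner keeps clearance

def build_nodes(x_min, x_max, y_min, y_max, obstacles):
    """obstacles: list of (x, y, radius). Returns the set of free nodes."""
    nodes = set()
    x = x_min
    while x <= x_max:
        y = y_min
        while y <= y_max:
            # Keep the node only if it is outside every inflated obstacle
            if all((x - ox)**2 + (y - oy)**2 > (r + MARGIN)**2
                   for ox, oy, r in obstacles):
                nodes.add((x, y))
            y += GRID_PITCH
        x += GRID_PITCH
    return nodes

def neighbors(node, nodes):
    """8-connected edges to adjacent free nodes, with Euclidean cost."""
    x, y = node
    for dx in (-GRID_PITCH, 0, GRID_PITCH):
        for dy in (-GRID_PITCH, 0, GRID_PITCH):
            nxt = (x + dx, y + dy)
            if nxt != node and nxt in nodes:
                yield nxt, (dx * dx + dy * dy) ** 0.5
```

Anything outside the rectangle simply never becomes a node, which is why the planner in the video can only route within the predefined area.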
@@NicholasRehm Ah yes, that makes a lot of sense! Thank you for the speedy reply
What are the applications that you are using to test your drones?
Graph theory?
Subbed!
hell yea
Hi! That was entertaining and very interesting. Thank you for making this video. I should put this on my pretty dumb “robotic” lawn mower.
I have definitely thought of doing an automated lawnmower, haha. Thanks for the comment
@@NicholasRehm Then you have to deal with the "negative" obstacles, too (I have a cliff-like drop off on two sides of my field). Time for a "It's the end of the world as we know it" algo. :-)
I just stumbled on this video. I am very impressed with your demonstration of DIY autonomous drone flight. I am looking for a similar drone navigation system without the use of GPS. I am planning to scan a cell tower using a cylindrical coordinate system, using 4 ground targets as a means to position the drone on a cylindrical grid and collect RF signals. The position accuracy could be in mm. How would you go about doing such a design? Please send your contact info. Thanks
Hey Nicholas! I'm working on Motion Planning for UAVs using Motion Capture and I really need your help! Can we have a discussion please??
Legendary effort! Very VERY cool! A great production! Remember watching Terminator as a kid? I think this was just as exciting. :D :D :D Bravo!!!!!!!! Well done!!!!!!
Is there any way to get a parts list, or a step-by-step guide?
Hey, I just learned control systems and state space, and was wondering how I will feed a reference path so that my MPC can make the UAV follow it
I have a problem with the F-35 motor calibration, please help
The missile knows where it is.
_"...unless you really like pain."_
THAT is why I just settle for watching videos of this stuff...😉
What are you doing in Switzerland? (1:22)
btw, very helpful video, I'm planning to do something similar soon
Default SITL location haha
What animation software is used at 00:56, "How Waypoint Autonomy Works"?
TLDW: The missile knows.
Could you share the ROS source code for this? What ROS nodes and ROS topics do you use? I'm researching the same area right now, so it would be helpful to know the references for what I'm doing. Thank you
this would be great for a lawnmower mod, please make one
Very nice video, but what is the rate of both the AprilTag detector and the Dijkstra algorithm? Considering they're very computationally heavy
About 10 Hz on both of them
Hi, great video! What software did you use in this video?
dRehmFlight and ROS
How does the drone know its starting and goal point? How are these nodes and transfer points defined in the implementation?
I tell it the start and goal point, and it'll fly to the start on takeoff, wait till I let everything start running, and then figure out how to get to the goal location. Manually defining the start and goal node locations is part of the motion planning algorithm, kinda like asking Google Maps to plan a route for you
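Tying the earlier sketches together, a hypothetical end-to-end call (using the invented build_nodes/neighbors/dijkstra helpers from the snippets above):

```python
# Plan over a 20 m x 15 m rectangle with two inflated obstacles
free = build_nodes(0, 20, 0, 15, obstacles=[(8, 6, 1.0), (12, 9, 1.0)])
path = dijkstra(lambda n: neighbors(n, free),
                start=(0.0, 0.0), goal=(20.0, 15.0))
```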
Did you make this flight controller?? How??
Love you man ❤️😘
Firstly, I love the hard work behind this video.
Do you have any blog or document on this? I'm a college student & I want to build exactly the same with ROS 2 and a Jetson. And that's my dream 😅. I need some guidance pls.
If you shoot me an email I can send you a paper from this project
Amazing
Hey, that's good. One small question: there is freedom of movement on top, right? Why can't we make the drone fly a bit higher to avoid an obstacle instead of going around it? If that's possible, let me know the procedure?
I constrained the problem to a 2D plane. What if it were in a cave and couldn’t go over?
@@NicholasRehm Right. How would you overcome that?
My biggest hurdle: ROS.
Tried getting into it on an NVIDIA Jetson board, and also had a Teensy 4 as my UART interface with my rover.
Guess I have to give it another try some time. Did you come across any helpful starting resources for ROS?
Starting from a raspberry pi image from ubiquity robotics with a clean install of ROS + dependencies was a lifesaver. Then going in and making your own “templates” for each node is really helpful to get an understanding of how data is handled by ROS. Once you get all the ROS topics into and out of each script, writing the code is the easy/fun part
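In case it helps anyone starting out, here's a bare-bones rospy node of the "template" kind described above; the topic names and message types are placeholders:

```python
#!/usr/bin/env python
# Minimal ROS node template: subscribe to an input topic, publish a result.
import rospy
from geometry_msgs.msg import PoseStamped, Vector3

class TemplateNode:
    def __init__(self):
        rospy.init_node("template_node")
        self.pub = rospy.Publisher("/angle_setpoint", Vector3, queue_size=1)
        rospy.Subscriber("/drone_pose", PoseStamped, self.pose_callback)

    def pose_callback(self, msg):
        # All the actual work happens here, once per incoming message
        out = Vector3()
        out.x = msg.pose.position.x  # placeholder computation
        self.pub.publish(out)

if __name__ == "__main__":
    TemplateNode()
    rospy.spin()  # hand control to ROS; callbacks fire as data arrives
```

Once a template like this works, each new node is just a copy with different topics and a different callback body, which is exactly why the data flow becomes the easy part.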
Which simulation software is used here to replicate the real drone visuals?
MATLAB
PX4 has the local planner module to achieve the same thing
somebody had to implement that, too
Hello sir, I'm trying to build a basic quadcopter with a NodeMCU... I need some help. Struggling with PID tuning
Try adjusting the gains
Hi! This is cool and all, but can I ask a question? How does your companion computer, in this case your RPi, communicate with the flight controller of the drone?
Just a basic serial connection
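For reference, a link like that can be as simple as the Pi packing setpoints into bytes over UART. A hypothetical sketch with pyserial; the port and packet format are invented, not dRehmFlight's:

```python
import struct
import serial  # pyserial

# Assumed port/baud; match whatever the flight controller's UART expects
ser = serial.Serial("/dev/ttyAMA0", 115200, timeout=0.01)

def send_setpoint(roll_deg, pitch_deg, yaw_deg, throttle):
    # Start byte + four little-endian floats, so the Teensy can frame it
    packet = b"\xAA" + struct.pack("<4f", roll_deg, pitch_deg, yaw_deg, throttle)
    ser.write(packet)
```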
Have you considered using an NVIDIA Jetson board? (Disclaimer: I work there!)
A bit more expensive but a lot more powerful and pretty power-efficient; given how much planning computation you're doing, you might benefit from that. Works well with ROS 2 as well (check out NVIDIA Isaac ROS) for superfast AprilTag detection, object detection, depth segmentation, the list goes on... Could be a cool add-on :)
Definitely on the list for a future build
@@NicholasRehm awesome, can't wait :)