GPS-Denied, Anti-Jam Autonomous Drone: How It Works

  • Published 2 Feb 2025

COMMENTS • 252

  • @NicholasRehm
    @NicholasRehm  3 роки тому +168

    On a scale of “horribly underwhelming” to “ehh it’s alright,” how did you like this project?

    • @spaceshuttle8332
      @spaceshuttle8332 3 роки тому +21

      This is honestly amazing. Your projects are really inspiring! I’ve been working on a little flight controller using an Arduino Nano for a fixed wing configuration; nothing anywhere near close to what you have done, but your videos and tutorials have served as inspiration and guidance. Speaking of fixed wing, do you think this could be integrated into that type of configuration to do something like auto landing? I also noticed that you only have an IMU on the Teensy flight controller, but would it help the controller if you had, say something like a BMP 280 and an HMC5883 for both altitude and heading? (I could be wrong but didn’t see any other sensors in the footage). Thank you for sharing your projects! Not only are they amazing, they’re also inspiring. Keep them coming!

    • @whizadree
      @whizadree 3 роки тому +1

      Wish you could do it without markers, so it would be able to say "oh, it's a person, tree, truck, etc." and then test it on real-world objects in motion

    • @NicholasRehm
      @NicholasRehm  3 роки тому +6

      @@spaceshuttle8332 Thanks so much, and best of luck on your project! I love the idea of auto landing a plane…

    • @NicholasRehm
      @NicholasRehm  3 роки тому +5

      @@whizadree Replacing my ‘cheaty’ April tag tracking with some sort of machine learning model would absolutely be the next step, and would be a straightforward drop-in replacement

    • @robelefrem9734
      @robelefrem9734 3 роки тому +4

      I wonder how difficult it would be to package an obstacle avoidance and planning system like this to work directly with ArduPilot. If it could somehow update mission nodes on the fly like this, it could be used with multiple different platforms as an easy and accessible way to retrofit mission planning and optimization onto your DIY project, not to mention being able to take full advantage of the capabilities of ArduPilot in addition. Awesome vid man

  • @iforce2d
    @iforce2d 3 роки тому +67

    Great stuff! The forced 'long way around' test was awesome!!

    • @goobisoft4873
      @goobisoft4873 3 роки тому +2

      I was thinking of you when watching this video. Where have you been? I hope everything is ok 👍

  • @TomPittenger
    @TomPittenger 3 роки тому +50

    Fun Fact: ArduPilot runs Dijkstra's algorithm and another path planner in realtime. It uses a variety of inputs like multiple lidars/sonars, pre-programmed polygons (like trees and buildings), or other dynamically moving aircraft such as ADS-B traffic and/or other drones on the same network.

    • @NicholasRehm
      @NicholasRehm  3 роки тому +6

      Interesting, I knew they used Dijkstra's for planning around a pre-defined weirdly-shaped fence, but have not seen any of the capabilities for dynamic/injected obstacles. Will have to look into it more. Thanks!

    • @luisvale
      @luisvale 3 роки тому

      @@NicholasRehm Back in 2018, dynamic path planning and avoiding dynamic obstacles were already in use and demonstrated. I remember this demo, @Tom Pittenger:
      ua-cam.com/video/VHeDL85iWdI/v-deo.html

    • @rishabhsingh5488
      @rishabhsingh5488 3 роки тому +5

      Apart from path planning (we call our algorithm BendyRuler; it does dynamic obstacle avoidance), we also support 3-D obstacle avoidance in manual modes: ua-cam.com/video/-6PWB52J3ho/v-deo.html
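
For readers curious what the Dijkstra planning mentioned in this thread looks like in code, here is a minimal, illustrative Python sketch (not ArduPilot's implementation); the graph format used is an assumption chosen for the example.

```python
# Minimal sketch of Dijkstra's algorithm on a generic graph. Illustrative Python
# only; the {node: {neighbor: cost}} graph format is an assumption for the example.
import heapq

def dijkstra(graph, start, goal):
    """graph: {node: {neighbor: edge_cost}}. Returns (cost, path) or (inf, [])."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]                          # (cost-so-far, node)
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:                         # goal reached with optimal cost
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        if d > dist.get(node, float("inf")):
            continue                             # stale queue entry, skip it
        for nbr, cost in graph[node].items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    return float("inf"), []

# Example: a tiny graph where the direct edges are expensive
g = {"A": {"B": 1, "C": 4}, "B": {"C": 1, "D": 5}, "C": {"D": 1}, "D": {}}
print(dijkstra(g, "A", "D"))                     # -> (3.0, ['A', 'B', 'C', 'D'])
```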

  • @ITpanda
    @ITpanda 3 роки тому +10

    Autopilot drone races would be an interesting competition; maybe you could host an event? Ideally a course with constantly changing obstacles, and a flight ceiling that would disqualify anyone who flew above it more than a set number of times (1-3). It would bring the nerdy side back into drone racing, while encouraging the improvement/evolution of flight controllers.

  • @epicdaniel508
    @epicdaniel508 3 роки тому +6

    5:55 That's what Blue Origin said! Great video!

  • @TurpInTexas
    @TurpInTexas 3 роки тому +12

    I was planning on doing something like this in my quest to build a smarter robotic mower, but you just blew my mind doing it with a drone! Lol!

  • @pisekanha
    @pisekanha 6 днів тому +1

    Great video!

  • @Blubb3rbub
    @Blubb3rbub 2 роки тому +1

    Watching this made me remember the "Kalman filter" I heard about some time ago, which can combine accelerometer and GPS data to improve the accuracy of the current position estimate. You could probably do something similar by having some of these tags placed at fixed, known positions, so the drone can "realign" its position without relying on GPS data.

    • @NicholasRehm
      @NicholasRehm  2 роки тому +1

      Yup, warehouse robots use these a lot to correct for drift in the Kalman state estimate, since the tags are always at known positions
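
A toy sketch of the Kalman-filter idea discussed above: dead-reckoned position drifts, and an occasional absolute fix (a tag at a known location, or GPS) pulls the estimate back. All noise values and numbers are made-up illustration values, not anything from the video.

```python
# 1-D constant-velocity Kalman filter: predict from acceleration, correct with an
# absolute position fix. Purely illustrative; all tuning values are made up.
import numpy as np

class Kalman1D:
    """State = [position, velocity]."""
    def __init__(self, dt):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity model
        self.B = np.array([[0.5 * dt**2], [dt]])     # how acceleration enters
        self.H = np.array([[1.0, 0.0]])              # we measure position only
        self.Q = np.diag([1e-4, 1e-3])               # process noise (drift)
        self.R = np.array([[0.05]])                  # measurement noise of the fix
        self.x = np.zeros((2, 1))                    # state estimate [pos; vel]
        self.P = np.eye(2)                           # state covariance

    def predict(self, accel):
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, measured_pos):
        y = np.array([[measured_pos]]) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R            # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P

kf = Kalman1D(dt=0.1)
for _ in range(50):
    kf.predict(accel=0.02)      # biased accelerometer -> the estimate drifts
kf.update(measured_pos=0.0)     # a tag at a known spot says "you are at 0"
print(kf.x.ravel(), np.diag(kf.P))
```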

  • @Paranalense
    @Paranalense 3 роки тому +4

    Very well planned and executed, thanks for sharing. Greetings from Chile... ...and the Mythbusters poster in the end... Subscribed!

  • @williamgillespie4459
    @williamgillespie4459 2 роки тому +1

    I caught a few of the Extremely subtle hints to subscribe, and did so. Thanks for the good content.

    • @NicholasRehm
      @NicholasRehm  2 роки тому

      Glad my subliminal messaging worked lol

  • @sang84119
    @sang84119 3 роки тому +1

    This blew my mind, it's truly legendary DIY. I'm new to drones, ROS, ... I'm starting to learn this stuff, and all of your projects have inspired me a lot

  • @ThereAreNoHandlesLeft
    @ThereAreNoHandlesLeft 3 роки тому +1

    This is so deeply nerdy and even with your excellent assistance remained over my head in a GREAT way.
    Being over my head gives me something to reach for.
    Great video, even more detail in the coding process would be cool. I love it.

    • @NicholasRehm
      @NicholasRehm  3 роки тому

      Really appreciate it, I’ll try to find better ways in the future to make the coding bits less boring and include more of that

    • @ThereAreNoHandlesLeft
      @ThereAreNoHandlesLeft 3 роки тому +2

      @@NicholasRehm You do you man. Don't worry about the algorithm or us. Make what makes you happy. We're happy to get to share it

  • @moisesaragon4008
    @moisesaragon4008 3 роки тому +1

    You are a genius, greetings from Spain

  • @livinginthegardenhouse9755
    @livinginthegardenhouse9755 3 роки тому +1

    This Channel will grow massively

  • @Aaron_b_c
    @Aaron_b_c 2 роки тому +1

    I love that you set up a full desktop outside rather than just dinking around on a laptop.

  • @n0t_UN_Owen
    @n0t_UN_Owen 3 роки тому +5

    Engineers really love to mess with stuff that can be revolutionized during their free time

  • @poporbit2432
    @poporbit2432 3 роки тому +2

    Great development. Thanks for sharing.

  • @iAkashPaul
    @iAkashPaul Рік тому

    Andrew Barry (Boston Dynamics, MIT) created pushbroom stereo 7-8 years ago; now it should be quite feasible without 120 FPS cameras or a stereo pair, since we have decent monocular depth estimation sorted!

  • @MrMaufera
    @MrMaufera 3 роки тому +2

    I loved the cyclo drone

  • @weirdsciencetv4999
    @weirdsciencetv4999 Місяць тому

    Use YOLO to identify objects, then a rangefinder to get the distance to that object. Then vector-correlate it to a database established from Google Street View. Put this camera and rangefinder on a gimbal to scan around. Be sure to properly boresight the camera to the rangefinder.
    Add a state estimator to fuse odometry with detected landmarks.
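
A small geometric sketch of one step in the pipeline proposed above: turning a gimbal pan/tilt angle plus a rangefinder distance into a world-frame landmark position. It assumes boresighted sensors and attitude-compensated angles; the function and variable names are illustrative, not from the video.

```python
# Convert a gimbal bearing plus a measured range into a landmark position in the
# local NED frame, given the drone's own position estimate. Illustration only.
import math

def landmark_world_position(drone_pos, pan_rad, tilt_rad, range_m):
    """drone_pos: (north, east, down) in meters. Returns landmark (n, e, d)."""
    horiz = range_m * math.cos(tilt_rad)          # horizontal component of the ray
    dn = horiz * math.cos(pan_rad)                # north offset
    de = horiz * math.sin(pan_rad)                # east offset
    dd = range_m * math.sin(tilt_rad)             # positive tilt = looking down
    return (drone_pos[0] + dn, drone_pos[1] + de, drone_pos[2] + dd)

# A detection 30 deg right of north, 10 deg below the horizon, 25 m away,
# seen from a drone hovering 5 m above the local origin:
print(landmark_world_position((0.0, 0.0, -5.0),
                              math.radians(30), math.radians(10), 25.0))
```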

  • @lundebc
    @lundebc 3 роки тому +3

    Very nice explanation of how it works. Just getting into this topic area and will go back through this video to get more details on your implementation. I want to do something similar with my R/C lawn mower and am considering ArduRover, but this approach combined with path planning/obstacle avoidance would be a very good fit.

    • @NicholasRehm
      @NicholasRehm  3 роки тому +2

      I’d recommend a hybrid approach where you keep ardurover for basically everything, and supplement it with a companion computer to send it new waypoints/desired state data over mavlink in something like guided mode
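
A hedged sketch of the hybrid approach Nicholas describes: a companion computer driving an ArduPilot vehicle in GUIDED mode by streaming position targets over MAVLink, here using pymavlink. The connection string, frame, and setpoint values are placeholders to adapt to your own telemetry link and vehicle; this is not code from the video.

```python
# Companion-computer sketch: send a local-NED position target to an ArduPilot
# vehicle in GUIDED mode via pymavlink. Connection string and setpoints are
# example placeholders.
from pymavlink import mavutil

master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")  # e.g. telemetry over UDP
master.wait_heartbeat()                                     # learn target sys/comp IDs
master.set_mode("GUIDED")                                   # ArduPilot guided mode

# Only the position fields are used; this type_mask ignores velocity/accel/yaw.
POSITION_ONLY = 0b0000111111111000

master.mav.set_position_target_local_ned_send(
    0,                                    # time_boot_ms (0 = not used)
    master.target_system,
    master.target_component,
    mavutil.mavlink.MAV_FRAME_LOCAL_NED,  # x north, y east, z down (meters)
    POSITION_ONLY,
    5.0, 3.0, -2.0,                       # go 5 m north, 3 m east, 2 m up
    0, 0, 0,                              # vx, vy, vz (ignored)
    0, 0, 0,                              # afx, afy, afz (ignored)
    0, 0)                                 # yaw, yaw_rate (ignored)
```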

  • @alphanomega888
    @alphanomega888 Місяць тому

    Excellent work dude!

  • @AERR_FPV
    @AERR_FPV 3 роки тому +2

    Love it

  • @gordon1201
    @gordon1201 3 роки тому +1

    This is so cool. Very well explained

  • @robmulally
    @robmulally 3 роки тому +1

    Great video, and here I was happy to have a quad running inav rth haha

  • @moodberry
    @moodberry 2 роки тому

    Good smarts shown in your project, but I have a basic question. Why are you using the optical sensor instead of, say, a front-facing lidar sensor, so that you could essentially maneuver in dark spaces? It looks like you can only fly this in daylight.

  • @Knutahh
    @Knutahh 3 роки тому +1

    Damn dude. Really interesting and mindblowing stuff. Subscribed! :)

  • @sohaibimran9
    @sohaibimran9 9 днів тому

    totally awesome! but why is that xt-60 connector suspended way above everything else, or is it something else?

    • @NicholasRehm
      @NicholasRehm  9 днів тому +1

      It’s an arming plug

    • @sohaibimran9
      @sohaibimran9 9 днів тому

      @@NicholasRehm yeah, but why is it so high? any specific reason?

  • @manjunathayr9348
    @manjunathayr9348 3 роки тому +2

    It's pretty darn cool really really

  • @omarramadan5185
    @omarramadan5185 Рік тому

    Amazing project and video! Very clear explanation of how your drone roughly works.
    I do have a question, though, that you may have answered in the video but I missed.
    At 11:45, why did the drone backtrack and take the longer path to the right of the obstacles rather than going left around the leftmost obstacle? Is it because we predefined the area in which nodes and edges can exist?

    • @NicholasRehm
      @NicholasRehm  Рік тому

      Correct, the grid overlay showed viable nodes/edges that it could plan over, which was just an arbitrary rectangle within my backyard (with margin so I wouldn’t hit the tree!)

    • @omarramadan5185
      @omarramadan5185 Рік тому

      @@NicholasRehm Ah yes, that makes a lot of sense! Thank you for the quick reply
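
To make the "grid of viable nodes/edges inside an arbitrary rectangle" that Nicholas describes above concrete, here is an illustrative sketch of building such a planning graph, dropping nodes near obstacles. Dimensions and margins are arbitrary example values; the output uses the same format as the Dijkstra sketch earlier in this section.

```python
# Discretise a bounded rectangle into grid nodes, drop nodes within a keep-out
# margin of detected obstacles, and connect the rest 4-ways. Illustration only.
import math

def build_grid_graph(width_m, height_m, cell_m, obstacles, margin_m):
    """obstacles: list of (x, y) points; nodes within margin_m of one are blocked."""
    nx, ny = int(width_m / cell_m) + 1, int(height_m / cell_m) + 1

    def blocked(i, j):
        x, y = i * cell_m, j * cell_m
        return any(math.hypot(x - ox, y - oy) < margin_m for ox, oy in obstacles)

    free = {(i, j) for i in range(nx) for j in range(ny) if not blocked(i, j)}
    graph = {}
    for (i, j) in free:
        nbrs = {}
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if (i + di, j + dj) in free:
                nbrs[(i + di, j + dj)] = cell_m      # uniform edge cost
        graph[(i, j)] = nbrs
    return graph

# 10 m x 6 m backyard rectangle, 1 m cells, two obstacles with a 1.5 m keep-out:
g = build_grid_graph(10, 6, 1.0, obstacles=[(4, 3), (6, 3)], margin_m=1.5)
print(len(g), "free nodes")
```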

  • @LuigiDaMonster
    @LuigiDaMonster Рік тому

    This is all very interesting. I'm very grateful to have come across this channel.
    Please keep up the great work and content. Thanks for sharing.
    Thumbs up!! 👍

  • @skylerjohnson9089
    @skylerjohnson9089 2 місяці тому

    I appreciate all the details!!

  • @drewskiakg2719
    @drewskiakg2719 2 роки тому +1

    Wow! Great video. Are you using machine learning, computer vision to recognize an object or a person?
    You can make a legit drone home security system.
    All you need is a wireless landing pad haha.
    Can't wait for your other videos.
    Thanks again!

  • @kiphaynes
    @kiphaynes 3 роки тому +2

    Nice job! I am working on something very similar. May I ask a couple of questions? Are you sending position setpoints, velocity setpoints, or acceleration setpoints? Or a combination thereof? Are you using a downward facing optical flow connected directly to the px4? Are you using the px4 EKF2 estimator? Are you generating a map of the space that is stored for later, maybe using something like rtab_map?

    • @NicholasRehm
      @NicholasRehm  3 роки тому +2

      The quad is stabilized at desired angle setpoints using my opensource teensy-based flight controller dRehmFlight VTOL. The pi is sending angle setpoint commands based on position feedback from the realsense camera and a cascaded PID controller I wrote. The px4flow sensor you saw on the underside is unused for this project; I was experimenting with my own visual-inertial odometry implementation before just settling on the realsense camera for odometry. No px4 or other pre-existing packages were used for this project other than the apriltag tracker which was a ROS package I borrowed. Best of luck with your project!
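
An illustrative sketch of the cascaded-PID structure Nicholas describes (an outer position loop producing a velocity setpoint, an inner loop producing the angle setpoint that the flight controller stabilizes). Gains, limits, and rates are made up for the example; this is not the controller used in the video.

```python
# Cascaded PID sketch: position error -> velocity setpoint -> tilt-angle setpoint.
# All gains and limits below are placeholder values for illustration.

class PID:
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd, self.out_limit = kp, ki, kd, out_limit
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err, dt):
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(-self.out_limit, min(self.out_limit, out))   # saturate output

# Outer: position error -> velocity setpoint (m/s); inner: velocity error -> angle (deg)
pos_pid = PID(kp=0.8, ki=0.0, kd=0.1, out_limit=1.5)
vel_pid = PID(kp=8.0, ki=0.5, kd=0.2, out_limit=20.0)

def angle_setpoint(pos_sp, pos_meas, vel_meas, dt=0.02):
    vel_sp = pos_pid.step(pos_sp - pos_meas, dt)      # outer loop, e.g. 50 Hz
    return vel_pid.step(vel_sp - vel_meas, dt)        # pitch/roll angle command

print(angle_setpoint(pos_sp=2.0, pos_meas=0.0, vel_meas=0.0))
```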

  • @overloader7900
    @overloader7900 Рік тому +4

    2:15 The missile always knows where it is...

  • @muessliemix
    @muessliemix 3 роки тому +2

    Amazing project and explanation! Keep the good work coming. 👍🏻 You earned yourself a new subscriber right there! 😉

  • @8bithavok
    @8bithavok 2 роки тому +1

    The drone knows where it is at all times. It knows this because it knows where it isn't. By subtracting where it is from where it isn't, or where it isn't from where it is, whichever is greater, it obtains a difference, or deviation. The guidance subsystem uses deviations to generate corrective commands to drive the drone from a position where it is to a position where it isn't and, arriving at a position where it wasn't, it now is. Consequently, the position where it is, is now the position that it wasn't, and it follows that the position that it was is now the position that it isn't. In the event that the position that it is in is not the position that it wasn't, the system has acquired a variation; the variation being the difference between where the drone is and where it isn't. If variation is considered to be a significant factor, it, too, may be corrected by the GPS. However, the drone must also know where it was. The drone guidance computer scenario works as follows: because a variation has modified some of the information the drone has obtained, it is not sure just where it is, however it is sure where it isn't, within reason, and it knows where it was. It now subtracts where it should be from where it wasn't, or vice versa. And by differentiating this from the algebraic sum of where it shouldn't be and where it was, it is able to obtain the deviation and its variation, which is called error.

  • @SkivaksXD
    @SkivaksXD 2 роки тому

    The drone knows where it is at all times. It knows this because it knows where it isn't. By subtracting where it is from where it isn't, or where it isn't from where it is (whichever is greater), it obtains a difference, or deviation. The guidance subsystem uses deviations to generate corrective commands to drive the drone from a position where it is to a position where it isn't, and arriving at a position where it wasn't, it now is. Consequently, the position where it is, is now the position that it wasn't, and it follows that the position that it was, is now the position that it isn't.
    In the event that the position that it is in is not the position that it wasn't, the system has acquired a variation, the variation being the difference between where the drone is, and where it wasn't. If variation is considered to be a significant factor, it too may be corrected by the GEA. However, the drone must also know where it was.
    The drone guidance computer scenario works as follows. Because a variation has modified some of the information the drone has obtained, it is not sure just where it is. However, it is sure where it isn't, within reason, and it knows where it was. It now subtracts where it should be from where it wasn't, or vice-versa, and by differentiating this from the algebraic sum of where it shouldn't be, and where it was, it is able to obtain the deviation and its variation, which is called error.

  • @bbqotpewi
    @bbqotpewi 3 роки тому +1

    What are you doing in switzerland? (1:22)
    btw, very helpful video, I'm planning to do something similar soon

  • @aaravsingh1736
    @aaravsingh1736 27 днів тому

    I am currently working on a similar project, just not on a drone but a model rocket. I only clicked on this video because I saw the "GPS-Denied" part. I see you use stereoscopic cameras for the motion tracking in 3D space. However, a question: since it uses visual features and cues to calculate its translation in real time, how will it work in a place where there aren't many features, specifically just the sky, or over a lake or ocean or something like that? Your project is amazing, but just denying GPS means that your use case is very limited. This gives you more precision but loses a lot of places where it can be used. Any idea if you could fuse the GPS data and the camera data to get extremely precise position data?

  • @yeshassudharshan8791
    @yeshassudharshan8791 10 місяців тому

    Thank you for demonstrating yet again that engineers can do anything. Also, could you point to resources and courses (if any) for learning more about ROS, computer vision, or basically how to make drones smart?

    • @NicholasRehm
      @NicholasRehm  10 місяців тому

      This is the class I took; all lectures and course material are free and open source: pear.wpi.edu/teaching/rbe595/fall2023.html I even gave a guest lecture last semester :)

  • @zephyrandboreas
    @zephyrandboreas 3 роки тому +2

    Not an engineer, but it is awesome and really well explained. I definitely learned something interesting 😀

  • @f_2476
    @f_2476 3 роки тому

    Wow, what a great video with simple explanations. Great work buddy, you deserve millions of subscribers. Highly recommending your channel, and I actually enjoy watching your clips with their fruitful info... Thanks, appreciated 👍👍👍

    • @NicholasRehm
      @NicholasRehm  3 роки тому

      Thanks so much for the kind words. You're exactly the type of person I try to make videos for

  • @hubba002123
    @hubba002123 Рік тому

    Very cool work. I've been thinking about doing something like this for fun/hobby. How are you controlling the quad from your computer? I've been trying to figure out this simple first step. Do you add wifi or something to your teensy? Or, are you broadcasting a signal directly to your quad receiver like your computer is a transmitter? The computer transmitting directly to the quad receiver is what I've been trying to figure out.

  • @MeepMu
    @MeepMu 3 роки тому +3

    Immensely complex and high risk :)

  • @ohitstarik
    @ohitstarik 3 роки тому +3

    2:30 "the drone knows where it is at all times 🧐🧐"

  • @subvsvid3616
    @subvsvid3616 Рік тому

    What animation software is used at 00:56, "How Waypoint Autonomy Works"?

  • @davehayes8812
    @davehayes8812 2 роки тому +1

    Brilliant.

  • @Larock-wu1uu
    @Larock-wu1uu 3 роки тому +1

    Great video! Thanks a lot for sharing this with us :-) Are you still using your custom flight software dRehmFlight? I am using it as well in a VTOL project. How do you plan to integrate with currently available path planning software? Will you, for example, implement MAVLink communication, or will you eventually switch to PX4 / ArduPilot? Your firmware is great to get started with as it is intuitive, but I realized that it has some limitations when it comes to interfacing with some of the open-source software out there. Do you plan to switch firmware, do you re-implement your own path planning / ground station software, or do you aim to make your firmware compatible with current software? What is your take on that? I find it very difficult to decide which direction to go. Reimplementing stuff is great if you want to learn how things work; on the other hand, using available software as much as possible accelerates the development. Thanks a ton for your answer.

    • @NicholasRehm
      @NicholasRehm  3 роки тому +1

      Yea, dRehmFlight for the inner loop stabilization but none of the autonomy. All of that was done on the pi in ROS and was also all custom. I think dRehmFlight is going to stay just about what it is in its current state so it can stay more of a teaching/learning tool and less of a competitor in the heavily developed autopilot market. The beauty of it in my opinion is just how fast you can splice in some simple code to get custom functionality, but of course the drawback is the lack of heavily developed features like autonomy or immediate compatibility with other systems

  • @nbtph9769
    @nbtph9769 4 місяці тому

    If possible, can you make another follow-up video where you go more into the algorithms?

  • @JohnMatthew1
    @JohnMatthew1 2 роки тому

    WOW, keep up the great work!

  • @RameshPatel-hu6lr
    @RameshPatel-hu6lr Рік тому

    Firstly, I love the hard work behind this video.
    Do you have any blog or document on this? I'm a college student & I want to build exactly the same thing with ROS 2 and a Jetson, and that's my dream 😅. I need some guidance pls.

    • @NicholasRehm
      @NicholasRehm  Рік тому

      If you shoot me an email I can send you a paper from this project

  • @elclay
    @elclay 5 місяців тому

    Amazing! I wonder if the integration with mavros, or the ROS wrapper used, is available for the dRehmFlight software?

    • @NicholasRehm
      @NicholasRehm  5 місяців тому

      dRehmFlight doesn’t support MAVLink

  • @antoninperbosc1532
    @antoninperbosc1532 3 роки тому +1

    Great work !

  • @minerharry
    @minerharry 2 роки тому +1

    Lmao “immensely complex and high risk”

  • @pshishiraithal6623
    @pshishiraithal6623 10 місяців тому

    May I know where I can find the circuit schematics, code, etc. for this project? It would be really helpful for me.

  • @fahrulrputra2589
    @fahrulrputra2589 2 роки тому

    Very nice video, but what is the rate of both the AprilTag detector and the Dijkstra algorithm, considering they're very computationally heavy?

  • @Ry-jw8nk
    @Ry-jw8nk 2 роки тому

    How does the drone know its starting and goal point? How are these nodes and transfer points defined in the implementation?

    • @NicholasRehm
      @NicholasRehm  2 роки тому

      I tell it the start and goal point and it’ll fly to the start on takeoff, wait till I let everything start running, and it figures out how to get to the goal location. Manually defining the start and goal node location is part of the motion planning algorithm, kinda like asking google maps to plan a route for you

  • @jdragon8184
    @jdragon8184 2 роки тому

    Hey, I just learned control systems and state space, and was wondering how I will feed a reference path so that my MPC can make the UAV follow it
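
One common answer to the question above, sketched under the assumption of a 2-D waypoint path: resample the path by arc length at the controller's time step, so each MPC solve gets one reference state per step of its horizon. Names, horizon, speed, and waypoints here are illustrative, not from any particular implementation.

```python
# Build an MPC reference trajectory by arc-length resampling of a waypoint path.
import numpy as np

def reference_for_horizon(waypoints, s_now, speed, dt, horizon):
    """waypoints: (N,2) array. s_now: distance already travelled along the path."""
    wp = np.asarray(waypoints, dtype=float)
    seg = np.linalg.norm(np.diff(wp, axis=0), axis=1)        # segment lengths
    s = np.concatenate(([0.0], np.cumsum(seg)))              # arc length at each wp
    s_ref = s_now + speed * dt * np.arange(1, horizon + 1)   # where we want to be
    s_ref = np.clip(s_ref, 0.0, s[-1])                       # hold at the goal
    x_ref = np.interp(s_ref, s, wp[:, 0])
    y_ref = np.interp(s_ref, s, wp[:, 1])
    return np.stack([x_ref, y_ref], axis=1)                  # (horizon, 2) targets

path = [(0, 0), (5, 0), (5, 5), (10, 5)]
print(reference_for_horizon(path, s_now=2.0, speed=1.0, dt=0.2, horizon=10))
```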

  • @wearemany73
    @wearemany73 2 роки тому

    Make sure your collar’s straight the next time you do anything like this again. 😉 Great video with a clear explanation of technical data, sweet video. I’m eager to know where this goes. 😊

  • @bhavyachhabra1600
    @bhavyachhabra1600 2 роки тому

    Hi, great video! What software have you used in this video?

  • @oliverer3
    @oliverer3 2 роки тому

    This is so cool! I've been trying to do stuff like this but I've yet to gather all the software-related knowledge to achieve things like this.

  • @kwhp1507
    @kwhp1507 3 роки тому +1

    Where are you from? I’ve got some motors and frames I can send you.

  • @moochasas
    @moochasas 11 місяців тому

    That is an awesome project..... well done..

  • @darkonaire
    @darkonaire Рік тому

    Hi Nick, I love watching your videos. Recently I've been working on a DIY beginner drone... my PID is a hybrid of Joop's code and yours. My drone takes off smoothly and hovers... but after a few commands... it wobbles... which PID loop gain should I adjust? I am controlling it via keyboard... I don't have the RF controller. Please advise. Thanks

    • @NicholasRehm
      @NicholasRehm  Рік тому

      It’s hard to tell not knowing the specific implementation… what state is your PID controlling? Rate, angle, velocity? I would start lowering P and increasing D

    • @darkonaire
      @darkonaire Рік тому

      @@NicholasRehm It's a cascaded PID. The outer loop is angle, the inner loop is rate... I will try experimenting with the P and D gains as you suggested. Many thanks.

  • @shahnejad313
    @shahnejad313 2 роки тому

    I just stumbled on this video. I am very impressed with your demonstration of DIY autonomous drone flight. I am looking for a similar drone navigation system without the use of GPS. I am planning to scan a cell tower using a cylindrical coordinate system, using 4 ground targets as a means to position the drone on a cylindrical grid and collect RF signals. The position accuracy could be in mm. How would you go about doing such a design? Please send your contact info. Thanks

  • @CBJamo
    @CBJamo 3 роки тому

    Excellent project, I love seeing people using ros. You should give ros2 a try, the architecture is substantially more robust.
    On to the questions:
    What was the load like on the pi?
    Why two cameras?
    You mentioned moving away from the april tags to optical object detection. Have you considered using the depth data from the realsense instead of optical?
    What is the battery connector on a post at the back for?

    • @NicholasRehm
      @NicholasRehm  3 роки тому

      1. About 60% cpu load, mostly from the usb camera driver/AprilTag tracking
      2. The realsense camera is super wide angle and relatively low resolution, so its hard to remove image distortion for tracking algorithms to use. So it ended up just being easier to throw on a cheap HD usb camera for the target tracking
      3. I don't know what a more robust obstacle detection scheme would look like at the moment, but it definitely could leverage the realsense data!
      4. This quad was initially built a while back for a competition which required a disarming key X inches away from the propellers. That connector just completes the connection between the ESCs and propulsion battery with a shorted XT-60 connector

  • @dubber889
    @dubber889 2 роки тому

    Could you share the ROS source code for this? What ROS nodes and topics do you use? I'm researching the same area right now, and it would be helpful to know the references for what I'm doing. Thank you

  • @blinkingspirittheblu2088
    @blinkingspirittheblu2088 Рік тому +1

    Very beautiful, but it needs faster hardware like an NVIDIA Jetson or Orin...

  • @adekunleafolabi1040
    @adekunleafolabi1040 3 роки тому +1

    This is exactly what my partner and I are working on for our final year project.

  • @galileo1836
    @galileo1836 Рік тому

    What are the applications that you are using to test your drones?

  • @mugslschlaengli5928
    @mugslschlaengli5928 3 роки тому

    My biggest hurdle: ROS.
    Tried getting into it on an NVidia Jetson board, and also had a Teensy 4 as my uart interface with my rover.
    Guess I have to give it another try some time. Did you come across any helpful starting resources for ROS?

    • @NicholasRehm
      @NicholasRehm  3 роки тому +1

      Starting from a raspberry pi image from ubiquity robotics with a clean install of ROS + dependencies was a lifesaver. Then going in and making your own “templates” for each node is really helpful to get an understanding of how data is handled by ROS. Once you get all the ROS topics into and out of each script, writing the code is the easy/fun part
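
A minimal ROS 1 (rospy) node "template" in the spirit of what Nicholas describes above: one subscriber, one publisher, and a fixed-rate loop. Topic names and message types are placeholders, not the ones used on the drone.

```python
#!/usr/bin/env python
# Skeleton rospy node: cache the latest input message, publish at a fixed rate.
import rospy
from geometry_msgs.msg import PoseStamped

class TemplateNode:
    def __init__(self):
        self.latest_pose = None
        self.pub = rospy.Publisher("setpoint", PoseStamped, queue_size=1)
        rospy.Subscriber("estimated_pose", PoseStamped, self.pose_cb)

    def pose_cb(self, msg):
        self.latest_pose = msg                   # just cache the newest message

    def spin(self, hz=50):
        rate = rospy.Rate(hz)
        while not rospy.is_shutdown():
            if self.latest_pose is not None:
                sp = PoseStamped()               # compute something from the input...
                sp.header.stamp = rospy.Time.now()
                sp.pose = self.latest_pose.pose  # ...here we just echo it back
                self.pub.publish(sp)
            rate.sleep()

if __name__ == "__main__":
    rospy.init_node("template_node")
    TemplateNode().spin()
```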

  • @vimalrajayyappan2023
    @vimalrajayyappan2023 2 роки тому

    Hey, that's good. One small question: there is freedom of movement on top, right? Why can't we make the drone fly a bit higher to avoid an obstacle instead of going around it? If that's possible, let me know the procedure.

    • @NicholasRehm
      @NicholasRehm  2 роки тому

      I constrained the problem to a 2D plane. What if it were in a cave and couldn’t go over?

    • @vimalrajayyappan2023
      @vimalrajayyappan2023 2 роки тому

      @@NicholasRehm Right. How to overcome that?

  • @farhanharoon6546
    @farhanharoon6546 Рік тому

    Hey Nicholas! I'm working on Motion Planning for UAVs using Motion Capture and I really need your help! Can we have a discussion please??

  • @yevhenmikhalov2258
    @yevhenmikhalov2258 Рік тому

    That was amazing, thanks!

  • @LeRainbow
    @LeRainbow 2 роки тому

    I find it insane how you made your own FC ... code and all.

  • @bodybychiken
    @bodybychiken 3 роки тому

    Is there any way to get a part list, or step by step guide?

  • @creased528
    @creased528 2 роки тому

    Hi! This is cool and all, but can I ask a question? How does your companion computer, in this case your RPi, communicate with the flight controller of the drone?

  • @STOLerant
    @STOLerant 2 роки тому

    Could this be done using an off-the-shelf FC, or is your custom solution required?
    I think this is a great video. I have been wanting a project like this but don't know the coding side well.
    Is it possible to feed object recognition and labeling into a Betaflight or iNav OSD?

    • @NicholasRehm
      @NicholasRehm  2 роки тому +1

      You'd need to do your additional computing on a companion computer and feed over control commands (same commands a pilot sends over the radio) to the flight controller. I don't know if betaflight supports this directly, but you could probably spoof it somehow

    • @STOLerant
      @STOLerant 2 роки тому

      @@NicholasRehm The newest generation of Raspberry Pi products is pretty capable and easy to get a hold of. iNav would probably be a better platform, but any F4-or-above flight controller will yield impressive results on its own. It's just a matter of making the two do the talky talks. I am not sure how to do that part lol.

  • @Jp_Robotics
    @Jp_Robotics 3 роки тому +1

    Awesome ❣️🔥

  • @Elijah2Bino
    @Elijah2Bino Рік тому

    Hello sir, I'm trying to build a basic quadcopter with a NodeMCU... I need some help. Struggling with PID tuning.

  • @samopal9740
    @samopal9740 3 роки тому

    Great content! Did you use only existing packages to hook up the RealSense with PX4? (Is there any guide on this?) Thanks!

    • @NicholasRehm
      @NicholasRehm  3 роки тому

      No it’s all my own code

    • @samopal9740
      @samopal9740 3 роки тому

      @@NicholasRehm Is it an open source code? Thanks!

    • @NicholasRehm
      @NicholasRehm  3 роки тому

      @@samopal9740 the flight controller is but none of the autonomy

  • @__FJ__
    @__FJ__ 3 роки тому +1

    Hi! That was entertaining and very interesting. Thank you for making this video. I should put this on my pretty dumb “robotic” lawn mower.

    • @NicholasRehm
      @NicholasRehm  3 роки тому +1

      I have definitely thought of doing an automated lawnmower, haha. Thanks for the comment

    • @PuceBaboon
      @PuceBaboon 3 роки тому

      @@NicholasRehm Then you have to deal with the "negative" obstacles, too (I have a cliff-like drop off on two sides of my field). Time for a "It's the end of the world as we know it" algo. :-)

  • @matthewallen3375
    @matthewallen3375 2 роки тому

    It's a good level of weeds for me. Well done video.

  • @TechMalaya
    @TechMalaya 3 роки тому

    nice bro

  • @ayuschmannov2461
    @ayuschmannov2461 2 роки тому

    Lovely lovely

  • @HclPudding
    @HclPudding Рік тому

    The drone knows where it is at all times. It knows this because it knows where it isn't.

  • @erolrecep
    @erolrecep 3 роки тому +1

    It's an awesome project! Great job! Thanks for sharing with us.

  • @theK8pTn-G
    @theK8pTn-G 2 роки тому

    Legendary effort! Very VERY cool! A great production! Remember watching Terminator as a kid? I think this was just as exciting. :D :D :D Bravo!!!!!!!! Well done!!!!!!

  • @STPromosMusic
    @STPromosMusic 3 роки тому

    I beg you, build Justin Capra's YR-VVI, that should mean you no longer need a propeller, I'd love to see a thing like that fly, even if in model version.
    If there's one person who can do it, it's you. Although it won't stop me from trying it myself in the future with my basic RC planes and drones building experience XD

  • @حسينجوادجمالفخري
    @حسينجوادجمالفخري 3 роки тому

    Love you man ❤️😘

  • @thomaslincoln401
    @thomaslincoln401 2 роки тому

    Eh it's alright!
    Just recently found this channel and going to give some of this stuff a try!

  • @technoaddicted6824
    @technoaddicted6824 Рік тому

    Which simulation software is used here to replicate the real drone visuals?

  • @eyal4
    @eyal4 2 роки тому

    What did you use to stream video and data from the raspberry pi to the computer? I'm having a problem finding a good explanation on how to implement this. Great video!

    • @NicholasRehm
      @NicholasRehm  2 роки тому

      Most of everything was happening on the pi. ROS usb_cam package to interface with the usb camera, and tag tracking, position controller, and motion planning done in individual ROS nodes too. I wrote a simple serial script in ROS to send over angle/throttle commands to the flight controller which dealt with stabilization only

    • @eyal4
      @eyal4 2 роки тому

      @@NicholasRehm Thank you, but I meant: what did you use to transfer live data and the video feed from the RasPi to the computer that you watched the video on? Local WiFi? RF analog transmission? Thank you 👍🏼

    • @NicholasRehm
      @NicholasRehm  2 роки тому +1

      @@eyal4 oh haha, I used a wireless hdmi set from Amazon to send the pi desktop down to a monitor. It can only do about 40 ft range but it’s awesome for pre-flight setup and monitoring stuff when flying close by

    • @eyal4
      @eyal4 2 роки тому

      @@NicholasRehm I see, and is there a specific reason that you used the RasPi 4 and not a Jetson Nano? Thank you very much, keep up the good work. It's always interesting to follow your videos.

    • @NicholasRehm
      @NicholasRehm  2 роки тому

      @@eyal4 mostly just because of the documentation available for setting up the pi, and it’s what I had on hand
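
For context on the "simple serial script" mentioned earlier in this thread, here is a generic sketch of packing angle/throttle commands into a framed packet and writing it to a serial port with pyserial. The packet layout, port name, and baud rate are invented for illustration and are not dRehmFlight's actual protocol.

```python
# Pack roll/pitch/yaw/throttle into a small framed packet and send it over serial.
# Frame layout, device path and baud rate are placeholder assumptions.
import struct
import serial  # pyserial

PORT = "/dev/ttyACM0"   # placeholder device path
BAUD = 115200

def make_packet(roll_deg, pitch_deg, yaw_deg, throttle):
    """Frame: 2 header bytes, four little-endian floats, 1-byte checksum."""
    payload = struct.pack("<4f", roll_deg, pitch_deg, yaw_deg, throttle)
    checksum = sum(payload) & 0xFF
    return b"\xAA\x55" + payload + bytes([checksum])

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=0.01) as fc:
        # Command: level roll, small pitch forward, half throttle
        fc.write(make_packet(0.0, 5.0, 0.0, 0.5))
```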

  • @dcandteam
    @dcandteam 2 роки тому

    Amazing

  • @allyouneedtoknow8638
    @allyouneedtoknow8638 Рік тому

    PX4 has a local planner module to achieve the same thing

  • @jupagu
    @jupagu Рік тому

    Hello, can I find any drone that flies a routine and charges automatically??? THANKS

    • @NicholasRehm
      @NicholasRehm  Рік тому

      yEs!!

    • @jupagu
      @jupagu Рік тому

      @@NicholasRehm Ok, sell me one. I need a drone that makes a flight every 2 hours with a random routine, and charges automatically. After that we can talk more. THANKS

  • @phpn99
    @phpn99 10 місяців тому

    The NASA team that wrote the motion planning software for the autonomous Mars rover used the A* algorithm.
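
A minimal A* sketch for comparison with the Dijkstra example earlier in this section: the priority becomes cost-so-far plus an admissible heuristic (here, Manhattan distance on a toy grid). Purely illustrative Python, not NASA's code.

```python
# A* on a 4-connected grid of free cells; heuristic = Manhattan distance to goal.
import heapq

def astar(free_cells, start, goal):
    def h(c):                                   # admissible heuristic
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    open_set = [(h(start), 0, start)]           # (f = g + h, g, cell)
    best_g = {start: 0}
    came = {}
    while open_set:
        f, g, cell = heapq.heappop(open_set)
        if cell == goal:
            path = [cell]
            while cell in came:
                cell = came[cell]
                path.append(cell)
            return path[::-1]
        for d in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + d[0], cell[1] + d[1])
            if nxt in free_cells and g + 1 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g + 1
                came[nxt] = cell
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt))
    return None                                 # no path found

# 5x5 grid with a wall at x=2 that leaves a gap at (2, 4)
cells = {(x, y) for x in range(5) for y in range(5)} - {(2, 0), (2, 1), (2, 2), (2, 3)}
print(astar(cells, (0, 0), (4, 0)))
```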

  • @marwinthedja5450
    @marwinthedja5450 2 роки тому +1

    Graph theory?
    Subbed!

  • @JoseRamos-su3ep
    @JoseRamos-su3ep 2 роки тому

    I really like the platform you have there; do you have files you can share for that?

    • @NicholasRehm
      @NicholasRehm  2 роки тому +1

      If you shoot me an email I'll see what I can do

    • @JoseRamos-su3ep
      @JoseRamos-su3ep 2 роки тому

      @@NicholasRehm will do!!!!

    • @JoseRamos-su3ep
      @JoseRamos-su3ep 2 роки тому

      @@NicholasRehm I think I did it right, through YouTube