Drones that recognize faces are pretty cool, but drones that attack based on facial recognition are really the next step.
that's a very sad comment, I'm depressed
Be careful what you wish for
I don't think it would be hard to modify the current code. Makes me wonder about IBM's long-term plans.
Watch Michael reeves's video 😂
Just wear a mask
14:04 - 14:40 - just a guy and his drone :)
Oh, you're here? I really liked watching your series on computer science.
I like your series on philosophy. Now I can think gooder!
hi i am a fan
You are very close to Michael Reeves' drone that flies into people's faces, but now you can recognize the fear in their faces.
17:52 *A black developer found dead after he accidentally created a racist AI drone*
LITERALLY
Can you and Code Bullet do a collab? I would love to see my two favorite programmers work together.
Like if you want them to collab.
IIRC he already did a small collab there. It would be awesome to see more, though
Michael Reeves: am I a joke to you?
370th, oh yeah
Maybe you could use a PID loop to help it be more precise. It could allow it to fix the rate at which it corrects based on how far the offset is and would minimize the oscillation of the centering
Two enhancements that could make your tracking that much better:
1. A kalman filter to estimate the position of your face even in frames where it isn't being detected.
2. PID control to smooth out the motion of the drone.
Bro this part 2 was AWESOME. also when u showed distance = self - target I was like nah bruh that's wrong, then BOOM you corrected it xD great save, you're the man
I made an ai that loves me
Kong impossible...
That must be a bug
I already made that 10 yrs ago
Source code plz
print("I love Kong")
What about using some PID loop to adjust the movement, so it turns faster or raises and lowers faster if you make a big movement to keep up with you, but then fine movements when the error is small.
OMG this is brilliant!! The graphics in the video are awesome. The explanations are spot on and easy to understand. Jabril has completely crushed it. He's a programming genius and a master videographer. Love his story and presentation of the project. I can't wait to download his code to give it a try. By far the best programming use of the Tello drone that I've seen. At the end I wanted to stand up to give him a standing ovation. Masterful
Thank you for open-sourcing it, Jabril! You seriously made my day
What happens when there are mirrors? Mmmm
It would probably think you're really far away and crash straight into the mirror as it tried to "get closer"
@@harithshah5233 yes lol
I'd imagine it would get stuck at a point where it began to block the subject
That's how we will fight it when it reaches the singularity
This whole video was like watching Iron Man test out his suit for the first time
"Yeah, it definitely works better on lighter skin faces"
*_R A C I S M_*
S C I E N C E
Darker skin reflects less light. Less information to process. It ain't racism, unless you're an SJW, but then no one takes you seriously anyway.
An extra specialist sensor could detect things like heat etc. Maybe lidar would work.
@HDP Wedzz Yeah, go back to Reddit. At least memes such as "R/WHOOSH" used in an incorrect way will be appreciated by the small souls over there.
@@Lunsterful yo it was sarcasm
@@Lunsterful Is there an r/doublewooosh? because you would definitely fit the bill for something like that.
@@Lunsterful True, you would also probably fit the bill for that.
get a drone with two camera lenses so it can 3D-map the target for greater accuracy :)
Overkill. Pun intensifies.
I liked the video purely because you said "bags and baguettes"
5:43 or you can just take half the height and half the width of the blue box and boom, a green circle. Then signal the Tello to match the coords of the red circle and the green circle...
I'm new to the field of ML, was just looking to gain some knowledge on a high level. This is the first time I'm watching one of your videos and I have to say, you did a great job explaining your approach and the issues you faced while working on this project.
Gives me much better clarity on things I'll have to look out for when working on something similar myself.
For the issue with standing up too fast: what I would do is just determine the last direction the face was detected and have the drone continue going that direction, within reason.
I have no idea if you read this, but I wanted to let you know that I really appreciate your videos. You seem like a good person, and it translates very well to the screen. It's fun to follow the journey you make on these projects - thanks!
Damn, this is so informative to watch, but so fun!!! Unique content!
Also..people like you inspire me to learn Python
P.S. thanks for the heart
This was really exciting to watch! Whenever i watch your videos i just get inspired. Thank you and keep up the good work!
You could use half of that distance vector so it wouldn't overshoot and would slowly (when close to target vector) approach the desired position
i have two ideas to improve it even further: you could store the position of the last detected face for a few seconds, so it might have a better chance to keep up with you, and you could use a PID controller to steer it into the final position.
Could you make it more robust simply by assuming the last known bounding box if there isn't one found on the current frame? Should be a relatively easy change, right?
And one step further could have you track the bounding box corners individually and predict their movement for the next frame. That would be slightly more involved, but honestly not that much. Instead of the current true position, you'd give it the anticipated one.
It doesn't rotate so you don't have to track them individually
I guess you could just do midpoint dynamics. The reason I was thinking of tracking all four, however, was because it captures scale. And that scale is important for the distance of the bot.
You're basically describing a kalman filter.
It has no memory and only knows its position relative to the face. If the face is lost it is completely blind.
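The bounding-box-scale idea a few comments up (box size as a distance cue) could be sketched roughly like this. This is only an illustration: `REFERENCE_AREA` and `FORWARD_GAIN` are made-up tuning values, not anything from the video.

```python
# Hypothetical sketch: use bounding-box area as a distance proxy.
REFERENCE_AREA = 9000.0   # box area (px^2) when the subject is at the desired distance
FORWARD_GAIN = 0.02       # command units per unit of relative area error

def forward_command(box_w, box_h):
    """Positive -> move forward (subject too small/far), negative -> back up."""
    area = box_w * box_h
    # Relative error keeps the command independent of the camera resolution.
    rel_error = (REFERENCE_AREA - area) / REFERENCE_AREA
    return FORWARD_GAIN * rel_error * 100  # scale to a percent-style speed

print(forward_command(60, 60))    # small box, far away -> positive command
print(forward_command(120, 120))  # big box, too close -> negative command
```

The nice property is that the midpoint handles x/y centering while the area handles the z (distance) axis, so tracking all four corners isn't strictly necessary.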
Very nice work, you should add a simple regression algorithm that tracks the center of the face (green circle) to follow its trajectory and predict where it is in case it is lost. That way, if you get up too fast and out of the frame, it will know to look for you up (or in whichever direction you were going).
Jabrilsssss!!! OMG that was so awesome! I'm working on something close to this vector algorithm to try and pilot a rover on Kerbal Space Program (space simulation game).
I used the same vector math to make the rover "find" an objective in game and go to it.
I can give you some hints that you can program on your drone to make it smoother too! First, try to implement a PID controller to the turning control, it will try to get to the 0 value more smoothly.
I have my code, but it's in Python and in Portuguese so I think it'll be hard to understand, but I can share it with you; it's on my github, under the MechPeste folder, named ControlePID and RoverPST.
Another hint: try to make the drone record some frames, or at least store the positions of the distance vector in a list, so if the subject exits its field of view too quickly it will still be able to interpolate from the latest values!
Oh man, I don't know how I can help you with this but I felt the same excitement when the little drone started to follow you xD!
Next up, try to make an RC car drive down your corridor without hitting the walls with some IR laser distance measurer, I don't know. hahaha that looks so fun!
you could use a raycast to determine whether there is a wall in front of the AI or not. This is better in some ways
Damn dude, you're everywhere on YouTube, huh
where can i learn to do these things? :0
Awesome video Jabrils, feels good when the hard work pays off.
A couple of ideas taken from the world of trackers. There is an idea for radars called 'track coasting'. Whenever the radar loses the track in its scan, it can set a coasting flag high to continue the track movement at the last known vector until it hopefully regains the track. This could help with the issue of the drone losing your face when you quickly leave the frame. Not sure if you already have this programmed in, but it might be an interesting thing to try out.
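The track-coasting idea above could be sketched like this: keep propagating the last known position along the last observed velocity for a bounded number of frames when detection drops out. A minimal illustration, not taken from the video's code; the 15-frame budget is an arbitrary placeholder.

```python
class CoastingTracker:
    """Keep moving along the last known velocity when detection drops out."""

    def __init__(self, max_coast_frames=15):
        self.last_pos = None            # (x, y) of the last face centre
        self.velocity = (0.0, 0.0)      # per-frame displacement
        self.coasted = 0
        self.max_coast_frames = max_coast_frames

    def update(self, detection):
        if detection is not None:
            if self.last_pos is not None:
                # Velocity from the last two real detections.
                self.velocity = (detection[0] - self.last_pos[0],
                                 detection[1] - self.last_pos[1])
            self.last_pos = detection
            self.coasted = 0
            return detection
        # No detection: coast along the last velocity, but only for a while.
        if self.last_pos is None or self.coasted >= self.max_coast_frames:
            return None  # coast budget exhausted; hover in place
        self.coasted += 1
        self.last_pos = (self.last_pos[0] + self.velocity[0],
                         self.last_pos[1] + self.velocity[1])
        return self.last_pos

tracker = CoastingTracker()
tracker.update((100, 80))
tracker.update((110, 80))        # moving right at 10 px/frame
print(tracker.update(None))      # detection lost -> coasts to (120, 80)
```

Bounding the coast time matters: without it, one bad vector would send the drone off forever.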
probably the best music video to date
As a black developer, it makes me happy to see Jabrils doing his thing 😊
Float Circuit what does race have to do with anything
@@SH-hp2nu That's a good and reasonable question.
Black developers are hard to come by. Seeing him is a reminder that minorities are participating more in these exciting new fields. And don't get me wrong, white developers are okay by me. But it's good to see someone like me doing something like this. There are all sorts of subtle stereotypes you have to be black to appreciate so it's pretty darn awesome to see him do his thing regardless.
One day saying "black developer" will be like saying, "tall developer", at which time I'll be glad the ethnicity doesn't mean anything. But this is 2019, and sadly, biases still exist.
It's okay if you ultimately don't agree with everything I wrote btw (although, hopefully you'll find my perspective to be quite reasonable).
Best.
Float Circuit yeah I understand, I thought you were one of those anti-race people lol
As a white developer, it makes me happy to see Jabrils doing his thing 😊
@@jordanski5421 Well said.
Someone already suggested it, but let me stress how PID control would mean an insta-improvement in terms of auto-alignment.
I think what you already implemented is some kind of proportional (P) control, and the lack of the other two terms causes the drone to oscillate around the desired position.
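For anyone curious, a minimal PID loop like the one suggested here looks like this. The gains are illustrative guesses, not values tuned for the Tello, and the pixel-offset error is just one plausible choice of input.

```python
class PID:
    """Minimal PID controller; gains must be tuned per axis."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        """error: e.g. pixel offset of the face from the frame centre."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # P reacts to the current offset, I removes steady-state drift,
        # D damps the overshoot that causes the oscillation.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

yaw = PID(kp=0.4, ki=0.01, kd=0.1)
command = yaw.step(error=50, dt=1 / 30)  # face is 50 px right of centre
```

One controller per axis (yaw from horizontal offset, throttle from vertical, pitch from box size) is the usual setup for this kind of visual servoing.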
The Chao garden music in the background is very much appreciated!!!
what you can do to make it more snappy is to scale the motor speed with the distance vector's length,
so when you stand up fast it will do better at going up fast enough not to lose the face box off the screen, and if the distance vector is short it won't swing as badly if it's set up this way
Man @Jabrils, you changed my life.
Never let perfect get in the way of good enough
thanks for showing your process, very interesting
i liked when you said "this is autonomy right here"
I know this video is old, but your background music
Amazing project jabrils. Inspired me a lot
Hi Jabrils,
have you tried making 2 bounding box trackers, a smaller one and a bigger one, such that the algorithm has more chances to get it right? Back in uni we also did a follower drone that chased a circle, and we never managed to make everything work smoothly until we drew multiple circles on the paper, so even if the drone did not always find the big circle it would find one of the other circumscribed circles.
You have the best narrative techniques out there.
That's pretty cool. I've been thinking about this since I watched blade runner 2049 in the scene at the beginning of the movie. Looks like we're getting there.
congratulations my friend, I'm happy that you made it.
Had you ever considered using PID during development? It would have worked a lot better than the safety zone (even though that was very smooth), and you could probably speed up the corrections it makes by a lot.
Jabrils my g, I've been 2 years into compsci college and you manage to inspire me every time, more than my own professors! Keep up the good work
I wonder if you could use an infrared camera + a normal camera on a drone, and whether combining the data from both would make tracking a target easier. It could identify the face/human body via infrared due to the visible heat difference from the environment; it should consider the shape too, otherwise it would easily be distracted by other things like a radiator. Then you'd program it to look for a face, which the camera would scan and check for facial features, and train it to stick to a certain face. This way you could have observation drones flying around in public seeking targets, checking people's faces, and if the facial features don't match they move on to the next infrared target
If you change the radius of the green circle based on the size of the face box, you can control how near or far the drone stays from you, keeping the radius of the red circle constant. I don't know how you currently control how close the drone gets to you. Awesome video
I really like this video! It's really amazing to see that something like this can be done! I'm learning HTML at the moment and came across your video! BTW you have to release a song called 'fly little drone', and you should record the video using your drone lol, that would be epic
Well done! If you want to give yourself another challenge you should see if you can use PID control for your future projects
have you tried a P or PID controller? it will stop oscillating and adjust much faster. but I guess it's not processing fast enough to make it work. other color spaces like HSV or grayscale might speed it up, but not sure if face recognition will work then...
I’m impressed at what you got done with limited resources! 👏🏻👏🏻👏🏻
I love the Sonic Chao background sound at the end
I hope IBM gave you more than a drone for doing all of that work
Dude, you're awesome! Love your videos as learning content AND as super interesting content! Congrats on your brain and charisma!
It feels so satisfying to see you complete this, thank you.
is there any way to get the latency to be shorter, so that it responds faster? it would be really cool to be able to have it track you while running
I'm a year late on this but mad RESPECT to you Jabrils. your content is honestly such a huge motivation to keep going. Also, when is that song dropping on Spotify!!
this is so amazing and scary somehow. you are so talented dude
Perhaps your best video yet
you looking at the camera doing weirdo faces TRIGGERS ME SO MUCH
this is how devs have fun! nice stuff bro, keep inspiring.
Man, do I love the content and presentation style of your videos. Gonna say it time and time again... :D
I'm by far not smart enough to understand what exactly you do, but it sure is fun to watch!
Such a chilled dude :D
Awesome video, but could you (or someone else in the comments) explain how the A.I. actually works when you are outside? (You explained in the first part how you did the training with the wifi and the ethernet connection.) Do you have your laptop in your backpack, connected to the IBM servers through 4G? Or do you have a trained model on your laptop, with the facial recognition done locally?
Thanks!
Cool project! I'm working on an automated landing system for a drone based on a camera and a landing symbol. For that I'm using two PD controllers for the horizontal plane (x and y) and a PID controller for the height (z). It's generally pretty awesome when the drone starts to do what it should!
very nice, a simple but stable solution that's easily expandable.
How about using something like Kalman filtering to trace the movement of the face target vector?
That way, if you move quickly, it'll follow the path closely and eventually find the target again
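The Kalman idea boils down to a constant-velocity model that keeps predicting on missed frames. A hand-rolled sketch (noise levels `Q` and `R` are illustrative guesses, not tuned values):

```python
import numpy as np

dt = 1 / 30
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)   # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # we only measure position
Q = np.eye(4) * 1e-2                        # process noise
R = np.eye(2) * 1.0                         # measurement noise

x = np.zeros(4)                             # state: [px, py, vx, vy]
P = np.eye(4) * 100.0                       # start very uncertain

def kalman_step(measurement):
    """One predict/update cycle; pass None on frames with no face detected."""
    global x, P
    x = F @ x                               # predict along the motion model
    P = F @ P @ F.T + Q
    if measurement is not None:             # update only when we saw the face
        z = np.asarray(measurement, dtype=float)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
    return x[:2]                            # estimated face centre, even on misses
```

Feeding the filter's output (instead of the raw detection) to the drone gives you both smoothing and a free prediction when the cascade misses a frame.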
Or maybe implement a fuzzy control for the drone?
Before the bounding box moves out of the camera space, you'd have an approximate direction, so it could pace itself in that direction depending on the acceleration of the target vector
Man I love these videos! Keep making more and I’ll watch ALL of em.
Best music video of 2019!
If you wanted to, you could probably train the AI to make smoother moves by using this system to capture data, interpolating smoother movements, then training a network that takes your vector inputs and outputs the smoothed moves. This same technique could also be used to reduce over-/undercompensation, if you could label that in the data.
just use PID, a lil bit simpler
PixeLabor I'm guessing the drone library implements a PID or some sort of model-based controller. I think he's referring to a smoother method of waypoint selection
@@rahulbball9395 I had a quick look at the library and couldn't find anything. Maybe I overlooked it. Or maybe the drone even has it built in itself
Dude, I love how you include all the hiccups along the way! And so entertaining as usual!!!!!!!!!
That’s dope, nice work! 🙌🏼😮
Do you ever talk through your code solutions? Like is there any way we could see what your solution looked like, how you gained access to communicating with the drone, that sort of stuff?
Wow man that's crazy! I went through the same obstacles when creating a robot head that tracks your face!
Even when converting the video feed, we had to use an NDI camera and the feeds were not compatible; converting the video was such a pain in the ass...
I love your videos man, keep at it!
There are quite a few MobileNet implementations available online for facial tracking or even full-body tracking. They would have been just as easy as the OpenCV one, but with way better results
Code Bullet brought me to your channel - so glad he did. Loving your content and looking forward to going back through your vids.
We are always loving your feed!!
Please keep making these kinds of videos, and thank you once again for sharing!!
Can you please start a tutorial series on machine learning!!?
So dope, so very dope. The 21st century is wild
Yo my man! You got me into this shit! Thank you! Also, this video is absolutely hilarious but scary at the same time. Just seeing how a machine just follows an algorithm regardless is scary af
You could have implemented some continuous control, for example PID, instead of the safety zone. That would have solved the wobbling problem while being more accurate, and it would also follow your face faster!
Here's a question: could I switch from a facial recognition cascade to a vehicle cascade? Maybe I could release this drone and then, perhaps, select a vehicle, and it would track that vehicle and fly to it? I mean, just to have it say "hi" to that vehicle. I wouldn't want anyone to think I was up to no good.
I worked on a similar project, only with a Raspberry Pi-driven RC car which detected road lanes and drove autonomously. Communication through wifi is really slow and is probably the main thing that sets limits on creating great technology. Of course there are other ways to transfer data wirelessly, but none nearly as simple as wifi.
It's a bit late for me to comment this, but if you do see it, I recommend creating a box around the center, (0, 0, 0), that the drone tries to stay in, instead of a point. This is so it doesn't sway constantly. You could also expand on this idea by adding more than one box, telling the drone to adjust its position more sharply the further away it is.
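The nested-box idea reduces to a deadband plus speed tiers. A sketch for one axis; the pixel thresholds and speeds are made-up placeholders, not values from the project:

```python
def correction_speed(offset):
    """Map a 1-D pixel offset from centre to a signed speed command."""
    magnitude = abs(offset)
    if magnitude < 20:       # inner box: close enough, don't move at all
        speed = 0
    elif magnitude < 60:     # middle box: gentle correction
        speed = 15
    else:                    # outside both boxes: aggressive correction
        speed = 40
    return speed if offset >= 0 else -speed

print(correction_speed(5))    # -> 0, inside the deadband
print(correction_speed(-45))  # -> -15, gentle correction leftward
print(correction_speed(120))  # -> 40, far off-centre, full speed
```

The inner deadband is what kills the constant swaying: tiny detection jitter never produces a movement command at all.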
You honestly deserve more subs.
I love your videos and your antics, Jabrils! xD
Next step would be head detection rather than face detection, so you can walk away and it still follows you. Also, if you move too quickly and escape the boundary box, it should continue in the direction of the last known location for x amount of time as a last attempt to find the target.
You could try to implement a PID controller to regulate the swing motions, but yeah, that wouldn't really be AI, I think.
Could you get it to follow an inanimate object? Like your hat, for example: light blue and easy to follow. Just set the max hover height a bit lower and it does the job fine.. right?
you could just make the drone a transceiver, with a server somewhere that controls the drone and its AI,
and add PID controllers to keep it centered while moving
Good job.
You could use OpenPose instead to be able to track the entire body and guess the position of the nose even when it gets out of the scene.
You can even use some posture commands to make the drone do things on demand like move the drone, lock distance, or take a picture ...
I also suggest using a PID controller to make the tracking smoother.
Keep up the good work.
Boi, you are the single coolest nerd walking under the sun.
🤓🤓🤓🤓🤓🤓🤓🤓
*Excitement clouds logic of even the best of developers*
A programmer + funny boi = perfect
Sounds like the start of an Avril Lavigne song haha
- Jabril
Add 6 sensors, photomodel your home using a couple hundred DSLR pics and you can train the lil' bugger virtually. GIMME ONE OF THOSE DRONES! GIMME!
that end rap is pure fire
Loved it. Really inspiring for young ones who are learning machine learning.
wow dude you deserve all youtube subscribers
Hi, cool video. Is the inference done on the drone or remotely on the computer? Is the drone only the video source?
I am disappointed that I am just now learning about your channel. Your style is amazing!
That was so cool. How about combining it with the AI program that you made to get Forrest to go round courses, and have it learn indoor and outdoor obstacle courses?
add a PID controller between your distance vector and your target; the drone would react faster (depending on your constants)