This Motorised Mechanical Eye Ball is built with AI (p2)

  • Published 2 Apr 2022
  • This Robotic Eye works!!! It uses a Jetson and an ODrive with some powerful motors. I encountered many issues with this build, and at some point I thought it was never going to work. But finally it works really great! I have also included a Jetson AGX Orin unboxing :).
    Special thanks to my special Patrons: Shounak Bhattacharya and M. Aali!
    Please subscribe. This will help me develop other projects like this and bring the bright future closer!
    One time donation:
    www.paypal.me/Skyentific
    If you want to help this channel, please support me on Patreon:
    / skyentific
    Instagram: / skyentificinsta
    Facebook: / skyentificface
    Twitter: / skyentifictweet
    #DIY #robot #AI
  • Science & Technology

COMMENTS • 115

  • @Skyentific
    @Skyentific  2 years ago +55

    I need your like! Please help me. The more likes I have, the more views there will be. More views bring more subscribers, and the more subscribers we have, the more videos I will make.

    • @Nicholas-my4xj
      @Nicholas-my4xj 2 years ago +1

      Looks great, good work as always!

    • @DanielIzguerra2012
      @DanielIzguerra2012 2 years ago

      Any update on the brushless motor arm?

    • @MagicGumable
      @MagicGumable 2 years ago

      Could you try to shield the cable with simple tinfoil connected to ground? It may not be as elegant, but it's dirt cheap ;)

    • @tyeth
      @tyeth 2 years ago

      You should add affiliate links for certain featured things, like the CSI-to-HDMI adapters. It would also be nice to see it look for face-less people (i.e. back of head turned to camera or face obscured), and to have a slow roaming mode when there are no faces, probably similar to some film :)
      Keep up the good work!

  • @Corbald
    @Corbald 2 years ago +7

    Maybe it's the yellow color, but this fits into 'Creepy-Cute' to me. It seems _so happy_ to see you!

  • @WalkingEng
    @WalkingEng 1 year ago

    This is just brilliant, probably one of the best robot applications yet

  • @freakinccdevilleiv380
    @freakinccdevilleiv380 2 years ago +5

    Awesome and funny, good job Sky. It would be a nice exercise to go to the other extreme and try to do it as cheaply as possible.

  • @marcusluis_s
    @marcusluis_s 2 years ago +2

    Great job @Skyentific 👏

  • @DanielIzguerra2012
    @DanielIzguerra2012 2 years ago +11

    Amazing, thank you for your videos! You inspire us

  • @knoopx
    @knoopx 2 years ago +1

    hahaha cool glasses and nice workout! and awesome project of course!

  • @dfn808
    @dfn808 2 years ago

    This is an awesome project. Thank you for taking the time to share and explain everything clearly.

  • @JaySmith91
    @JaySmith91 1 year ago

    This is fantastic; thanks for sharing the process.

  • @organicelectrics
    @organicelectrics 2 years ago +1

    So cool! Love the fluid motion with the higher frame rate.

  • @Dangineering
    @Dangineering 2 years ago +2

    Incredible project! Thank you for sharing!

    • @Skyentific
      @Skyentific  2 years ago +1

      Thank you for watching my videos and for this comment!

  • @joels7605
    @joels7605 2 years ago +2

    Spectacular work. I love it. Well done, sir.

  • @JohnDuthie
    @JohnDuthie 2 years ago +3

    I love how the motors react at the framerate of the Jetson Nano. It's odd but cool to see how everything is dependent on the vision.

  • @vishalsingh-yf9es
    @vishalsingh-yf9es 2 years ago

    You're the most inspiring person when it comes to ROBOTICS, God bless you man :)

  • @josgraha
    @josgraha 1 year ago +2

    I love Mr. Bruton, but I think you are the engineer in the room here. :). Thanks again for your video, sir! BTW, it would be interesting to see you use the ROS2 IK solvers for some of these geometric challenges. Can't wait for you to release more videos; I always learn a lot from you, and I think I have more fun than you (possibly) seeing all these very cool projects you share with us.

  • @jrohit1110
    @jrohit1110 2 years ago

    Those shades are badass!

  • @ericcarabetta1161
    @ericcarabetta1161 2 years ago +7

    This is so cool, I'd love to be able to implement something like this in an art project I had in mind.

    • @Inertia888
      @Inertia888 2 years ago +1

      it reminds me of the laser-guided systems that fighter jets use

  • @imadjawad4408
    @imadjawad4408 1 year ago

    outstanding videos every time I watch, A+

  • @nathaniellangston5130
    @nathaniellangston5130 2 years ago

    Great Video! I am REALLY impressed with how fast the fancy pants Jetson was able to track you! I too have a hard time doing anything non-overkill!

  • @JuanCarlos-ff2rh
    @JuanCarlos-ff2rh 2 years ago

    WONDERFUL!!! I am going to make one like this but with Windows. I love your videos. Thank you very much

  • @charlesb689
    @charlesb689 2 years ago

    Amazing video!

  • @sash710
    @sash710 2 years ago +1

    WOW 100% Cool

  • @PhG1961
    @PhG1961 2 years ago +3

    Awesome !! Of course I hit the thumbs up button ! I always do !

  • @danielghani3903
    @danielghani3903 2 years ago

    thank you

  • @plotze0692
    @plotze0692 2 years ago +1

    Very cool project!

  • @worksasintended4997
    @worksasintended4997 2 years ago

    That thing is smooth! Great work, great to see it done.
    I need to build something like that and add a water hose at the top. It will be hilarious fun with the children.

  • @taitywaity1836
    @taitywaity1836 2 years ago

    keep making these! cool socks btw

  • @EatRawGarlic
    @EatRawGarlic 2 years ago +5

    Very cool! I did something similar with a Pi4 to kill some time during the lockdown, but soon ran into its processing limitations.
    What I did manage to do was make it play the Metal Gear warning sound upon recognition of a face :D

    • @wetfish412
      @wetfish412 2 years ago

      I have dyslexia; I read that as "to kill someone".

  • @user-jl3ti3tc2j
    @user-jl3ti3tc2j 2 years ago +2

    Use quaternions! Look for an explanation on the 3BlueDotBrown channel.

    • @Skyentific
      @Skyentific  2 years ago

      Great video, thank you for the advice.

  • @Muny
    @Muny 2 years ago +4

    I'm not sure I follow the formula you created for figuring out the angle you need to point at. I did a very similar thing recently, and I just calculated the Angle per Pixel for horizontal and vertical (for my particular lens+sensor) and multiplied the pixel error by that to get the change in angle necessary. (HFOV/ImageWidth) for horizontal, (VFOV/ImageHeight) for vertical.
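    A minimal Python sketch of that angle-per-pixel approach; the field-of-view and image-size numbers are made-up examples, not the actual specs of the camera used in the video:

    HFOV_DEG = 62.2      # assumed horizontal field of view of the lens
    VFOV_DEG = 48.8      # assumed vertical field of view
    IMG_W, IMG_H = 1280, 720

    DEG_PER_PX_X = HFOV_DEG / IMG_W
    DEG_PER_PX_Y = VFOV_DEG / IMG_H

    def pixel_error_to_angles(face_x, face_y):
        """Convert a face position in pixels into pan/tilt corrections in degrees."""
        err_x = face_x - IMG_W / 2   # positive = face is right of centre
        err_y = face_y - IMG_H / 2   # positive = face is below centre
        return err_x * DEG_PER_PX_X, err_y * DEG_PER_PX_Y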

  • @thunderinvader9031
    @thunderinvader9031 2 years ago

    It was fun eyeballing )

  • @buidelrat132
    @buidelrat132 2 years ago +2

    Great job! Love the analytical solution. Could quaternions have simplified the math?

  • @tszulpinedo757
    @tszulpinedo757 2 years ago

    Your glasses are really cool, professor...

  • @Tetsujinfr
    @Tetsujinfr 2 years ago +3

    Really cool project, congrats! One note about the Nano vs NX choice: if you wanted a more fluid set of motions, you could just keep using the Nano and add a smoothing interpolation algorithm (a local cubic spline, for instance) on top of the raw points, then read the interpolated points (and extrapolate as well) at, say, 50 Hz; it should work really well (see the sketch after this thread). Now if you wanted to reduce the eye-tracking latency, then you indeed have no choice but to reduce the initial compute time so it starts moving asap to track the target.

    • @frollard
      @frollard 2 years ago

      Now we just need another ai layer of projecting where it thinks the target will be; there will still be 70ms latency to first move, but once moving the targeting solution could be a look-ahead.

    • @Tetsujinfr
      @Tetsujinfr 2 years ago

      @@frollard good point, with a pose detector there is a lot of info to use to forecast the head position, but deep learning is compute expensive for those little embedded computers.
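    A rough sketch of that smoothing layer, assuming the detector produces pan targets at ~15 Hz and a separate loop resamples a cubic spline through the last few targets at ~50 Hz; send_setpoint() is a placeholder for whatever actually commands the ODrive, and tilt would be handled the same way:

    import time
    from collections import deque
    from scipy.interpolate import CubicSpline

    history = deque(maxlen=4)          # (timestamp, pan_angle) of recent detections

    def on_new_detection(pan_angle):
        """Called at the detector rate (~15 Hz) with the latest raw target."""
        history.append((time.time(), pan_angle))

    def interpolation_loop(send_setpoint, rate_hz=50.0):
        """Resample the recent targets and emit smoothed setpoints at rate_hz."""
        while True:
            if len(history) >= 3:
                t, a = zip(*history)
                spline = CubicSpline(t, a)
                # Evaluate slightly in the past so we interpolate rather than extrapolate.
                send_setpoint(float(spline(time.time() - 0.05)))
            time.sleep(1.0 / rate_hz)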

  • @MJ12GRAVITON
    @MJ12GRAVITON 1 year ago

    Amazing! 👁👁👁👀👀👀

  • @AtTheZebo
    @AtTheZebo 2 years ago

    Give it the body of a "minion" and teach it how to multiply itself!

  • @beefsand419
    @beefsand419 2 years ago

    Nice

  • @CyberSyntek
    @CyberSyntek 2 years ago +3

    Please give it a linear microphone array to detect which direction sound is coming from also! 🙏

    • @Skyentific
      @Skyentific  2 years ago +1

      Great idea! And I have one :)

    • @CyberSyntek
      @CyberSyntek 2 years ago

      @@Skyentific if you get that working, your already-existing legend status will grow even beyond what it already is in my mind! :)
      That is actually something I really wanted to play with and get working: a combo of filters limiting the sound-response signals to a sane range (so the bot, or in this case the eye, doesn't have a meltdown) plus stop/start handoffs between the linear mic array and OpenCV, so that we have seeing and hearing robots available to the open-source community. Life just became a bit complicated recently and I haven't found the time yet to play around with that, or much at all. Hopefully that will change in the near future, but I believe you are a much more qualified man for the job than myself, honestly.
      Beyond that stuff, mannn, love the response times you are getting on this eye. Very very nice. What a cool combo to have using the ODrive. Amazing stuff.

    • @CyberSyntek
      @CyberSyntek 2 years ago

      Just as an example of what I mean:
      If the sound level is within range (whatever range the filters are set to): stop/disable OpenCV > moveTo(wherever the sound came from) > start OpenCV again.
      Though I suppose we would want to take it a step further with some if statements, so that if the detected sound is still within the range/direction of where it is currently looking (i.e. the person speaking is already being detected while speaking), it wouldn't disable OpenCV, since that may cause some strange movement patterns between words. 🤔
      I'm thinking more towards a 6-mic linear array, as that would likely make things easier to work with.
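      A rough, hedged translation of that pseudocode into Python; every helper here (current_gaze, pause/resume of the vision loop, move_to) is a hypothetical placeholder, not an existing API, and the thresholds are assumed values:

      SOUND_DB_MIN = 45        # assumed filter range for "interesting" sounds
      SOUND_DB_MAX = 90
      GAZE_TOLERANCE_DEG = 15  # don't interrupt tracking for sounds we already face

      def handle_sound_event(level_db, direction_deg):
          if not (SOUND_DB_MIN <= level_db <= SOUND_DB_MAX):
              return                            # outside the filter range
          if abs(direction_deg - current_gaze()) < GAZE_TOLERANCE_DEG:
              return                            # already looking roughly there
          pause_face_tracking()                 # stop the OpenCV / pose loop
          move_to(direction_deg)                # snap the eye toward the sound
          resume_face_tracking()                # hand control back to vision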

  • @user-xx3lj2ul3c
    @user-xx3lj2ul3c 2 years ago

    You always make cool videos !!!!!!

  • @robottinkeracademy
    @robottinkeracademy 2 years ago +3

    Awesome work, next step an end effector to slap anyone the robot sees who isn't you 😀

  • @manyirons
    @manyirons 2 years ago

    Cool! Now see if you can make it track a fly.

  • @BrainSlugs83
    @BrainSlugs83 2 years ago

    Very cool project. -- I would have stayed with the PID loop (a full PID loop, though); the little overshoots the eyeball does could be overcome by tuning the Ki and Kd parameters.
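    A minimal discrete PID sketch along those lines; the gains in the usage example are placeholders that would need tuning on the real eye, not known values:

    class PID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = None

        def update(self, error, dt):
            """Return a command from the current error and time step dt (seconds)."""
            self.integral += error * dt
            derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # e.g. pan_pid = PID(kp=0.8, ki=0.05, kd=0.1); command = pan_pid.update(pixel_error, 1 / 15)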

  • @baxter1484
    @baxter1484 1 year ago +1

    this is just the portal rocket sentry core

  • @lionelheavener3396
    @lionelheavener3396 2 years ago

    imagine seeing this in a large animatronic

  • @jacquesb5248
    @jacquesb5248 2 years ago +4

    dude, you're brilliant. If there were two faces, which one would it follow?

    • @Skyentific
      @Skyentific  2 years ago +2

      Great question! It is programmed to follow the biggest (i.e. closest) face. But I have not tested this yet :)

  • @Wyld1one
    @Wyld1one 1 year ago

    What would happen if you put a picture of yourself on the wall behind you?
    Also noticed something interesting: on large movements it looks like it's overshooting a bit and then backing up. I wonder if it would be better, when it has a large distance to move, to cover say 90% of it quickly and then move slowly over the remaining distance?
    I was also watching a video about the neurology of how we read. An interesting bit was how the eye moves: it tends to jump or snap to a new direction. This could probably do that, because it's lightweight to move.

  • @MYouMusikTV
    @MYouMusikTV 2 years ago +1

    Would it work to shield the old camera cable with a piece of grounded aluminum foil wrapped around the flat cable?

  • @riccardoberra9476
    @riccardoberra9476 2 years ago +2

    Very impressive, I'm doing a really similar project (a robot with the same principles of movement, but alongside the camera there's also an electric airsoft carriage for shooting drones 😂: detection, tracking and shooting).
    Your problem solving and the solutions you chose inspired me a lot. Nice job!!

    • @user-qy9rg3nt2l
      @user-qy9rg3nt2l 2 years ago +1

      I have a similar project going on too. Fully auto Airsoft sentry.

  • @iloverobotics113
    @iloverobotics113 2 years ago

    Ya. Really cool!!!! This is the eye of God.

  • @TheNadOby
    @TheNadOby 2 years ago

    Nice project with a great color scheme, but the computation requirements seem too big.
    Have you tried something old-fashioned, like OpenCV?

  • @rextalon7763
    @rextalon7763 2 years ago +2

    As a next step, just for interesting cosmetic reasons, maybe put in a range finder and an iris. When the detected object is far away open the iris, and as the object gets closer, have the iris close tighter. (or the other way around, idk)

  • @fischX
    @fischX 2 years ago

    Could be fun to mount this on a camera crane

  • @frollard
    @frollard 2 years ago

    I wonder if rate limiting the ODrive would reduce the jerkiness of the motion from the Nano; the robot can reach the destination before the 1/15 s needed for a new video frame has passed. An interpolation/planner layer between input and output could send high-frequency ODrive instructions from low-frequency input coordinates and still maintain very fast reactions.

    • @ChrisSivanich
      @ChrisSivanich 2 years ago

      PID (Proportional Integral Derivative) algorithms can do the interpolation you're thinking of. They can adjust motor power at a much higher frequency than the target position input and therefore provide a smooth transition to any target while accounting for over/undershoot. They're extremely useful when doing relative motion changes with encoders in limited-compute situations, especially with low-frequency + small movements.
      They're very cheap performance-wise too; my PID implementation (which is in no way optimal) can easily control 3 motors in under 1 ms of compute with only an M0-equipped Arduino (no image compute/tracking in my case, just targeted positioning).
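    A small sketch of the rate-limiting / planner idea from this thread: a software slew limiter sitting between the low-rate vision target and a faster command loop. send_position() and the numeric limits are placeholders, not values from the actual build (the ODrive controller also exposes its own velocity limit that could serve the same purpose):

    MAX_DEG_PER_S = 180.0          # assumed cap on how fast the eye may swing
    LOOP_DT = 0.01                 # 100 Hz command loop

    commanded = 0.0                # last angle actually sent to the motor

    def step_toward(target_deg, send_position):
        """Advance the commanded angle toward target_deg by at most MAX_DEG_PER_S * LOOP_DT."""
        global commanded
        max_step = MAX_DEG_PER_S * LOOP_DT
        delta = max(-max_step, min(max_step, target_deg - commanded))
        commanded += delta
        send_position(commanded)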

  • @Idlecodex
    @Idlecodex 2 years ago

    Did you consider using a slip ring instead?

  • @alexdorand
    @alexdorand 3 months ago

    Do you have, sell, or offer a "how to build ....?" guide for your robots and projects?

  • @dawitsarsenbaev2333
    @dawitsarsenbaev2333 2 years ago

    Well done, bro

  • @regularfryt
    @regularfryt 2 years ago +1

    I do wonder how much of the need for a more powerful Jetson is because you're doing full pose estimation (expensive) rather than face location (which is comparatively cheap). You could probably get a *much* higher frame rate using something like YOLO, and *that* would let you use something far smaller and cooler to drive the motion.

    • @Skyentific
      @Skyentific  2 years ago

      Very good point. You are probably right; I have not tried YOLO. I can only say that object detection on the Jetson Nano is still relatively slow (25 fps), which is better than pose estimation (15 fps) but not as fast as the Jetson Xavier NX (more than 60 fps for pose estimation).

  • @turnedup28
    @turnedup28 2 years ago

    How well does it work if you are farther from the camera? Most of your testing was really close up, but it would be neat to see it used for something like a sporting event. As a suggestion, try using the rule of thirds for framing the subject. It might not feel as creepy to the person being followed, and the video captured will look more natural.

  • @ikkeniikkewel
    @ikkeniikkewel 2 years ago

    not bad.

  • @dwalthers
    @dwalthers 1 year ago

    Would love to have a pair of eyes that work in unison in a paintable white material. Would you design and make these for me?

  • @armurak
    @armurak 2 years ago +1

    That squid game killer

  • @joshieeee20
    @joshieeee20 1 year ago

    Could actually be useful if you had a spare smartphone mounted to it to record HD video: make it track your body, offset in front of you, and mount it on the ceiling somewhere

  • @johnkoester7795
    @johnkoester7795 2 years ago

    That’s what I wanna do for my robotics: give them the ability to see

  • @DPTech_workroom
    @DPTech_workroom 2 years ago +2

    It turned out pretty cool.
    Ukraine could use a thing like this on every building, only with protection against missiles and the other junk that rains down from the sky from the hostile rashists.
    I had problems with an I2C sensor on a 3-axis camera gimbal. There too, the cable ran parallel to the power wires to the motors. (the first video in the channel intro)

    • @backgammonbacon
      @backgammonbacon 2 years ago

      It turned out nice.
      That would be such a thing for Ukraine for every building, only with protection against missiles and other debris that falls from the sky from unfriendly rashshists.
      I had problems with the I2C sensor on the 3 axis camera gimbal. The cable also ran parallel to the power motors. (first channel intro video)

  • @IronChad_
    @IronChad_ 2 years ago +1

    how did you get the Jetson nano? they keep going out of stock. please help

    • @Skyentific
      @Skyentific  2 years ago

      I bought this Jetson Nano a couple of years ago.

  • @FaithfulMC
    @FaithfulMC 2 years ago +4

    Who won the RTX graphics card?

    • @Skyentific
      @Skyentific  2 years ago +4

      It will be announced very soon (this coming week).

    • @oldemand
      @oldemand 2 years ago

      @@Skyentific Did the winner get announced?

  • @constantinehelen9935
    @constantinehelen9935 2 years ago +4

    What happens with 2 people in the field of view?? I know Posenet can do multiple people at once.

    • @Skyentific
      @Skyentific  2 years ago +2

      Great question. It will follow the person with the greatest distance between the eyes (normally this is the closest person). Though I have not tested this yet :)

    • @constantinehelen9935
      @constantinehelen9935 2 years ago

      @@Skyentific ahh nice!! Great work! I love your channel :)

  • @ChrisSivanich
    @ChrisSivanich 2 years ago +1

    Booting into the desktop environment then starting the tracking program might be wasting some performance - my experience with low power embedded applications has always been better without X/GNOME getting in the way. If the NVIDIA drivers require X, you can start it up without GNOME. Especially on those lower power boards, I'd love to see what impact it'd have.
    Also, I was excited to see how you'd implement and tune a PID algorithm for this, but you found a way around that with raw math 😄

    • @Skyentific
      @Skyentific  2 years ago +1

      Completely agree, without GNOME it should be better. I will try.
      I think the analytical solution should perform better than even a perfectly tuned PID. And I am really bad at tuning PIDs :)))

    • @tyeth
      @tyeth 2 years ago

      @@Skyentific Please let us know the results. The frame rate from 11 min 48 s onwards is nowhere near 15 frames per second (the rate mentioned when upgrading the Nano to the NX with ~80 fps); I wonder if there is a significant delay in the video processing when the robot "reacts", i.e. does things other than check video frames for faces. The GNOME/X talk is definitely a good shout. I'm very sorry to admit I skipped some sections of the video, but I plan to revisit once I have a Jetson :)

  • @gillespons4053
    @gillespons4053 2 years ago +1

    What happens with more people in the frame? 😅

    • @Skyentific
      @Skyentific  2 years ago

      Great question! I programmed it to detect the closest person (more precisely, the person with the greatest distance between the two eyes). But I have not tested it yet :)

  • @fischX
    @fischX 2 years ago

    Divide by zero is for all practical purposes perfectly estimated as 0
    - it is mathematically wrong but good enough ;)
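    For the record, the usual way to dodge that divide-by-zero when turning pixel offsets into an angle is atan2, which is well defined even when the horizontal offset is zero; whether this matches the exact formula used in the video is an assumption:

    import math

    def offset_to_angle_deg(dx, dy):
        """Angle of the (dx, dy) offset; no special case needed when dx == 0."""
        return math.degrees(math.atan2(dy, dx))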

  • @kahwigulum
    @kahwigulum 1 year ago

    okay but what about two people
    who will it follow

  • @to1704
    @to1704 2 years ago

    Wow, what an accent you have ))
    Why such a huge eye? ))) The camera in there is tiny. But overall it's cool

  • @JimCGames
    @JimCGames 2 years ago +1

    Built

  • @jakobfindlay4136
    @jakobfindlay4136 1 year ago

    I keep seeing Nvidia dev boards; what about the Coral TPU?

  • @orhansezaikisioglu5038
    @orhansezaikisioglu5038 4 months ago

    Hello. Can you share the source code? I am a student

  • @lucerino1973
    @lucerino1973 1 year ago

    Congratulations, I would pay to have 1/10 of your knowledge

  • @sergeyworm1476
    @sergeyworm1476 2 years ago

    How old is this boy? 🙂

  • @SirTodd.
    @SirTodd. 2 years ago

    I want those sunglasses. Pure sexy.

  • @timsteel1060
    @timsteel1060 2 years ago

    An average parochial school with an English slant ))))) I can't wrap my head around how anyone can speak so fast with such an accent )))) but anyway - "it's amazing! thank you for your job" and all that. What I especially appreciate is that I understood every word, when usually I don't even understand half ))))

  • @Chris-bg8mk
    @Chris-bg8mk 2 years ago +2

    🇺🇦

  • @andre7417
    @andre7417 2 years ago

    I wonder how funny it would be to program the robot to do the opposite and instead of focusing on the person avoid 'eye' contact as much as possible. Completely useless but interesting nonetheless.

  • @mrpeaceful1
    @mrpeaceful1 1 year ago

    this video is dogecoin

  • @ivprojects8143
    @ivprojects8143 2 years ago

    Really nice result! Well done.