Hand tracking using mediapipe - touchdesigner tutorial

  • Published 29 Nov 2024

COMMENTS • 55

  • @blankensmithing
    @blankensmithing 8 months ago +9

    Hey, there are several things I have to point out to viewers that need to be fixed in this video.
    1) There's no need to modify the component to get the middle pointer finger. That channel is already exposed through the CHOP data. You can just put down a Select CHOP, connect it to the first CHOP output of the Hand Tracking component, and select the channel: h*:middle_finger_tip:* to get the data you're looking for. The data is normalized from 0-1 so you can use a Math CHOP to scale the positions to the correct size you need for your display.
    2) I wouldn't recommend putting the MediaPipe component in the Palette. It's very large, and we externalize it so it doesn't take forever to save the project. Make sure you have the toxes folder located next to where you save your TD project / local to your TD project. If it's not re-opening correctly, look in the Common tab of the MediaPipe component, check the path used for the External tox, and make sure the path points to the local MediaPipe.tox (currently yours is set to the wrong path). This path issue is also fixed in the newest version of the MediaPipe component, so the project should re-open correctly by default.
    3) You can turn off the green points by turning off "Show Overlays" on the MediaPipe component. No need for SpoutCam for that.
    I'm really excited to see you experiment with MediaPipe, and I'd love to see you make a new tutorial with the updated info!! I think you do a great job of explaining your process and making things very approachable. Hope to see new tutorials from you soon!
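
    The 0-1 remap mentioned in point 1 can be sketched in plain Python (the Math CHOP performs this linear range remap internally; the 1920x1080 display size here is just an assumed example):

    ```python
    def remap_normalized(value, out_min, out_max):
        """Linearly map a 0-1 normalized MediaPipe channel value to a
        target range, as a Math CHOP's range remap would."""
        return out_min + value * (out_max - out_min)

    # Assumed example: middle_finger_tip x/y scaled to a 1920x1080 display
    x_px = remap_normalized(0.5, 0, 1920)   # -> 960.0
    y_px = remap_normalized(0.25, 0, 1080)  # -> 270.0
    ```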

    • @outsandatv
      @outsandatv  8 months ago +1

      Oh thank you for clarifying these elements, Torin. I should indeed have mentioned it is possible to get the data out of the box. And thank you for explaining my path issue, which I'll be sure to note in my next tutorial. Best

  • @freshestveggies6476
    @freshestveggies6476 5 months ago +14

    Kurt Cobain

  • @Avalonanon
    @Avalonanon 11 months ago +5

    cool af dude thanks

  • @markus_knoedel
    @markus_knoedel 11 months ago +2

    Cool tutorial. Why did you dive into the mediapipe operator instead of using a Select on the outgoing data? But maybe it reduces load on the computer. Nice to see the insides. Thank you.

    • @outsandatv
      @outsandatv  11 months ago +1

      You’re right, that was my thinking but I can’t really say if it makes it more efficient.
      Selecting the data directly is totally possible without having to change the component. I should have been more clear on that.

  • @alonsoaguayouwuowo3609
    @alonsoaguayouwuowo3609 3 days ago

    Thank you so much

  • @yoruchen1783
    @yoruchen1783 5 months ago

    Really helpful tutorial! Could you make a tutorial about creating our own gestures to be recognized, maybe also controlling some motion graphics at the same time? Thanks!

    • @outsandatv
      @outsandatv  5 months ago

      Love the idea. Will work on it.

    • @outsandatv
      @outsandatv  5 months ago

      I have worked with basic gestures so far, like X, Y, Z, but there are infinite ways to associate motion patterns with specific commands. What did you mean by motion graphics?

  • @余庆-f2b
    @余庆-f2b 3 days ago

    Thank you sooooo much bro~

  • @artglobal9991
    @artglobal9991 5 months ago +1

    Thank you for the clear tutorial. I am a TD beginner. Can you make a video of using MediaPipe to make hand-attracted particles? I tried it but it didn't work.

  • @acropolisfalls
    @acropolisfalls 2 months ago +1

    would this work with the Kinect too? I still haven't started using TouchDesigner, but I know the Kinect doesn't have hand tracking natively. you are a godsend for this video, i love you

    • @outsandatv
      @outsandatv  2 months ago

      Yes, hand tracking is limited to MediaPipe.
      So you can use your Kinect as a simple camera, but a webcam would do the job.

    • @outsandatv
      @outsandatv  2 months ago

      Just ask me if you have any questions! :)

  • @sologrinder123
    @sologrinder123 8 months ago +1

    I needed some help. Currently I am working on sign language detection using MediaPipe; it's a Python project, but I wanted to add a real-time text-to-audio feature. Can you please show how we can achieve that? It would be really appreciated.

    • @outsandatv
      @outsandatv  8 months ago +1

      That's a great idea for a project. Do you need a text-to-voice AI? I will look into it, but for now I think you have to pay to use an API key.

  • @NachitenRemix
    @NachitenRemix 11 months ago +1

    This is so cool

  • @Neomu_Joah
    @Neomu_Joah 4 months ago

    In the Hand_Tracking Node, there is a white dot in the top left corner. Can you tell me how to remove it?

    • @outsandatv
      @outsandatv  4 months ago

      I think that might be the second hand packed into one corner. Maybe try reducing detection to 1 hand if that's all you need, in the MediaPipe -> Hand Tracking parameters (number of detected hands).

  • @vaniabisbal3594
    @vaniabisbal3594 6 months ago +1

    how do I use mediapipe with an uploaded video instead of using webcam?

    • @outsandatv
      @outsandatv  6 months ago

      You need to:
      1 - go to this GitHub page
      github.com/leadedge/SpoutCam/releases
      2 - download and extract the zip on your PC
      3 - execute "SpoutCamSettings.exe"
      4 - set the frame rate of your video
      5 - set the resolution
      6 - name it "TDSyphonSpoutOut2" and register
      7 - reopen your saved project with MediaPipe, and select SpoutCam instead of your webcam.

  • @outsandatv
    @outsandatv  7 months ago

    MediaPipe connected with particle system here :
    www.patreon.com/posts/attracted-102011982?Link&

  • @yomi0ne
    @yomi0ne 11 months ago +1

    Followed your tutorial and was trying to move a fluid simulation with my hands, but they are mirrored. Do you know how to flip it and set the boundaries of the screen to match the hand movements in X/Y? Thank you for your time explaining this amazing art.

    • @outsandatv
      @outsandatv  11 months ago +1

      If you want your data to decrease instead of increase, use a Math CHOP to invert everything and then offset it back into the same range.
      Like I showed: change the Math CHOP's /Channel Pre OP/ to Negate,
      and in the second window increase the /Post-Add/ to the value you need.
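
      As a plain-Python sketch of what those two Math CHOP parameters do to a normalized 0-1 channel (mirroring via Negate plus a Post-Add of 1 is the assumed use case):

      ```python
      def math_chop(value, negate=False, post_add=0.0):
          """Minimal model of the two Math CHOP settings mentioned above:
          Channel Pre OP = Negate, then add a Post-Add offset."""
          v = -value if negate else value
          return v + post_add

      # Mirroring a 0-1 hand position: negate, then post-add 1
      print(math_chop(0.25, negate=True, post_add=1.0))  # -> 0.75
      ```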

    • @outsandatv
      @outsandatv  11 months ago +1

      Or, if you followed the tutorial, you probably just need to remove the “Negate” and it should work.

    • @yomi0ne
      @yomi0ne 11 months ago +1

      @@outsandatv this helped!!!! I am now able to move it on the X as it's supposed to, looking at the projection in front of it. Do you know how to change the value of Y using a Math? Thank you for your help!!!!

    • @outsandatv
      @outsandatv  11 months ago +1

      @@yomi0ne for Y you probably won't need to negate. Maybe post-add a bit.

    • @yomi0ne
      @yomi0ne 11 months ago +1

      @@outsandatv I appreciate your help!!! I'm currently doing the particles tutorial now. I need to play with the add as you say, because the particles are in the upper corner of the screen and not centered. Thank you for taking the time to respond and to show us these tutorials.

  • @solomonclements3830
    @solomonclements3830 9 months ago

    Hey do you have any idea why the Y axis on one of my hands does not seem to be reading any data?

    • @outsandatv
      @outsandatv  9 months ago

      Is it occurring from the moment you load your component?

    • @solomonclements3830
      @solomonclements3830 9 months ago +1

      @@outsandatv worked it out, accidentally had the index finger selected lol. thanks for the reply though :)

  • @VeraArt-wt4nl
    @VeraArt-wt4nl 6 months ago +1

    how can I replace the inputs from the mouse with something else?

    • @VeraArt-wt4nl
      @VeraArt-wt4nl 6 months ago

      I want to replace it with another coordinate data

    • @outsandatv
      @outsandatv  5 months ago

      @@VeraArt-wt4nl I don't understand your problem; could you explain in more detail, please?

    • @outsandatv
      @outsandatv  5 months ago

      MouseIn CHOP to get all mouse data - Mediapipe to get webcam data - Kinect to get depth data

    • @outsandatv
      @outsandatv  5 months ago

      If you mean the MediaPipe cursor by "mouse": you can use other Pose data like elbows or foot tip, so you can skip the modification of the model and use the data directly from MediaPipe - Pose Tracking - select the body part you want. Then create as many Math CHOPs as you have coordinates (X, Y, Z) to shape the data into what you need.

    • @VeraArt-wt4nl
      @VeraArt-wt4nl 5 months ago +1

      @@outsandatv I used a Rename OP to rename the MediaPipe data to tx and ty so I can replace the MouseIn OP in my projects. Thank you.

  • @robertjohnson4051
    @robertjohnson4051 7 months ago

    Is there any way to work around and replace the Kinect to achieve the hand particle video with MediaPipe?
    I saw your first particle video and wanted to do it without a Kinect.
    Kinect doesn't work well with macOS.

    • @outsandatv
      @outsandatv  7 months ago +1

      Sure! To replace the Kinect hand x, y with the MediaPipe x, y you need to:
      - add MediaPipe to the particle project
      - add the hand or pose tracking component
      - use the normalized data from it (as I show in this video: place a Null then a Select CHOP after the component where it says “normalized data”, in the Select choose “wrist_x” and “wrist_y”, then drag and drop them onto the transform x, y of the Metaball SOP).
      So basically follow both tutorials, and wherever I reference the positions, use MediaPipe instead of a Kinect. I will upload the project on my Patreon now that you're saying it would benefit Mac users. I appreciate your feedback in that regard!
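
      The wiring described above can be modeled in plain Python (the channel names and the metaball's -1..1 transform range are assumptions for illustration):

      ```python
      def select_channels(chop, names):
          """Model of a Select CHOP: keep only the named channels."""
          return {n: chop[n] for n in names}

      def to_sop_range(value, lo=-1.0, hi=1.0):
          """Remap a 0-1 normalized value into the SOP's transform range."""
          return lo + value * (hi - lo)

      # Assumed normalized CHOP data coming out of the hand tracking component
      chop = {"wrist_x": 0.75, "wrist_y": 0.5, "index_tip_x": 0.6}
      wrist = select_channels(chop, ["wrist_x", "wrist_y"])
      tx = to_sop_range(wrist["wrist_x"])  # -> 0.5
      ty = to_sop_range(wrist["wrist_y"])  # -> 0.0
      ```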

    • @outsandatv
      @outsandatv  7 months ago +1

      www.patreon.com/posts/attracted-102011982?Link&

    • @robertjohnson4051
      @robertjohnson4051 7 months ago

      @@outsandatv Oh yes, I previously subscribed to your Patreon. Different name though. One last question: how do you address the distance and angles when you may be, say, 10 meters away with a 2K web camera? Is there any way to have MediaPipe still recognize my body and movements?

    • @outsandatv
      @outsandatv  7 months ago +1

      @@robertjohnson4051 If the detection is sloppy you can zoom into the picture by changing the scale in a Transform TOP, if you're standing in a specific part of the image. If the subject is moving from left to right, say, at a far distance, you could use the subject's spine position X, Y to always keep yourself upscaled in the center.

    • @robertjohnson4051
      @robertjohnson4051 7 months ago

      @@outsandatv thanks for your awesome help. You saved my show!
      On a separate note: if I were you, I would start charging a small fee for your Patreon, uploading the projects you do online, and making them accessible to people like me. I like learning from the tutorials; however, at this time I'm pressed because my show is starting very soon, and being able to buy projects and slightly adjust them would be wonderful for beginners like me. Just putting that out there and hoping for your success. I'm one of your patrons!

  • @hanrongwoon8477
    @hanrongwoon8477 6 months ago

    can I exchange the circle for an image?

    • @outsandatv
      @outsandatv  5 months ago

      Sure, message me on insta if you want me to show you @outsanda

    • @outsandatv
      @outsandatv  5 months ago

      Just drop your picture into the project, add a Transform TOP, reference the circle's position in the Transform, and plug it in instead of the circle.

  • @user-lx8if5xy9r
    @user-lx8if5xy9r 11 months ago

    How can I use a vertical resolution camera?

    • @outsandatv
      @outsandatv  11 months ago

      No clue, I have to try it out. Thanks for the idea :)

    • @user-lx8if5xy9r
      @user-lx8if5xy9r 10 months ago

      @@outsandatv Thanks, I've tried everything since then. MediaPipe recognized me even when the camera was physically upside down or tilted 90 degrees, so I solved the problem by setting the Channel Pre OP of the Math CHOP after the Select CHOP to Negate! :)

    • @outsandatv
      @outsandatv  10 months ago

      I wonder how you achieved this, because MediaPipe works directly with the video device. But I would imagine it's possible with Spout, using a Fit and Transform to control the rotation of the camera. @@user-lx8if5xy9r

  • @terryfarmer6279
    @terryfarmer6279 11 months ago

    hmmm