Thanks for sharing!
Thanks for sharing! The tutorial is so clear and detailed that it has helped me a lot!
many thanks!!!!!!!!!!
Here is the full project ( MediaPipe + Kinect + mouse ) ✨✋
www.patreon.com/posts/102011982?
heylooo, I'm having trouble connecting the hand tracking to activate the particles without a Kinect. I don't understand the Patreon link :/ Could you write a more step-by-step guide? I would appreciate it so much.
@@clairecahill7214 you need to install MediaPipe to use this project without a Kinect. Here are the link and the instructions:
1 - Download the latest release from GitHub: github.com/torinmb/mediapipe-touchdesigner
2 - Drop MediaPipe.tox into your project (and flip it horizontally)
3 - Drop the Pose tracking component next to it and connect MediaPipe to it (pose)
4 - Drag the TOP
Can you DM me on Insta (outsanda) so you can show me your problem?
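For anyone puzzled by the "flip horizontal" step above: MediaPipe outputs landmark coordinates normalized to [0, 1], so mirroring the x-axis is just `1 - x`. Here is a minimal sketch of that math in plain Python (the function name is mine, not part of the plugin):

```python
def flip_horizontal(x_norm: float) -> float:
    """Mirror a normalized [0, 1] x-coordinate, as the 'flip horizontal'
    step does, so on-screen motion matches your real hand like a mirror."""
    return 1.0 - x_norm

# A hand at the left edge (0.0) maps to the right edge (1.0), and the
# center (0.5) stays put.
print(flip_horizontal(0.0))  # → 1.0
print(flip_horizontal(0.5))  # → 0.5
```

Inside TouchDesigner the same flip can be done on the image or on the channel values; the math is identical either way.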
Hello! I'm very happy that I was able to complete this tutorial, but instead of a Kinect I use my laptop webcam with MediaPipe, and it works great. The only thing is that the hand coordinates only move along the negative x/y axes: it's not centered, and the value stops when it reaches 0. I don't know if I explained myself well, but can you please let me know how to make the x/y axes cover all coordinates, moving the hand from the center of the webcam out to the full perimeter of the screen? Thank you!
I haven't tried it out myself, to be honest with you, but you can get the point near 0 and moving in the right directions by changing the operation to negate and increasing or decreasing the post-add. I will keep trying and tell you the solution.
My guess is: for x, negate; for y, don't negate.
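The negate/post-add fix suggested above boils down to a simple remap of MediaPipe's normalized 0..1 values into a range centered on 0. A sketch of what a Math CHOP does here, assuming a post-add of 0.5 (the exact value depends on your setup):

```python
def remap(v_norm: float, negate: bool, post_add: float) -> float:
    """Mimic a Math CHOP: optionally negate the incoming channel,
    then add a constant (the 'post-add' parameter)."""
    out = -v_norm if negate else v_norm
    return out + post_add

# MediaPipe x runs 0..1; negating and adding 0.5 recenters it around 0,
# so the hand at the webcam center lands at 0 instead of 0.5:
print(remap(0.0, negate=True, post_add=0.5))  # → 0.5
print(remap(0.5, negate=True, post_add=0.5))  # → 0.0
print(remap(1.0, negate=True, post_add=0.5))  # → -0.5
```

If the motion is still too small or too large on screen, a multiply before the post-add (the Math CHOP's range/scale parameters) stretches it to fit.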
@@outsandatv I will try it!!! I'm almost there!!! I've spent the last 4 days following tutorials and yours are the best!!
Hello, this is really great stuff! Thank you for this detailed, yet easy-to-follow tutorial!
I'm still sort of new to this stuff, but just wondering: is it possible to follow along without having to use a Kinect? A webcam, perhaps?
Hello, thank you for your inquiry and curiosity.
It's something I've been working on. There's something called a Trace SOP which allows you to make 3D shapes out of an RGB image, and if you manage the lighting correctly it's clearly possible, but it will also depend on the amount of light you're exposed to.
It's totally possible with MediaPipe; I will do a tutorial that everyone can follow using that plugin.
Can I do this with a Kinect 1? What should I change in the parameters? The Kinect CHOP or TOP?
The CHOP.
You only need to change the channel parameter in the Select CHOP to "hand_tx" instead of what I wrote.
Can I use this over found videos or footage I’ve shot?
Yes, you just need to drag and drop your footage into the project, then connect a Level TOP to adjust the opacity and connect that to the Composite TOP at the end.
Hi Outsanda, what if I wanted both hands from the Kinect to connect to the metaball?
That is fairly simple; I will do a tutorial to demonstrate! Great idea, actually, I appreciate your comment.
@Ardsadprm I made a new tutorial as promised with this adjustment of two metaballs — more accurate and fun, as expected!
@@outsandatv Hey, I just tried it, this is amazing, thank you very much!
Hey, would this work using a laptop webcam?
Absolutely, using MediaPipe — see my latest short. Tutorial coming soon.
Hey man, at 1:52 you connect the Particle SOP to the Geo, but I'm unable to do that and it just shows the donut. Could you help me with that? Thanks and love from The Netherlands!
You need to drop the Geometry COMP from the output of the Particle SOP. Just press Tab after dragging the node connection out. Hope that helped!
@@outsandatv Thanks! That worked!
Hey, thanks for the tutorial, but can you help me with MediaPipe? I can't find "mirror image" =)
It's right on the main MediaPipe parameter page, where you select your camera device and such.
Hello! I'm a Korean uni student, and I really appreciate your video — it's super helpful for me. But I wonder: if I use a video (from a web camera) instead of the Kinect TOP, how can I connect the hand of the person in the video to the particle system?
Hello, same here — I study arts and design. To achieve this you can use MediaPipe; I will do a quick tutorial to show you how.
@@outsandatv Thank you so much!
Does anyone know if there is a way to run the Microsoft Kinect SDK on a Mac, with WineBottler or anything similar? I have a PC for my media server that is running Windows 10, but I would prefer to have the ability to use it with my laptop, which is a Mac.
To my knowledge the SDK only works on PC.
But you can use MediaPipe and a webcam instead.
So Madmapper has the ability to run a node with the Xbox Kinect inside their program, even on a Mac, so that's about the only reliable and stable way I have found to use the Kinect on a Mac. It works great as long as what you're doing stays inside Madmapper, or whatever you want to work in can receive an NDI video of the Kinect feed out of Madmapper.
Can I get the project file from you?
I'm having some problems.
By the way, I am a college student from Taiwan.
You should be able to remake it yourself in less than 5 minutes. Could you describe your problem?
I will publish it on my patreon for you
@@outsandatv I want to blend the Kinect image and the particles together, but the Kinect background disappears.
@@outsandatv super thanks
Hi Outsanda, I followed your tutorial exactly, but the particles are not showing in my file like they are in yours.
I don't know why that is... I am working with a Kinect v2.
Hi! Feel free to share a link to your project so I can take a look at what might have gone wrong.
Otherwise you might have missed something — make sure the Transform SOP is correct.
I'm gonna publish the working version on my patreon soon.
@@outsandatv Thanks for the quick response! I actually solved that issue, but please help me with another problem: at 4:35, when you drag the null CHOP values to the Metaball SOP, which option did you select? It offers Export CHOP, CHOP Reference, and Current CHOP Value. If I do Export CHOP, the values just remain 0.
Choose CHOP Reference, but Export also works.
@@outsandatv Thank you so much for help!! everything works great now😊