Outsanda
France
Joined 28 Mar 2012
The experience of movement all makes us reason
also impulses within virtual worlds
Touchdesigner png sequencer + playback
playback your png sequences at any desired speed ( max fps ? )
I haven't yet explored the limitations of this system, but the scripting seems to run fine, so I hope it'll be useful for you!
.
download project for free :
www.patreon.com/posts/touchdesigner-115547664?Link&
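At its core, a sequencer like this just maps elapsed time to a frame index at whatever playback rate you choose. A minimal sketch of that arithmetic in plain Python (the function name and parameters are illustrative, not taken from the actual .toe):

```python
# Sketch of the core arithmetic behind a PNG-sequence player:
# map elapsed time to a frame index at an arbitrary playback fps.

def frame_index(elapsed_seconds: float, playback_fps: float,
                frame_count: int, loop: bool = True) -> int:
    """Return which frame of the sequence to display."""
    idx = int(elapsed_seconds * playback_fps)
    if loop:
        return idx % frame_count          # wrap around for looped playback
    return min(idx, frame_count - 1)      # hold the last frame otherwise

# e.g. 3.0 s into a 30-frame looping sequence played at 12 fps:
# frame_index(3.0, 12, 30) -> 6
```

Whatever the "max fps" limit turns out to be, it would come from the cook rate of the network, not from this index math.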
103 views
Videos
Touchdesigner tutorials coming in november ✨
79 views · 1 month ago
Here's the updated list of tutorials I am preparing for November: - Attracting and repulsing Simon David Ryden's (aka supermarketsalad) particle system with MediaPipe and Kinect - Creating a VR project (Oculus) with movement controls - Vertical projection mapping for a 360° video - Vertical projection mapping with Kinect for an interactive following light trail #touchdesigner #tutorials #update ...
After Dark - Mr Kitty ( uplifted with visual )
920 views · 1 month ago
After Dark - Mr Kitty ( uplifted with visual )
After Dark - remix ( bass boosted + visual )
1.9K views · 1 month ago
This is my take on After Dark by Mr Kitty. All visuals created in TouchDesigner. Let me know in the comments if you enjoy this kind of project! Original song: After Dark - Mr Kitty
touchdesigner - 360 degree vertical projection mapping
270 views · 2 months ago
Made with the Kantan Mapper in TouchDesigner. Take a quick look at this 360° Kantan projection mapping technique! The trick here is to make the biggest trapezoid you can in the mapping window. TouchDesigner .toe available on my Patreon :) #touchdesigner #vertical #projectionmapping #art #dream #cabin
3D render picking - kinect touchdesigner tutorial
218 views · 5 months ago
Have you ever wanted to pick and move objects in your scene? Take a look at this fresh TouchDesigner render picking project I made; you can download it for free on my Patreon. If you need to configure the Kinect for the first time, check out this link: derivative.ca/UserGuide/Kinect . The values may vary slightly depending on your distance to the Kinect. I recommend matching the dista...
one million particles in a youth house
280 views · 8 months ago
🖥️ Particle Highways ⚙️ 2 projectors, 1 computer 🧑🏻 ~100 people - A particle system made of a million particles, moving in spiral motion. The event took place at MJC Lillebonne, Nancy, France on 16 November 2023 - Created in TouchDesigner from @supermarketsallad's tutorial. 🎵 Paolo Lucchi - Gemini (Dub)
interactive event in a local bioshop in Nancy
84 views · 8 months ago
🖥️ Ion Eye ⚙️ 2 Kinects, 2 projectors, 2 computers 🧑🏻 30 people - Energies, flux and movement at its core. Ion Eye is an ephemeral installation that took place at @court_circuit_nancy, a grocery store and café selling local ethical products 🌎 This small entity in Nancy is one of its kind. For this I felt the urge to do something truly unique! I ended up exposing 2 artworks, one computer and...
outsanda - presentation
142 views · 8 months ago
Enjoy this array of unique techniques I'm working on in TouchDesigner, and a VR game I'm creating in Unreal Engine. Looking for work opportunities in Paris. #touchdesigner #interactivedesign #mediapipe #unrealengine #kinect #demo
Hand tracking using mediapipe - touchdesigner tutorial
20K views · 11 months ago
Hand tracking using mediapipe - touchdesigner tutorial
Hexagon floor | v.2 | Kinect and Leapmotion compatible.
429 views · 1 year ago
Hexagon floor | v.2 | Kinect and Leapmotion compatible.
Hexagon floor | v.1 | 360° mouse interactions
526 views · 1 year ago
Hexagon floor | v.1 | 360° mouse interactions
hand attracted particles | part 2 - touchdesigner tutorial
5K views · 1 year ago
hand attracted particles | part 2 - touchdesigner tutorial
Hand attracted particles - touchdesigner tutorial
24K views · 1 year ago
Hand attracted particles - touchdesigner tutorial
Thank you so much
Amazing work!
Thanks ! Cheers
I'm waiting! Do you mentor?
for sure! Dm me on insta if you'd like to book a time
Can I do this with Kinect 1? What should I change in the parameters? Kinect CHOP or TOP
CHOP, You only need to change the parameter in the select to “hand_tx” instead of what I wrote
THIS TAKES IT TO ANOTHER LEVEL GOOD JOB
Would this also work with the Kinect too? I still haven't started using TouchDesigner, but I know that the Kinect doesn't have hand tracking natively. You are a godsend for this video, I love you
Yes, the hand tracking here comes from MediaPipe. So you can use your Kinect as a simple camera, but a webcam would do the job.
Just ask me if you have any question ! :)
Does anyone know if there is a way to run the Microsoft Kinect SDK on Mac, like with WineBottler or anything similar? I have a PC for my media server that is running Windows 10 as well; I would just prefer to have the ability to use it with my laptop, which is a Mac.
To my knowledge the SDK only works on PC, but you can use MediaPipe and a webcam instead
So Madmapper has the ability to run a node with the Xbox Kinect within their program, even on a Mac, so that is about the only reliable and stable way I have found to use the Kinect on a Mac. It works great as long as what you're doing is inside Madmapper, or whatever you want to work in can process an NDI video of the Kinect feed out of Madmapper.
Hi, is there a way to maybe add some colors to these particles? great tutorial btw!!👏
Sure ! Before the end null you can insert a lookup, and plug your color in the second input of the lookup
@@outsandatv thank you for the response, it works!
In the Hand_Tracking Node, there is a white dot in the top left corner. Can you tell me how to remove it?
I think that might be the second hand you see packed in one corner. Maybe try reducing to 1 hand detection if that's all you need, in the MediaPipe -> HandTracking parameters (number of detected hands).
Nice tutorial, but how can I remove the rainy texture from the top (the top side of the screen)?
For the particle SOP you can increase the turbulence for less straight particle spawns, but you might want to consider a particles GPU approach, or TOP/GLSL, to have more control over the birth points. I will release a tutorial for making this with TOPs only.
Hi! Nice tutorial :) How can I make the particle system born from the hands? Thank you!
Do you want it in 2d or 3d ?
@@outsandatv Hi! Thank you for sharing this tutorial. I am trying to make the particles being born around the hands or perhaps the body?
@@k4nuto hey man, I will look into it ! we need to use particles GPU instead of particles SOP for better performance
Thank you for the clear tutorial. I am a TD beginner. Can you make a video on using MediaPipe to make hand-attracted particles? I tried it but it didn't work.
Yes sure
Kurt Cobain
Hello! I'm a Korean uni student, and I really appreciate your video. It is super helpful for me. But I wonder, if I use some video instead of the Kinect TOP (via web camera), how can I connect the video person's hand with the particle system?
Hello, same here, I study arts and design. To achieve this you can use MediaPipe; I will do a quick tutorial to show you how.
@@outsandatv Thank you so much!
Really helpful tutorial! Could you make a tutorial about creating our own gesture to be recognized, also controlling some motion graphics at the same time maybe. thanks!
Love the idea. Will work on it.
I have worked with basic gestures so far like X,Y,Z but there are infinite ways to associate motion patterns with specific commands, what did you mean by motion graphics ?
How can I replace the inputs from the mouse with something else?
I want to replace it with another coordinate data
@@VeraArt-wt4nl I don't understand your problem, could you explain in more detail please?
MouseIn CHOP to get all mouse data - Mediapipe to get webcam data - Kinect to get depth data
If you mean the Mediapipe cursor by "mouse" we can use other Pose data like elbows or foot tip, so you can skip the modification of the model and use the data directly from Mediapipe - Pose Tracking - Select the body part you want. And create as many math as you have coordinates ( X, Y, Z ) to change the data to something you need.
@@outsandatv i used rename op to rename mediapipe data to tx ty so i can replace my projects with mouseIn op thank you.
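The remapping described in this thread (normalized MediaPipe data passed through one Math CHOP per coordinate) boils down to a linear rescale. A sketch of that operation in plain Python, with illustrative names:

```python
# MediaPipe hands out coordinates normalized to the 0-1 range; a Math
# CHOP per channel just rescales them into whatever range you need.

def rescale(value: float, out_min: float, out_max: float) -> float:
    """Map a normalized 0-1 value into a target range,
    like the Range page of a Math CHOP."""
    return out_min + value * (out_max - out_min)

# e.g. a wrist x of 0.25 mapped onto a 1920-pixel-wide output:
# rescale(0.25, 0.0, 1920.0) -> 480.0
```

One such mapping per channel (x, y, z) is exactly the "create as many Maths as you have coordinates" advice above.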
nice! Can you drop a link to the TD patch?
www.patreon.com/posts/attracted-102011982?Link&
can i exchange the circle to a image?
Sure, message me on insta if you want me to show you @outsanda
Just drop your picture in the project and add a Transform and use the position reference of the circle in the Transform and plug it instead of the circle.
Very Nice!!!
love the screen takeovers!
how do I use mediapipe with an uploaded video instead of using webcam?
You need to : 1 - go to this github page github.com/leadedge/SpoutCam/releases 2 - install and extract the zip on your pc 3 - execute "SpoutCamSettings.exe" 4 - set frame rate of your video 5 - set resolution 6 - name it "TDSyphonSpoutOut2" and register 7 - reopen your saved project with MediaPipe, and select SpoutCam instead of your webcam.
hey thanks for the tutorial but can you help me with mediapipe i can't find mirror image =)
It's right on the main MediaPipe parameter page, where you select your camera device and stuff.
Hi absolutely amazing work, just wanted to ask, what values in the Math operators should I change if I want to stand at a further distance from the Kinect? Thank you.
The multiply parameter, turn them down to 1 for each math and it'll be perfect.
@@outsandatv Thank you so much, I really appreciate it!
@@bobbyhuang5520 at your disposal !
Here is the full project ( MediaPipe + Kinect + mouse ) ✨✋ www.patreon.com/posts/102011982?
heylooo, am having trouble connecting the hand tracking to activate the particles without a Kinect. I don't understand the Patreon link :/ Can you write a more step-by-step guide? I would appreciate it so much
@@clairecahill7214 you need to install MediaPipe to use this project without a Kinect. Here is the link and the instructions you will find: 1 - In GitHub download the latest release github.com/torinmb/mediapipe-touchdesigner 2 - drop MediaPipe.tox into the project (and flip horizontal) 3 - drop Pose tracking next to it, connect MediaPipe to it (pose) 4 - drag Top ... Can you DM me on insta? @outsanda, so you can show me your problem
😍cool
Hey, could you help me?
MediaPipe connected with particle system here : www.patreon.com/posts/attracted-102011982?Link&
Is there any way to work around and replace Kinect to achieve the hand particle video with mediapipe? I saw your first particle video and wanted to do it without Kinect. Kinect doesn’t work well with MacOS.
Sure! To replace the Kinect hand x, y with the MediaPipe x, y you need to: - add MediaPipe to the particle project - add the hand or pose tracking component - use the normalized data from it (so as I show in this video, place a null then a select CHOP after the component where it says "normalized data", and in the select choose "wrist_x" and "wrist_y", then drag and drop it to the transform x, y of the metaball SOP). So basically follow both tutorials, and when I reference the positions, use MediaPipe instead of a Kinect. I will upload the project on my Patreon now that you are saying it would benefit Mac users. I appreciate your feedback in that regard!
www.patreon.com/posts/attracted-102011982?Link&
@@outsandatv Oh yes, I previously subscribed to your Patreon. Different name though. One last question how do you address the distance and angles when you may be like 10 meters away with a 2K web camera. Is there anyway to have mediapipe still recognize my body and movements?
@@robertjohnson4051 If the detection is sloppy you can zoom into the picture by changing the scale in a transform TOP, if you're standing in a specific part of the image. If the subject is moving from left to right let's say, at a far distance, you could use the subject spine position X, Y to always have yourself upscaled in the center.
@@outsandatv thanks for your awesome help. You saved my show! On a separate note, If I were you, I would start charging a small fee for your Patreon. And uploading projects that you do online and making them accessible to people like me. I like learning from the tutorials, however at this time I’m pressed because my show is starting very soon and to buy projects and slightly adjust them would be wonderful for beginners like me. Just putting that out there and hoping for your success. I’m one of your patrons!
Thanks for your sharing!
many thanks!!!!!!!!!!
Hey, there are several things I have to point out to viewers that need to be fixed in this video.

1) There's no need to modify the component to get the middle pointer finger. That channel is already exposed through the CHOP data. You can just put down a Select CHOP, connect it to the first CHOP output of the Hand Tracking component, and select the channel: h*:middle_finger_tip:* to get the data you're looking for. The data is normalized from 0-1, so you can use a Math CHOP to scale the positions to the correct size you need for your display.

2) I wouldn't recommend putting the MediaPipe component in the Palette. It's very large, and we externalize it so it doesn't take forever to save the project. Make sure you have the toxes folder located next to where you save your TD project / local to your TD project. If it's not re-opening correctly, look in the Common tab of the MediaPipe component, check the path used for the External tox, and make sure the path points to the local MediaPipe.tox (currently yours is set to the wrong path). This path issue is also fixed in the newest version of the MediaPipe component, so the project should re-open correctly by default.

3) You can turn off the green points by turning off "Show Overlays" on the MediaPipe component. No need for SpoutCam for that.

I'm really excited to see you experiment with MediaPipe, and I'd love to see you make a new tutorial with the updated info!! I think you do a great job of explaining your process and making things very approachable. Hope to see new tutorials from you soon!
Oh thank you for clarifying these elements Torin, I should indeed have mentioned it is possible to get the data out of the box. And thank you for explaining my path issue that I'll be sure to make notice for my next tut. Best
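The Select CHOP pattern Torin mentions (h*:middle_finger_tip:*) uses shell-style wildcards, so its filtering behavior can be sketched with Python's fnmatch module (the channel names below are illustrative, not the component's exact output):

```python
# Mimic a Select CHOP wildcard pattern with shell-style matching.
from fnmatch import fnmatchcase

channels = [
    "h1:middle_finger_tip:x",   # hand 1, a channel we want
    "h1:wrist:x",               # other channels are filtered out
    "h2:middle_finger_tip:y",   # hand 2 also matches via the leading h*
]
selected = [c for c in channels if fnmatchcase(c, "h*:middle_finger_tip:*")]
# selected -> ["h1:middle_finger_tip:x", "h2:middle_finger_tip:y"]
```

fnmatchcase is used here so the comparison stays case-sensitive regardless of the OS.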
How would you recommend learning to do something like this?
You know, the best way to learn is to practise first, then use your creativity with the knowledge you acquire. I watched a video about render picking, a technique Bileam Tchepe @elektronaut uses in his "drum kick" tutorial. I had the idea to use this technique for people detection, so I built a grid where I crop small areas of the video. The most enjoyable part of this project was getting the approximate position on the X and Y axes.
I needed some help: currently I am working on sign language detection using MediaPipe. It's a Python project, but I wanted to add a text-to-audio support feature in real time. Can you please show how we can achieve that? It would be really appreciated.
That's a great idea for a project, do you need a text-to-voice AI? I will look into it, but for now I think you have to pay to use an API key
Hey do you have any idea why the Y axis on one of my hands does not seem to be reading any data?
Is it occurring from the moment you load your component?
worked it out, accidentally had the index finger selected lol. thanks for reply tho :) @@outsandatv
Can I get the file of this work from you? I'm having some problems BY I am a college student from Taiwan
You should be able to remake it yourself in less than 5 minutes; could you describe your problem?
I will publish it on my patreon for you
I want to blend the kinect image and particles together, but the kinect background will disappear.@@outsandatv
@@outsandatv super thanks
How can I use a vertical resolution camera?
No clue, I have to try it out. Thanks for the idea :)
@@outsandatv Thanks, I've tried everything since then, and MediaPipe recognized me even when the camera was physically upside down or tilted 90 degrees, so I solved the problem by setting the Channel Pre OP of the Math CHOP to Negate on the channel picked in the Select CHOP! :)
I wonder how you achieved this, because MediaPipe works directly with the video device. But I would imagine it's possible with Spout, using a Fit and a Transform to control the rotation of the camera. @@user-lx8if5xy9r
nice flip em off
Hey man, at 1:52 you connect the particle SOP to the geo, but I'm unable to do that and it just shows the donut. Could you help me with that? Thanks and love from The Netherlands!
You need to drop the geometry COMP from the output of the particle SOP. Just press tab after dragging the node connection out. Hope that helped !
@@outsandatv Thanks! That worked!
Let's do our best.
Can we make one without Kinect?
Yes, in my last tutorial I show you how to get the x, y from a finger using a webcam.
@@outsandatv Can I use just video input? Not a videocamera?
@bbrother92 Yes, to detect hand data from a video 1 _ install OBS Studio, install Mediapipe and drag into blank touchdesigner project. 2 _ place video in OBS, resize, delete all other entries, set it to loop in the settings and finally press "Start virtual camera" 3 _ Back in touchdesigner drag Mediapipe pose tracking ( for example ) and connect it 4 _ in main Mediapipe, in webcam selection menu, choose OBS Virtual Camera 5 _ create a select CHOP from pose tracking, and select "left_wrist:x" "left_wrist:y"
Cool tutorial. Why did you dive into the mediapipe operator and not use a select on the outcoming data? But maybe it reduces load on the computer. Nice to see the insides. Thank you.
You’re right, that was my thinking but I can’t really say if it makes it more efficient. Selecting the data directly is totally possible without having to change the component. I should have been more clear on that.
Can I use this over found videos or footage I’ve shot?
Yes you just need to drag and drop your footage in the program, then connect a level TOP to adjust the opacity and connect it to the comp top at the end
thanks for your sharing!the tutorial is clear and detailed that it has helped me a lot!
Hello! I am very happy that I was able to complete this tutorial, but instead of a Kinect I use the laptop webcam with MediaPipe and it works great. The only thing is that the coordinates of the hand only move along the -x/-y axes; it is not centered, and stops when the value reaches 0. I don't know if I explained myself, but can you please let me know how to make the xy axes work at all the coordinates, moving the hand from the center of the webcam to every possible part of the screen? Thank you!
I haven't tried it out myself, to be honest with you, but you can get the point near 0 and moving in the right directions by changing the operation to Negate and increasing or decreasing the Post-Add. But I will try it and tell you the solution
My guess is for x : negate and for y : dont negate
@@outsandatv i will try it!!! I’m almost there!!! I’ve spent the last 4 days making tutorials and yours are the best!!
Followed your tutorial and was trying to move a fluid simulation with my hands, but they are mirrored. Do you know how to flip it and set the boundaries of the screen to match the hand movements xy? Thank you for your time explaining this amazing art.
If you want your data to decrease instead of increase, we use a Math to inverse everything and set it back to the same range. Like I showed, change the Math's Channel Pre OP to Negate, and on the second page increase the Post-Add to the value you need.
Or probably if you followed the tutorial you just need to remove the “Negate” and it should work
@@outsandatv this helped!!!! I am now able to move it on the X how it's supposed to be, looking at the projection in front of it. Do you know how to change the value of y using a math? Thank you for your help!!!!
@@yomi0ne for y you probably wont need to negate. Maybe post add a bit.
@@outsandatv i appreciate your help!!! I’m currently doing the particles tutorial now. I need to play with the add as you say because the particles are on the upper corner of the screen and not centered. Thank you for taking the time to respond and to show us this tutorials
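The Negate plus Post-Add fix discussed in this thread is a one-line mirror of a normalized coordinate. A sketch in plain Python (the function name is illustrative):

```python
def negate_post_add(value: float, post_add: float = 1.0) -> float:
    """Math CHOP equivalent: Channel Pre OP = Negate, then Post-Add.

    For a 0-1 normalized coordinate with post_add=1.0 this mirrors
    the axis: 0 maps to 1 and 1 maps to 0.
    """
    return -value + post_add

# a hand at x = 0.25 (left of frame) lands at 0.75 (right of frame):
# negate_post_add(0.25) -> 0.75
```

This is why the x channel usually needs the Negate (webcams mirror you horizontally) while y often only needs a small Post-Add offset to recenter.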
This is so cool
cool af dude thanks
hmmm
hey would this work using a laptop webcam?
Absolutely, using Mediapipe, see my latest short. tutorial coming soon
Hi Outsanda, I followed your tutorial as it is, but the particles are not showing in my file like they are in yours. I don't know why that is... I am working with a Kinect v2.
Hi ! feel free to share a link to your project so i can take a look at what might have gone wrong. Otherwise you might have missed something. Make sure the transform SOP is correct.
I'm gonna publish the working version on my patreon soon.
@@outsandatv thanks for the quick response! I actually solved that issue, but please help me with another problem: at 4:35, when you drag the null CHOP values to the metaball SOP, what option did you select? It says Export CHOP, CHOP Reference, Current CHOP Value. If I do Export CHOP the values just remain 0
Choose reference, but export also works
@@outsandatv Thank you so much for help!! everything works great now😊