Kinect Azure Point Cloud in TouchDesigner Tutorial

  • Published May 27, 2024
  • Get access to 200+ hours of TouchDesigner video training, a private Facebook group where Elburz and Matthew Ragan answer all your questions, and twice-monthly group coaching/mastermind calls here: iihq.tv/Trial
    If you’re a TouchDesigner Beginner, check out our TouchDesigner Tutorial Series! We teach you all the basics of how to use TouchDesigner, including mouse controls, hotkeys, parameters, the operator families, and more: interactiveimmersive.io/touch...
    The Kinect Azure is a fantastic new sensor to use for your depth scanning and skeleton tracking purposes. Most people are surprised by how easily you can create a GPU-accelerated point cloud of instanced geometry with it! In this video, Elburz shows you the basics of working with your Kinect Azure in TouchDesigner and how to put together a quick colour point cloud with only a handful of operators.

COMMENTS • 61

  • @gregderivative2647 2 years ago +1

    Hey Elburz, this is a great tour through the Azure.

    • @TheInteractiveImmersiveHQ 2 years ago

      Thanks Greg! We'll have a bunch more Azure pieces now that I finally got my hands on one :)

  • @kblinse06 2 years ago +1

    Subscribed! Thanks so much for this, I'm looking forward to the videos coming out and I'll def check your program out!

  • @mattsoson 2 years ago +1

    Great meeting you at LDI! Do you think it's possible to output two offset versions of this for the left and right eye, send the two images through NDI (or whatever) into Unity, and route each to the corresponding eye of a connected HMD for 3D viewing? I could do a similar effect with Shader Graph in Unity natively, but I'm thinking of other possibilities with more efficient processing in TD…

    • @TheInteractiveImmersiveHQ 2 years ago

      Hi Matt! Our pleasure :) What you could do is build the instancing setup like this, then create two Render TOPs and two Camera COMPs, then move the cameras slightly apart from each other, which would give you the left/right eye renders that you can NDI over into Unity. Could you give something like that a try?
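
      If it helps, here's a minimal Python sketch of that two-camera setup (Python being TouchDesigner's scripting language). The geo1 name, the eye offset value, and the per-eye NDI naming are assumptions to adjust for your rig:

      # Build one camera, render, and NDI output per eye.
      proj = op('/project1')
      for name, offset in [('left', -0.032), ('right', 0.032)]:
          cam = proj.create(cameraCOMP, 'cam_' + name)
          cam.par.tx = offset                  # half an interpupillary distance, in metres (assumption)
          render = proj.create(renderTOP, 'render_' + name)
          render.par.camera = cam.name         # each Render TOP looks through its own camera
          render.par.geometry = 'geo1'         # the same instanced Geometry COMP feeds both eyes
          ndi = proj.create(ndioutTOP, 'ndi_' + name)
          ndi.inputConnectors[0].connect(render)   # one NDI stream per eye for Unity to pick up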

    • @mattsoson 2 years ago +1

      @@TheInteractiveImmersiveHQ yah I think that was the idea, thanks for the wise/affirmative nod! Looking forward to playing with iPhone LiDAR too from your other vids.

  • @cjadams7434 2 years ago +1

    Very cool effect! Question: does this require TD Pro, or can this be done with Commercial?

    • @TheInteractiveImmersiveHQ 2 years ago

      Absolutely, it can be done with Commercial. It would even work on the free learning version, since the resolution of the point cloud is under 1280x720.

  • @ianemcdermott5090 2 years ago +1

    Great video! Is there any way to import an mkv file from the Kinect DK Recorder and use that for point cloud data in place of a live Azure? Thanks!

    • @TheInteractiveImmersiveHQ 2 years ago

      I haven't done that personally, but you could try pointing a Movie File In TOP at the mkv file and seeing if it loads. If not, you might need an intermediate step of converting the MKV file to another format that supports 32-bit depth, is lossless, and can be read by TouchDesigner (like an exr sequence or similar).
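
      As a quick first test, here's a minimal Python sketch of that "point a Movie File In TOP at it" step; the file path is hypothetical:

      # Try loading the Kinect DK Recorder file directly.
      mov = op('/project1').create(moviefileinTOP, 'kinect_recording')
      mov.par.file = 'recordings/capture.mkv'   # hypothetical path to your .mkv
      print(mov.errors() or 'loaded OK')        # a non-empty string means TD could not read it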

  • @Psybernetiks 2 years ago +7

    Hey! Following the same steps, when I right-click my Kinect TOP to view as points, all the points inside the TOP go mad and the image just randomly shifts around, and the same issue happens with the geo. What could be the issue?

    • @TheInteractiveImmersiveHQ 2 years ago +1

      Before you go into view-as-points mode, do your textures look similar to mine, or are the flat textures also going wild? And what kind of GPU do you have?

    • @kblinse06 2 years ago

      @@TheInteractiveImmersiveHQ I'm getting the same issue. There is a red X in the upper right corner of my Math and Null operators, and the picture in the geo is the depth image, but it's moving around like crazy. I have a 3090, so it shouldn't be the card. But it is the free version of TD, so maybe it has something to do with that?

    • @kblinse06 2 years ago +8

      @@TheInteractiveImmersiveHQ Solved my problem by turning 'adaptive homing' off.

    • @TheInteractiveImmersiveHQ 2 years ago +3

      @@kblinse06 Oh great! Yes, with any kind of particle system or point cloud (basically anything that isn't a static model), I recommend turning off adaptive homing. Some folks like it, but I generally prefer it off, and I set it to default off in the main TouchDesigner application preferences.

    • @liveperformance3608 2 years ago +1

      @@TheInteractiveImmersiveHQ Thank you for this! Solved my issue :)

  • @thesuperh4992 7 months ago

    This is super cool! I have my point cloud looking awesome, but I was wondering what I could do with it from here. I'd like to add some cool effects to it, but I'm unsure where to start. I was thinking about adding a delay so the squares have a little bit of lag getting to the points as they capture movement. Any ideas on how I could make this happen?

    • @TheInteractiveImmersiveHQ 7 months ago +1

      Thanks! 😀 You might try looking into something like the Cache TOP or Time Machine TOP for adding time-delay effects to the point cloud TOP texture (before it's used in the Geometry COMP). Also check out our video Generative Point Clouds in TouchDesigner (ua-cam.com/video/__dHYGe9bQs/v-deo.html) for some additional inspiration and a look at some techniques for processing the data. Hope that helps!
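
      For the Cache TOP route, a minimal Python sketch of the frame delay might look like this, assuming the point cloud texture ends in a null1 TOP as in the video; the cache size and output index are values to tune:

      # Delay the point cloud texture before the instancing reads it.
      cache = op('/project1').create(cacheTOP, 'pointcloud_delay')
      cache.inputConnectors[0].connect(op('null1'))  # null1 = point cloud position texture
      cache.par.cachesize = 30       # frames kept in GPU memory
      cache.par.outputindex = -29    # a negative index outputs an older cached frame (~0.5 s at 60 fps)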

    • @thesuperh4992 7 months ago

      @@TheInteractiveImmersiveHQ Ooooh thanks so much!!

  • @beanco5130 6 months ago

    Could I ask how you'd go about rigging up the color with just a regular old Kinect 2? There isn't a Kinect 2 Select TOP...

    • @TheInteractiveImmersiveHQ 6 months ago

      With the Kinect 2, you can just add additional Kinect TOPs to the network and set the Image parameter to the one that you need. Depending on the setup that you're working with, you might need to use the Camera Remap parameter to align depth image textures with color camera textures.
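
      A minimal Python sketch of that idea, with the caveat that the Image menu tokens ('pointcloud', 'color') and the Camera Remap token ('cameraremap') are assumptions here, so check them against the Kinect TOP's parameter dialog:

      # One Kinect TOP per stream, instead of a Select TOP.
      proj = op('/project1')
      k_points = proj.create(kinectTOP, 'kinect_points')
      k_points.par.image = 'pointcloud'   # point cloud stream (menu token assumed)
      k_color = proj.create(kinectTOP, 'kinect_color')
      k_color.par.image = 'color'         # RGB camera stream (menu token assumed)
      k_color.par.cameraremap = True      # align colour with the depth camera space (token assumed)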

  • @RusticRaver 1 year ago

    Could you show us how to do closed-loop Kinect scanning, if it's possible in TouchDesigner, like in this video: "Large-scale real-time mapping example using Azure Kinect"? Thanks!

    • @RusticRaver 1 year ago

      Actually, I would not recommend buying a Kinect Azure to anyone. I'm surprised nobody points out that it has terrible latency, actually worse than the previous version. It's so bad that it defeats the purpose; I sold mine after a week. Maybe if they fix the SDK it will be usable. Serious rip-off if you ask me.

  • @unnikrishnankalidas967 1 year ago

    Is it possible to connect two DKs to TD? I was trying to make a similar project with two sensors for more detailed depth data.

    • @TheInteractiveImmersiveHQ 1 year ago

      Great question! If you're using the Azure Kinect, you can connect as many to your computer as you have bandwidth for. If you're using the Kinect v2, however, you're limited to just one per computer. Hope that helps!

  • @medialuke2 2 years ago +3

    Is this possible with the Intel RealSense? Or the old Kinect?

    • @TheInteractiveImmersiveHQ 2 years ago

      It absolutely is. A few of the operators might be slightly different, but as long as you have a point cloud texture (which the Kinect 2 provides), you can follow a similar process for sure.

  • @johnmike4914 2 years ago

    Hi, I have a project I need help with. Can I use two D415 RealSense cameras to make one point cloud mesh?

    • @TheInteractiveImmersiveHQ 2 years ago

      You can, but you'll have to do the sensor fusion manually, unfortunately, unless you find another app that does the combination for you. Otherwise, do your best to line up the point clouds using something like the Point Transform TOP to manually orient everything together.
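
      A minimal Python sketch of that manual alignment, assuming the second sensor's point cloud arrives in a TOP named realsense2_pointcloud (a hypothetical name) and that the Point Transform TOP uses the standard transform parameter names; the rotation and translation values would come from measuring or eyeballing your rig:

      # Transform the second sensor's points into the first sensor's space.
      pt = op('/project1').create(pointtransformTOP, 'align_cam2')
      pt.inputConnectors[0].connect(op('realsense2_pointcloud'))
      pt.par.ry = 30.0   # undo the second camera's yaw, in degrees (placeholder value)
      pt.par.tx = 0.5    # slide it back onto the first cloud, in metres (placeholder value)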

  • @marioscharalambous7907 1 year ago +1

    Hello, I would like to ask how I can put this into Unreal Engine 5 with the OWL plugin or something else?

  • @tiporight 2 years ago +1

    Great! Is it possible to do the same with the Kinect v2?

    • @TheInteractiveImmersiveHQ 2 years ago +2

      Yup! A few operators might be slightly different, but as long as you get the point cloud texture from the Kinect 2, you can use almost the same setup as here.

    • @tiporight 2 years ago +1

      @@TheInteractiveImmersiveHQ Could you recommend a tutorial for the Kinect 2? Thanks!

    • @TheInteractiveImmersiveHQ 2 years ago

      @@tiporight Inside of The HQ PRO we have a full course about using Kinect 2 and the different ways you can use all the data (similar to this video). I'd recommend giving the free trial a try, I think you'll really enjoy it:
      interactiveimmersive.io/lp/hq-pro-full-trial/

  • @user-eu4wh3vt7f 2 years ago +1

    Tell us how to use the Azure Kinect and calibrate the camera to project a Kinect image!

  • @Jhj11755 8 months ago

    Can you tell me what the salary range might be for a TouchDesigner career?

    • @TheInteractiveImmersiveHQ 8 months ago

      This depends on a lot of factors, including location, experience, whether you're full-time or freelance, etc. It's worth checking out our blog (interactiveimmersive.io/blog/) for some material on the topic, or joining The Interactive & Immersive HQ PRO (interactiveimmersive.io/lp/hq-pro-full-trial/) to get assistance from industry pros in finding the appropriate range.

  • @lee_sung_studio 11 months ago

    Thank you. 감사합니다.

  • @kkjoky 2 years ago +1

    Yooo that's hard!!!

    • @TheInteractiveImmersiveHQ 2 years ago +1

      Haha I know right? How many developers does it take to make a colour point cloud? :)

  • @NickFidalgo 9 months ago

    How would I output from Geometry to an image file (png)? In addition, how could I have it save and overwrite that image file every x seconds?

    • @TheInteractiveImmersiveHQ 9 months ago +1

      First, you'd need to add a rendering pipeline, which consists of a Camera COMP, a Light COMP, some kind of material (a Phong MAT might work here), and a Render TOP. You'll need to make sure that you've assigned the MAT to the Geometry COMP by dragging the MAT onto the Geo COMP's _Material_ parameter on the Render page.
      Once you've done this, you'll have a rendered view of the network output as a texture in the Render TOP, which you can then save, add post effects to, or whatever else you might want to do with it.
      To save the texture to a file, you can use the Movie File Out TOP. Set the _Type_ parameter to Image, and then pick the file type you want via the _Image File Type_ parameter.
      To repeatedly save images after an interval of time, you can use the Timer CHOP. On the Timer page of the Timer CHOP's parameters, set the _Length_ parameter to the number of seconds you want the interval to be, then turn the _Cycle_ parameter on and the _Cycle Limit_ parameter off. Finally, on the Outputs page, turn _Cycle Pulse_ on.
      Then, make a CHOP reference from the newly added cycles_pulse channel to the _Add Frame_ button in the Movie File Out TOP, and it will save the image every time the timer finishes an interval! If the _Unique Suffix_ parameter in the Movie File Out TOP is turned off, the file will be overwritten each time, as the file name will stay the same. Hope that helps!
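
      Pulling that together, here's a minimal Python sketch of the whole save loop; the parameter tokens (type, imagefiletype, uniquesuffix, length, cycle, cyclelimit, cyclepulse, addframepulse) are assumptions spelled from the parameter labels, so verify them in the dialogs:

      # Save render1's texture to the same png every few seconds.
      proj = op('/project1')
      mfo = proj.create(moviefileoutTOP, 'snapshot')
      mfo.inputConnectors[0].connect(op('render1'))  # render1 = your Render TOP
      mfo.par.type = 'image'            # single image rather than a movie
      mfo.par.imagefiletype = 'png'
      mfo.par.file = 'snapshot.png'
      mfo.par.uniquesuffix = False      # same file name every time, so it gets overwritten

      t = proj.create(timerCHOP, 'save_interval')
      t.par.length = 5                  # seconds per cycle (pick your own interval)
      t.par.cycle = True                # restart automatically
      t.par.cyclelimit = False          # keep cycling forever
      t.par.cyclepulse = True           # expose a cycles_pulse output channel

      # CHOP-reference the pulse channel into the Add Frame pulse parameter.
      mfo.par.addframepulse.expr = "op('save_interval')['cycles_pulse']"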

  • @PashkovGS 1 year ago

    Please tell me how to add a video texture to a point cloud if I have a Kinect 2? Thanks!

    • @TheInteractiveImmersiveHQ 1 year ago +1

      Sure, you'd follow the same approach described in the video, but instead of using the Kinect Azure/Kinect Azure Select TOPs, you can use the Kinect TOP and set the Image parameter to Depth Point Cloud or Color Point Cloud. Everything else should be the same from there. Hope that helps!

  • @drrobotsir 1 year ago

    How can we save the point cloud to a file, so it can be used by other apps such as Meshlab?

    • @TheInteractiveImmersiveHQ 1 year ago +2

      Great question! To save the point cloud to a file, you can use the Movie File Out TOP. The Movie File Out TOP allows you to save .exr files using the OpenEXR format.
      To do this, connect the null1 TOP to a Movie File Out TOP. Then, in the Movie File Out set the Type parameter to Image, and the Image File Type parameter to OpenEXR. On the EXR parameter page, turn on the “Save As Point Cloud” setting. Below that, you have the ability to choose how you want to save the data from the null1 TOP.
      In the default configuration, you’ll only get the point positions and colors found within null1. However, you can also save the colors from null2 into the same .exr file by clicking the small “+” icon below the alpha parameter, which allows you to add additional data from a separate TOP. Drag the null2 TOP into the Additional Input TOP 1 parameter, and then make sure to rename the RGBA channels to something else, because channels with the same name will overwrite each other. I usually rename the first set of channels (directly below the “Save As Point Cloud” switch) to X, Y and Z, as I use them for position data, and then use the channels below Additional Input TOP 1 as R, G, B, and A, as I use them for colour.
      Back on the main Movie File Out page, make sure you've set the file name you want to use (under the File parameter), and then click Add Frame to save the file. If you want to save a sequence of images instead, you can change the Type parameter from Image to Image Sequence, and then turn the Record switch on (it'll record an .exr for every frame until you turn the switch off).
      Hope that helps!
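
      Here's a minimal Python sketch of the single-image version; the tokens (type, imagefiletype, saveaspointcloud, addframepulse) are assumptions read off the parameter labels, and the channel renaming plus the null2 Additional Input step are left to the parameter dialog since those tokens vary:

      # Export null1's positions as a point cloud .exr (colours from null2 get added in the dialog).
      mfo = op('/project1').create(moviefileoutTOP, 'pointcloud_export')
      mfo.inputConnectors[0].connect(op('null1'))   # position texture from the video's network
      mfo.par.type = 'image'
      mfo.par.imagefiletype = 'openexr'
      mfo.par.file = 'pointcloud.exr'
      mfo.par.saveaspointcloud = True   # the "Save As Point Cloud" switch on the EXR page (token assumed)
      mfo.par.addframepulse.pulse()     # same as clicking Add Frame once (token assumed)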

  • @yuan-dongzhuang5971 7 months ago

    Great!

  • @shanukagayashan 9 months ago

    Is it possible to read the diameter of a ring?

    • @TheInteractiveImmersiveHQ 8 months ago

      Could you clarify what sort of ring you're looking to measure? Are you looking to take the measurement from the Azure's point cloud?

  • @JasonTopo 2 years ago +1

    This is not possible with a regular webcam, right? 😅

    • @TheInteractiveImmersiveHQ 2 years ago +1

      Nope, because what the Kinect is doing is actually giving you all of the 3D information of the scene, then using that data to put little boxes all over the 3D environment, and then colouring them using its normal RGB camera. A webcam on its own only contains RGB information; it doesn't have any 3D data streams in it.

  • @-303- 2 years ago +1

    Azure rhymes with measure. “Azhurr”.

    • @TheInteractiveImmersiveHQ 2 years ago

      Is that the official ruling? I feel like everyone says it a bit differently haha, makes my life difficult!