Always a pleasure watching your Kinect magic 🙂
Thank you, but dangit, this was supposed to be a "short," hence the vertical format, but now YouTube refuses to treat it as such. Ugh.
Been following you for years after finding the first Kinect 3D video cap. Keep up the good work.
Ah the good old days, that video made me subscribe to you.
Btw, have you tried out the OpenCV OAK-D camera? I was really impressed with the depth data it generates with stereoscopy.
I have not. I contacted the company a while ago to ask about API support etc., and they basically said "we are not interested in your use case." So that's that.
@okreylos Oh. What a shame.
I believe the way to sync the cameras is with a 3.5mm audio cable.
I saw that feature for Kinect for Azure. In this case, though, it's the two cameras inside a single Kinect (color and depth) that are out of synchronization, causing the relative shift between color texture and 3D geometry.
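For anyone curious what that desynchronization means in practice: the color and depth streams arrive as independently timestamped frame sequences, so a viewer has to decide which color frame to texture onto each depth frame. Below is a minimal, hypothetical Python sketch of nearest-timestamp pairing, one common way to limit the visible shift; it is not the code used in this video, and the function name and the 0.033 s tolerance are my own assumptions.

```python
# Hypothetical sketch (not the video author's code): pair each depth frame
# with the nearest-in-time color frame to reduce the texture/geometry shift
# when the two cameras inside one Kinect free-run out of sync.
from bisect import bisect_left

def pair_frames(depth_timestamps, color_timestamps, max_offset=0.033):
    """For each depth timestamp, pick the index of the closest color
    timestamp; use None if the best match is more than max_offset seconds
    away (roughly one frame period at 30 Hz)."""
    pairs = []
    for t in depth_timestamps:
        i = bisect_left(color_timestamps, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(color_timestamps)]
        best = min(candidates, key=lambda j: abs(color_timestamps[j] - t))
        if abs(color_timestamps[best] - t) <= max_offset:
            pairs.append((t, best))
        else:
            pairs.append((t, None))  # no color frame captured close enough in time
    return pairs

# Example: the color stream lagging the depth stream by about 20 ms
depth_ts = [0.000, 0.033, 0.066, 0.100]
color_ts = [0.020, 0.053, 0.086, 0.120]
print(pair_frames(depth_ts, color_ts))
```

Even with pairing like this, the two cameras still expose at slightly different instants, which is why a moving subject can show a residual shift between color texture and 3D geometry.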
0:35 How could we knock someone with the moves of a motion capture actor?
Really though, this video makes me think of some kind of trailer for Kinect or Bomb Rush Cyberfunk with that dancing; the capture is that good.
Hello! I’m a VR/MoCap/AR etc. novice. Can you please explain how you set up your software? I downloaded it from your website but am unsure how to proceed. I simply would like to recreate the video you have posted here. What is the easiest way to do so? I have a Kinect V2, a Windows computer, etc. I just don’t know how to run the software. Thank you so much!
There are detailed instructions to install the software here: web.cs.ucdavis.edu/~okreylos/ResDev/SARndbox/LinkSoftwareInstallation.html
Those instructions are for the AR Sandbox, but you only need the Vrui and Kinect packages to recreate this video, so you can stop after installing those two packages.
Note that the software only works on Linux, not on Windows.
Does this work with Azure Kinects?
In principle yes, but I haven't tested those yet, so there is no driver support for them in the software.