📣 Consider becoming a Patreon and gaining access to today's Unity project and many others:
www.patreon.com/dilmerv and GET MY “Full Source Code” Tier
*** IMPORTANT INFO FOR URP SUPPORT ***
I did additional testing and got URP to work. Take a look at this information if you're using URP in your project:
1- Be sure to have a scene model available. You will need to scan your area beforehand.
2- You must install both packages below. The first one has dependencies needed to read stereo data:
- Main Package: github.com/oculus-samples/Unity-DepthAPI.git?path=/Packages/com.meta.xr.depthapi
- URP Package shaders: github.com/oculus-samples/Unity-DepthAPI.git?path=/Packages/com.meta.xr.depthapi.urp
3- I recommend using this project as I made sure to have all the settings correct:
github.com/dilmerv/MetaDepthAPIWithURP
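For reference, installing the two packages from step 2 as UPM git dependencies would add entries roughly like these to `Packages/manifest.json` (a sketch; version pinning and any additional dependencies the packages pull in are omitted):

```
{
  "dependencies": {
    "com.meta.xr.depthapi": "https://github.com/oculus-samples/Unity-DepthAPI.git?path=/Packages/com.meta.xr.depthapi",
    "com.meta.xr.depthapi.urp": "https://github.com/oculus-samples/Unity-DepthAPI.git?path=/Packages/com.meta.xr.depthapi.urp"
  }
}
```

The same URLs can also be pasted into Package Manager via "Add package from git URL".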
📌 Helpful resources, including Depth API Guitar demos, available below:
developer.oculus.com/blog/mesh-depth-api-meta-quest-3-developers-mixed-reality/
github.com/oculus-samples/Unity-DepthAPI
You are amazing bro..! We want more videos for Quest 3..!!
Thanks a lot my friend, working on a new one right now 😉🔥
Thank you very much for your content you´re always ahead. I'm developing for Quest 3 and my goal is Mixed Reality and your content is always amazing and deep. You´re my number resource for VR/MR keep the great work, from a brazilian fan.
Thanks for your feedback and congrats on your goal, mixed reality is going to be huge and having a part in the ecosystem will be amazing for you and many others who join right now.
Thank you for the consistently good lectures. I understood and implemented everything else, but from 15:10 in your video, how did you get a real book to cover the front of the virtual object? I'm curious.
Great, expecting more Depth API tutorials!
There will be more for sure, thanks for your feedback!
Now you got me! Waiting for this to get into MR dev. Thank you so much.
I know the feeling! Thanks for sharing your excitement and support!
@@dilmerv thanks for your reply. Do you already plan to look into Mixed Reality multiplayer development with spatial anchors and point cloud sharing, for developing local multiplayer games? I really wonder whether this is already in the main SDK v57?
By the way, I will start my pledge on Patreon next weekend, because you're an absolutely perfect source for my current interests. Thank you so much.
hey, not sure if anyone else has tried this tutorial yet? It seems the toggler is put in the Scripts folder, but then never put in the scene and configured.
I saw in the video you don't do that, but somehow the toggler seems to work for you. I had to set it up with steps I figured out myself:
> make an empty GameObject called toggler
> drag the toggler script onto it
> in the Inspector, in the occlusion toggler script, I selected the "Environment Depth Occlusion" object as the Occlusion Controller
This seems to make the difference, and it worked.
Also, I couldn't download the robot asset; the Asset Store seems to be down for me today. Anyone else?
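The setup steps above would correspond to a script roughly like this (a hypothetical sketch, not the exact sample script; the `EnvironmentDepthOcclusionController` type and the `Meta.XR.Depth` namespace are assumptions based on the Depth API samples and may differ by package version):

```
using UnityEngine;
using Meta.XR.Depth; // assumption: namespace of the Depth API package

// Hypothetical toggler: attach to an empty GameObject and drag the
// "Environment Depth Occlusion" object's controller into the field below.
public class OcclusionToggler : MonoBehaviour
{
    // Assumed component type; in the samples this controller switches
    // between no / hard / soft occlusion modes.
    [SerializeField] private EnvironmentDepthOcclusionController occlusionController;

    private void Update()
    {
        // Example input: cycle the occlusion mode with the A button.
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            // ...call into occlusionController here to switch modes...
        }
    }
}
```

The key point from the comment is simply that the script does nothing until it lives on a GameObject in the scene with its controller reference assigned.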
Oops, thanks for the feedback, Tom. I must've missed that during editing and cut that section by mistake. I appreciate you sharing the steps with the community, you're awesome!
@@dilmerv no worries, it feels good to contribute something after so much take take take from your channel 👊🏻
Thank you!
hey man,
Thank you for the hard work. I really learned a lot from you.
Thanks a lot man, I really appreciate your support and feedback!
Is this OVR only or can we use it in OpenXR?
That was impressive!
Thanks for your feedback 🙏
Thank you Dilmer! Great tutorial as always! By the way, I have an error after importing the Depth API and Oculus Integration; Unity tells me "'XRDisplaySubsystem' does not contain a definition for 'GetRenderTexture' and no accessible extension method..." Any clue about this?
Hey there, thanks for your comment! What versions of Unity and Oculus Integration are you installing?
This is so cool! Your channel is so underappreciated and deserves more subscriptions. Do you know if it works with a Quest Pro (from the documentation, it seems this only works on a Quest 3)?
Hey thanks for your nice comment! To answer your question, this feature only works with a Quest 3 because it is equipped with a depth camera, Quest Pro doesn’t have one and therefore can’t generate depth data needed to occlude virtual content. Good question!
great sample!
Glad you liked it, thanks for your feedback!
This is sick!!! Damn you for making me want to get a q3 😂
Awesome then I am doing my job 😃 it is a very cool device, thanks for watching Sam!
Very good! Thanks for sharing. Excellent info.
Thank you for your time and excellent comment, cheers!
Finally, real-time depth for VR!
Couldn’t agree more and I share your excitement 🥳🔥
Awesome video!
I'm wondering if/how it's possible to place an object on top of the "real" table right from the start. Am I able to get access to the information about the furniture set up by the Quest user?
Yes, you can access any of the mesh information from the start as long as you set up the scene model, which is required for this experience.
Thank you! A video on how to access this mesh information, i.e. how to place an eating plate automatically on a mesh marked as a table, would be AWESOME @@dilmerv
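A rough sketch of what that could look like with Oculus Integration's scene API, assuming the user's scene model has already been loaded (the `platePrefab` field is a hypothetical example; `OVRSemanticClassification` and the `Table` label come from the Scene Understanding samples and may vary by SDK version):

```
using UnityEngine;

// Sketch: find a scene anchor classified as a table and place a prefab on it.
// Assumes OVRSceneManager has already loaded the user's scene model.
public class PlaceOnTable : MonoBehaviour
{
    [SerializeField] private GameObject platePrefab; // hypothetical prefab

    public void PlaceOnFirstTable()
    {
        foreach (var classification in FindObjectsOfType<OVRSemanticClassification>())
        {
            if (classification.Contains(OVRSceneManager.Classification.Table))
            {
                // Spawn slightly above the table anchor's position.
                Instantiate(platePrefab,
                    classification.transform.position + Vector3.up * 0.02f,
                    Quaternion.identity);
                break;
            }
        }
    }
}
```

The anchor's transform gives you position and orientation, so snapping content to the table surface is mostly a matter of offsetting along the anchor's up axis.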
Hi Dilmer, thank you so much for the great tutorials. I am trying to inherit from the Grabbable script that ships in the Package Manager, but when I go to the definition of the Grabbable script it resolves to the metadata Grabbable script instead of the Package Manager Grabbable script. I need to access the BeginTransform, UpdateTransform, and EndTransform methods from the Package Manager Grabbable script to add some variables.
I want to override the BeginTransform, UpdateTransform, and EndTransform methods from the Package Manager Grabbable script in a custom script.
Can you please help me with this?
Thanks in advance.
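If those methods are declared virtual in the package's Grabbable (worth verifying, since this depends on the Interaction SDK version), a subclass would look roughly like this; the method names come from the question above, and the `Oculus.Interaction` namespace is an assumption:

```
using Oculus.Interaction; // assumption: namespace of the package's Grabbable

// Hypothetical subclass; only valid if Grabbable exposes these as virtual.
public class CustomGrabbable : Grabbable
{
    public override void BeginTransform()
    {
        base.BeginTransform();
        // custom state setup here
    }

    public override void UpdateTransform()
    {
        base.UpdateTransform();
        // custom per-frame logic here
    }

    public override void EndTransform()
    {
        base.EndTransform();
        // custom cleanup here
    }
}
```

Also note that "Go to Definition" landing on metadata usually means the package is read-only under Library/PackageCache; to subclass it from your own code you typically need an assembly definition that references the package's asmdef, or you can embed the package into the project's Packages folder to make it editable.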
Does Quest 3 support post-processing in passthrough?
Awesome content as always! Would love to see one using AR Foundation; I am super interested in image tracking for content placement. I find anchors super tedious, so image tracking would be awesome for aligning my scene to the real world.
Thanks for your great comment! Currently, image tracking is not supported on the Quest 3. I did notice a QR code scan feature when setting up the device, so perhaps Meta will expose image tracking features in the near future.
@@dilmerv thanks for the reply! Well, that's disappointing; I just assumed all the features from AR Foundation would port over. Anchors it is for now then 😮💨
Hi there, thanks for the video! Is it possible to access the created mesh directly and use for other purposes? Like at 14:44.
Great question, I am not 100% sure but I will look into it and let you know. Thanks for your feedback!
Thank you for another great tutorial, but I have one question.
Is there any option to access heart rate from Meta Move? In some games it's accessible, but is it also accessible in Unity? I mean, is there any API, etc.?
Thank you for your great work!!!
Interesting question and to be honest I am not sure. I will look into this more and let you and the community know. Thanks for your time!
Amazing
Thank you 🙏
Great, thanks a lot for the tutorial :) I have one question: does the Depth API also work when running the app in the Unity Editor using Quest Link?
I got scene understanding to work with Oculus link but occlusion unfortunately is not working as of now, more likely Meta will add it later on. Good question!
The tutorial helped a lot while developing for mixed reality, thanks a bunch! But I've hit a strange bug: when exporting the build via 'Build and Run' the occlusion works perfectly, but when starting the app on the Quest itself (installed from unknown sources), the game object with the depth shader isn't visible at all. Does anyone know how to fix this? It's a bit frustrating when nothing changed between export steps.
Yes, that's normal: you need to connect the Quest device via USB-C and run the experimental command through adb, as I showed in this video. Also, it will reset after restarting the headset, so every time you restart you will need to run the command again. This is only while the feature is in experimental mode; once Meta promotes it to production you should be good.
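For reference, the experimental-features command mentioned above is typically run like this over adb with the headset connected (this is the property name Meta documented for experimental features at the time; it may change once the feature leaves experimental mode):

```
adb shell setprop debug.oculus.experimentalEnabled 1
```

As noted, the property does not persist across headset reboots, so it has to be re-run after every restart.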
@@dilmerv thank you so much for your fast reply! Then I hope Meta will promote it soon 😅
have a nice day!
Have you had a chance to compare the oculus integration tools with the new ar foundation mr template? With the vision pro coming out, I'd like to be able to reuse as much code as possible with ar foundation.
I haven’t yet but I will review it this weekend and possibly do a video about my experience, thanks for your suggestion and feedback!
@@dilmerv hey, I did a bunch of AR dev a few years ago and eventually understood all the different SDK options. Now I'm looking at VR/MR and I'm so confused by all the different Meta/OpenXR/AR Foundation/MRTK variations; specifically with regard to Quest development there seems to be some degree of overlap. Would love a video that talks about all of them and the pros and cons.
Great tutorial, but despite this video being posted 2 weeks ago, Oculus Integration is now deprecated. It has changed to Meta XR Core and other packages, and it seems bugged.
Thanks, and yes, you are right, they changed packages right after I published! But no worries, I will update this video soon!
Hi, is image tracking and model tracking possible on Meta Quest 3 in Unity?
Hey great question but currently that is not an option. I will keep everyone updated if Meta changes that with future versions.
thanks bro.
Thank you for watching 🙏
Did you try implementing using URP?
When I try, unity tells me there's multiple errors in the shaders and objects turn invisible when using them
I tried the demo project with URP and deployed it just fine, but I didn't build a project from scratch with URP; let me do that very quickly and let you know.
Also, what version of Unity did you use and what computer type (PC or Mac) ?
Ok, I did additional testing and got it to work. Take a look at this information for URP:
1- Be sure to have a scene model available. You will need to scan your area beforehand.
2- You must install both packages below. The first one has dependencies needed to read stereo data:
- Main Package: github.com/oculus-samples/Unity-DepthAPI.git?path=/Packages/com.meta.xr.depthapi
- URP Package shaders: github.com/oculus-samples/Unity-DepthAPI.git?path=/Packages/com.meta.xr.depthapi.urp
3- I recommend using this project as I made sure to have all the settings correct: github.com/dilmerv/MetaDepthAPIWithURP
Let me know if you have additional questions.
Thanks for the reply @@dilmerv. I'm on PC, Unity 2022.3.11f1, Oculus XR Plugin 4.2.0-exp-env-depth.1, Universal RP 14.0.9, Oculus Integration 57.0, and I installed both the Main Package and the URP Package shaders.
@@dilmerv I found that pressing the Play button in the Unity editor causes the problem. It works as long as I don't press that.
Have you had issues with the app crashing? My Quest 3 asks for the room setup, then after the "made with Unity" logo, the app crashes. Any thoughts? Experimental features are enabled. Thanks!!!
Interesting James, is it crashing with a simple app / game, or are you running a basic demo? I am wondering how many objects have occlusion? It may be good to go over the logcat logs to find out why it crashed.
Good point, I did have two objects with the shader on. I wonder if it didn't like the shader on that game object; I didn't follow your scene and didn't use the robot demo etc. I will try again and see what happens. Thanks for the time!
@@dilmerv
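To follow up on the logcat suggestion in this thread, crash logs can be pulled from the headset over USB with standard Android tooling, for example (tag names here are the usual Unity/Android ones, not specific to this project):

```
adb logcat -s Unity AndroidRuntime CRASH ActivityManager
```

Running this while reproducing the crash usually shows the managed exception or native stack trace that caused the app to close.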
When I use the Depth API shader on a material it makes the object turn transparent, any suggestions?
What rendering pipeline are you using, and what's the name of the shader you added to the material?
I'm using the built-in render pipeline and the shader is the Occlusion Standard shader from Meta @@dilmerv
Actually, the objects turn transparent after I press Play in the editor and they stay transparent after that, and I get these 2 warnings:
- [OVRSceneModelLoader] LoadSceneModel failed. No link or HMD detected.
- DepthAPI: no environment texture
When I build the app I get these warnings; it seems like some conflict between Vulkan and DirectX maybe?
- Shader warning in 'Meta/Depth/BiRP/Occlusion Standard': use of potentially uninitialized variable (CalculateEnvironmentDepthSoftOcclusion_Internal) at Project/Library/PackageCache/com.meta.xr.depthapi@13b6d11e6a/Runtime/Core/Shaders/EnvironmentOcclusion.cginc(75) (on d3d11)
- Shader warning in 'Meta/Depth/BiRP/Occlusion Standard': 'i': loop control variable conflicts with a previous declaration in the outer scope; most recent declaration will be used at Project/Library/PackageCache/com.meta.xr.depthapi@13b6d11e6a/Runtime/Core/Shaders/EnvironmentOcclusion.cginc(78) (on d3d11)
for either Vulkan or d3d11
@@dilmerv
One of the things I encountered while trying to test the Depth API is that the new Meta XR All-in-One SDK is automatically placed in Packages instead of Assets. This isn't the biggest issue, but all the materials within that package are pink, and no matter how I try to upgrade them to the Universal Render Pipeline, they always automatically go back to pink... Also, the Meta/Occlusion shader does not exist in the new Meta XR All-In-One... I'd love to know if anyone has found a solution to that.
Thanks for the info, I will cover this in my next video.
@@dilmerv can't wait! Looking forward to your next video :)) Also, it does seem like there are some features that Oculus Integration has that the Meta UPM All-in-One SDK doesn't. I wonder if you still need to install Oculus Integration v47 so it can work with the Meta All-in-One... Very gratuitous, but I guess there's not much of an option...
Perfect! Can you do one using Unreal?
Thanks for your feedback I will look into it!
Been waiting for this since 2017... 😀. The problem now is whether to buy the Quest 3 or wait for competitors like Apple, Samsung... This tech is not yet inexpensive...
It is definitely a big investment. Apple will never be affordable, though yes, others may be a bit more accessible, but the price Meta offers with the Quest 3 today is honestly unbeatable. It may sound a bit expensive, but they are probably losing money today with hardware at $499.
I agree with Dilmer on this one. The Quest lineup of devices is shockingly affordable when you factor in the cost of the hardware onboard the device. Quest 3 does 5x what an Xbox Series X can do for the same price.
Also, I don't expect Apple, Samsung, or any other competitor to have a cheaper option anytime soon.
If you are a mixed reality dev, buy it now!! If you are a gamer, then the Quest 2 is maybe a good buy.
❤❤❤❤
Thank you 🙏 I am glad you liked it!
🎉 This is very nice 👌
I'm building a similar project and I would like to collaborate and learn
Thanks for your feedback and have fun with your project 🙏
Oculus Integration was deprecated today :O The new full Meta suite has launched!
Thanks for the info!