How To Make a Quest 3 Mixed Reality Game - PART 3 - Depth
- Published 29 Sep 2024
- Welcome back to the third part of this tutorial series about Quest 3! In this video we are going to fix one of XR's biggest issues: depth!
❤️ Support on Patreon : / valemvr
🔔 Subscribe for more Unity Tutorials : www.youtube.co...
🌍 Discord : / discord
🐦Twitter : va...
👍 Main Channel : / @valemvr
🔥 Tiktok : / valemxr
Github Meta Depth API : github.com/ocu...
····················································································
❗❗❗ WISHLIST MY FIRST VR GAME ON STEAM ❗❗❗
store.steampow...
····················································································
📝Best VR Assets (These are affiliate links, which means it costs you the same and I get a small commission. Thanks for your support!)
VR Interaction Framework
assetstore.uni...
Hexabody (Physics VR Player)
assetstore.uni...
Mirror and Reflection for VR
assetstore.uni...
Auto Hand (automatic hand grab pose)
assetstore.uni...
Hurricane VR (vr physics interaction)
assetstore.uni...
3d Hand Model for VR
assetstore.uni...
····················································································
Full Series on How to make a VR game • How to Make a VR Game ...
····················································································
If you enjoyed this video, here are some other really good channels you should check out:
⌨️ Game Dev
Brackeys : / @brackeys
Dani : / @danidev
Code Monkey : / @codemonkeyunity
👨🎓 VR Dev
Justin P Barnett : / @justinpbarnett
Vr with Andrew : / @vrwithandrew
····················································································
#vr #vrdev #madewithunity
For anyone out there who has a problem where the passthrough mesh only appears in one eye: go to Edit > Project Settings > Oculus, open the PC tab, and change the Stereo Rendering Mode from Single Pass to Multi Pass.
Hello, grasping the logic imposed by the structure of the various VR assets is complex on one's own ........ thank you for walking through the construction logic of the different plug-ins.
Hi, I'd like to ask: when I build this for Quest Pro, hand passthrough depth works, but I can still see the ball spawned behind the furniture and walls I've set up.
Brilliant tutorial again. Just a quick question: when I build and run the game for Quest 3 it just goes to my Quest PC library on the computer, so I still have to have the Link cable connected to play. Is there a way to get it into the Quest 3 headset library so I can play without the computer connected?
I guess you made the same mistake as me. Go into Build Settings before choosing Build, then change the platform from PC to Android (he shows it in the first video). Then it is saved directly on the Quest when you have it connected and choose Build & Run (or Patch & Run if you build revisions).
Approx 5:20 into part 1 of the 3 tutorials.
@@JIOmland I've got it set to Android, but the game seems to end up in the Oculus app on the PC, so I can play with the Quest Link cable connected. I wanted it on the Quest itself so I don't need the PC connected to play.
@@JIOmland Finally got it: after clicking the search bar in Apps and then clicking Unknown Sources, the app was there, and now I can run it while not connected to the computer. I'll delete the app from the headset and do it again tomorrow to make sure I got it working, since I built and ran with the USB connected but without enabling Quest Link, and then tried with it connected.
Hey Quentin!
Thanks for the great tutorials. I have a few questions:
1. Is it possible to use information about the real environment (light, reflections) to influence 3D objects in a scene?
2. If the answer to the first question is no, is it possible to cast shadows from Unity-created light sources on the global geometry created by Quest 3?
2:43 If using Meta All-in-One SDK, Material Shader might be in Oculus → SelectivePassthrough ?
If anyone get this error in their app: `DepthAPI: no environment texture`, then you have probably restarted your headset. Make sure to enable experimental features by running this command in your terminal: `adb shell setprop debug.oculus.experimentalEnabled 1`
Amazing. Could you do a tutorial about how to dynamically place objects in Mixed Reality? It would be very interesting to have some input on these things:
- As every room scan is different how can I make sure that e.g. an object is always on the wall or in a specific corner of the room?
- How can I avoid placing an object in e.g. a corner if there is already a real life object and instead place it on top or in some other part of the room?
Thank you and keep up the good work!
Figured this out! I wanted to place a cube on the top of my desk. Attach a script to the cube in question, create an OVRSceneManager, and call its method 'GetSceneAnchors' (you need to supply an empty list of OVRSceneAnchors). This gets you all the anchors the user set up during scene selection.
Next, go through the list of anchors and look for whatever semantic label you want ('table' for me). Then move the cube's transform (transform == where the object is, I think) to that of the anchor. I got GPT-4 to help me with this, btw.
I now have a cube that renders itself on the desk in front of me.
@@AIandsuch Code example? Got the same issue.
Specifically looking for a semantic label.
@@AIandsuch I'm a bit confused reading this. Can you give me an example of the implementation?
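The approach described above (get the scene anchors, filter by semantic label, snap the transform) could be sketched roughly like this. This is an untested sketch assuming the deprecated Oculus Integration API (OVRSceneAnchor, OVRSemanticClassification); class and label names may differ in newer Meta SDKs, where MRUK replaces this workflow:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Attach to the object you want placed on a table (e.g. the cube).
// Assumes OVRSceneManager has already loaded the user's room scan.
public class SnapToTable : MonoBehaviour
{
    void Start()
    {
        // Collect all scene anchors created from the room setup.
        var anchors = new List<OVRSceneAnchor>();
        OVRSceneAnchor.GetSceneAnchors(anchors);

        foreach (var anchor in anchors)
        {
            // Each anchor can carry semantic labels such as "TABLE",
            // "COUCH", "WALL_FACE", etc.
            var classification = anchor.GetComponent<OVRSemanticClassification>();
            if (classification != null && classification.Contains("TABLE"))
            {
                // Move this object to the anchor's pose.
                transform.SetPositionAndRotation(
                    anchor.transform.position, anchor.transform.rotation);
                break;
            }
        }
    }
}
```

Note that the anchor's pose is the table surface's plane origin, so you may still need a local offset to sit the object on top rather than at the plane's center.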
You can get all this information here by using the Mixed Reality Utility Kit from Meta: ua-cam.com/video/n6YZlp4yMwM/v-deo.html&t
Incredible. Exactly why I'm subscribed.
Does anyone know what it means if the MixedReality/SelectivePassthrough material occlusion works in the left eye but not the right?
If anyone has the same issue: open Project Settings > XR Plug-in Management > Oculus > Stereo Rendering Mode and set it to 'Multi Pass'. This will allow it to work, but it doesn't seem like a good idea for performance down the line.
@@TheRealKingOfCringe I have the same problem today, but your fix does not work.
Over Oculus Link only one eye has occlusion working, and even in the good eye (the left) there is some kind of wireframe artifact from the global mesh.
If I build to device, everything in the scene is black.
EDIT: I'm using URP, not sure if that could be the problem?
While I appreciate the tutorials, if you want to build a larger audience quickly, it is probably best to try to answer the questions we have in the comments. There are only 70 or so comments on this, so it should not be that hard. The fact of the matter is that this part does not work with the newer Meta XR package. I have tried 3 different selective passthrough shaders, and none of them works properly (they all fail to show the hand in front of the cube). I have played with settings, etc., and nothing works. I have confirmed that hand gestures work fine, so it is active; it is just the shader that fails... I really don't see the point in going through a tutorial with a deprecated SDK...
Did you find a solution?
@@wolfrowhy Make a new Unity project, install the old Oculus Integration SDK (deprecated), then copy the whole folder /Assets/Oculus/SampleFramework/Usage/SceneManager to your current Unity project in order to have the right prefabs AND!!! scripts. You will find all the necessary shaders and prefabs that aren't in the current Meta package.
@@wolfrowhy I'm afraid not, but I ran out of patience pretty quickly in fairness.
I gave up on the Depth API; with the latest version I couldn't follow the tutorial and I almost broke my previous work. Waiting for a safer update.
Any updates regarding the Depth API for the v60 SDK?
The documentation hasn't been updated yet on meta's website
If you imported only the prefabs from the deprecated package, you won't be able to use the mesh collider prefab because you haven't imported the associated script; it will say "SAVING FAILED". You need to import the setMeshBarycentricCoordinates script and it will work :)
Your channel has so many high quality tutorials, it is insane! Thank you! Could you make a MR portal tutorial?
Next video how to add Stars in room?
These videos are amazing, save me so much time to not have to research all this on my own!
Can image recognition be achieved through pass-through video streaming and WebSockets data transmission?
When I use the Depth API shader on a material it makes the object turn transparent. Any suggestions?
Please continue the tutorial. I want to be able to move the cubes (preferably in the x/z direction with the joystick on the controller) and have them follow the floor plane otherwise.
Best tutorial for quest 3 ❤
This tutorial doesn't work for me. I'm trying to add an Environment Depth Occlusion but it won't show up -_-
10:30
When are you going to make videos in French? :) I really like your videos, but your accent tires me out during long sessions.
The first method doesn't work; hand tracking occludes nothing.
Awesome tutorial! Following the tutorial, the cube is always rendered on top even when I move my hands to the front. I'm using Meta XR All-in-One SDK v60.0.0 with the SampleFramework imported from the deprecated Oculus Integration package. Any ideas how to fix this?
did you ever figure this out? running into the problem now
It seems that the "VertexFadedPassthrough" shader kind of works when running over Quest Link, but not after the build. I guess that means the shader does not work after compiling.
I say it "kind of works" because it works for the left-eye image but not the right-eye image.
Keep them coming!! We need more devs for this tech!
Great Tutorial. I'm running into a couple of issues since the Depth API updated to 0.1.1 in the last day or so.
* Depth tracking on Hands no longer working, 3D objects render over hands
* Bogus 'Android SDK not found' alerts on Build & Run to Quest 3
Has anybody else run into this, or figured out the solution?
I'm also trying to use Depth API 0.1.1.
I don't have those specific problems, but I can't get any of the objects to render with the Meta Occlusion Standard shader.
And I had to "undelete" the OVRPlugin file to get some of the errors to disappear.
Played around a bit and got it working now after doing this:
1) Have my Quest connected to the PC with USB cable and do the "Build And Run" in Build Settings (might have to choose Oculus Quest 3 under "Run Device").
I then get prompted to "Enable experimental".
2) Then do "Build" from Build Settings.
3) Install the apk on my Quest (I use SideQuest for that).
Now it works at least with the occlusion!
I think I have to double check if I did everything right because I can't get the balls to bounce on doors for example, but they are at least occluded.
@@ZpeedTube I am still having the same issue: any objects rendered with the Meta Occlusion Standard shader are not rendered. I did "Build And Run" in Build Settings, but didn't get prompted to "Enable experimental".
Is there a way to get depth with AR Foundation?
Its been a while since I wait anxiously for new UA-cam series videos! Keep up the good work!!
This helps so much! I could not really figure this out on my own, and getting that depth api working for me would've taken hours, this video explains everything
If anyone has trouble with the passthrough hands displaying black, make sure you have a passthrough building block in scene. Also, it seems like you can only have one in the entire project, so have it Don'tDestroyOnLoad and persist through all scenes and be loaded in the first scene. If you want passthrough hands, but not passthrough skybox, just change the center camera's masking to skybox and NOT color.
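The "only one passthrough object per project" tip above could be enforced with a small persistence script. This is a hedged sketch (the class name and setup are my own, not from the video); attach it to the GameObject holding the passthrough building block in the first scene:

```csharp
using UnityEngine;

// Keeps a single passthrough setup alive across scene loads and
// destroys any duplicate that later scenes happen to contain.
public class PersistentPassthrough : MonoBehaviour
{
    static PersistentPassthrough instance;

    void Awake()
    {
        if (instance != null)
        {
            // A passthrough object already survived from an earlier
            // scene; discard this duplicate.
            Destroy(gameObject);
            return;
        }
        instance = this;
        DontDestroyOnLoad(gameObject);
    }
}
```

This is the standard Unity singleton-with-DontDestroyOnLoad pattern, so the same object (and its passthrough layer) persists through every scene load.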
Hoping you do a tutorial how to place objects like on the walls etc
No one with the same problem?
Good tutorial. You are making a long-term investment: right now your videos won't get many views, but when VR becomes more mainstream your channel will already be established.
I'm loving these tutorials. Maybe in another one you can explain how to make it possible to grab objects and interact with UI?
He already has a tutorial on that
Which video is that @@TheSirflanny ?
Hello! How can we make a hole on the wall like the game First encounters?
Hey, I covered this effect in two of my Patreon-exclusive tutorials. The technique uses a stencil renderer feature that allows you to see the 3D model through a mask.
For Anyone trying to figure out how to Make Objects Interactable/Grabbable... Here you go : ua-cam.com/video/ltrHheBIHRo/v-deo.html
Hello,
I've been quite intrigued by the implementation of MR in the Quest 3, particularly in the "First Encounters" project. In this project, the game environment is displayed after scanning a space. I have a specific question: How is it possible to create an effect where one wall is virtual, and when touched or shot with a gun in the game, it reveals a virtual wall? I'm very curious about the technical aspects of how this effect is achieved in the "First Encounters" project. I would appreciate any insights or explanations you can provide on this matter.
Thank you very much.
Hi, could you make a video about using open xr double tap in Action Properties , thanks in advance.
The first two parts of this MR tutorial series I can get to work, even though Oculus Integration is now deprecated, but this part fails me. I do everything within the first four minutes of this tutorial, but the cube disappears when I build and run it.
It's admirable to be making tutorials showing others how to build VR content; it seems like the big companies have largely given up on making VR games... I just ordered a Bigscreen Beyond VR headset for the primary purpose of playing flat 2D games converted to 3D through VorpX or Depth3D.
May I know whether PC VR can also access the background mesh and real-time depth scanning data when using Quest 3?
Can you track whether real objects move somehow, or get real-time collisions via raycasting and the depth sensor, so that when things like a door move, the collisions will still be correct?
"Megnifishant" had me laughing. Nice job expanding your vocabulary, though, and great tutorials.
Hi, I'm sure I didn't miss any step, but the material didn't work. I can see my hand when it is behind the cube QQ... I don't know how to solve this.
Thanks a bunch for your tutorials and hard work! Sooo helpful for getting started with building VR applications!
Tested the integration these last days; Meta has already decided to let the OVR Integration become obsolete, but it still works fine :)
Furthermore, is there a chance you could make a tutorial about triggers in real life? So that the Quest knows to trigger something in VR while looking (maybe with eye tracking) at something in real life through the AR option?
Hello! Amazing series! Can you make a video on how to use nav mesh agents in the Mixed reality?
Hey, so if I understand correctly: IF I use the depth sensor and the API from Meta, I can create meshes that form "another room" beyond a doorway, as long as the material has the occlusion shader? Thank you!
I can't add the Environment Depth scripts. I get the error: Library\PackageCache\com.meta.xr.depthapi@5e07b8e7a7\Runtime\Core\Scripts\EnvironmentDepthTextureProvider.cs(109,64): error CS0122: 'OVRPermissionsRequester' is inaccessible due to its protection level. Depth API 60. Everything works fine until the Depth API install.
Got the same problem. Did you find a solution?
Did you try implementing the Depth API using URP?
When I try, Unity tells me there are multiple errors in the shaders, and objects turn invisible when using them.
There is a different setup needed. ua-cam.com/video/mk6UYMaHZOo/v-deo.html
Awesome video! Are you going to make a video about the SceneManager? For me the demo with the furniture works, but when I try to use custom objects, it doesn't work.
Is it possible to automatically attach a 3D object to an environment element like a table/chair? Like PianoVision.
Anyone else experiencing a Unity crash after pressing Play in 50% of cases?
Amazing, man. Dilmer's videos are short but always full of bugs or stuff he doesn't talk about. What I like about your videos is that by the end I've gained a lot of new skills. Thank you!
I'm trying to work out how to create a window into a game world for a light-gun-inspired idea.
This is all awesome! You're a lifesaver :) Thank you!
Hi! More mixed reality video tutorials!
Great tutorial! Thank you so much!
Awesome man! Thanks for sharing it!
Awesome! Continue with the videos
Thank you for the video, and for your dedication to us in making such a mess in your home with all of those red balls to show us this tech.
It took so long to clean up 😀
Thanks this was perfect
Where is my name
Love the videos so far.
amazing
Awesome, but I can't get my GitHub link to work in Unity.
At the top of the page from the link in the description, click the green Code button at the top right, then click Download ZIP and extract it. Then in Unity, click the plus button again, but click "Add package from disk" and navigate to that extracted folder, open the Package folder, choose your version (I don't use URP), then double-click the package.json file and it should install. Good luck!
@@dustinjordan9054 You saved my PC from being thrown out the fcking window.