📢VERY IMPORTANT: Be sure to watch these videos in chronological order:
- To get the Xcode, visionOS platform, visionOS simulator tools setup watch this video 👉ua-cam.com/video/LeqVHfqRq_I/v-deo.html
- To get Unity PolySpatial packages setup including core visionOS platform watch this video 👉ua-cam.com/video/EtPaYKvzs6M/v-deo.html
If anyone hits the situation where editor input works but the simulator's doesn't:
Project Settings -> tab for Win, Mac, Linux -> check the "Run In Background" checkbox.
Everything actually works; the editor just stops playing when it loses focus.
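If you prefer doing this from code, the same checkbox can be toggled with a small editor script. This is a minimal sketch; the menu path "Tools/Enable Run In Background" is just an example name, and the script must live under an Editor/ folder in your Unity project:

```csharp
// Editor-only helper: same effect as Project Settings > Player >
// Resolution and Presentation > Run In Background, which keeps Play mode
// running when the Editor loses focus to the visionOS simulator.
using UnityEditor;
using UnityEngine;

public static class RunInBackgroundFix
{
    [MenuItem("Tools/Enable Run In Background")]
    public static void Enable()
    {
        PlayerSettings.runInBackground = true;
        Debug.Log("Run In Background is now enabled.");
    }
}
```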
These are great insights! Thanks for posting them!
Thanks for your post! It helped me a lot.
Fantastic tutorial! For anyone who wants to try this out, be careful: Play to Device currently doesn't support ARKit features, so hand tracking and world meshing are not available. To use those, you still need to build the Xcode project the old way.
That’s great feedback thank you. I also recommend using XR Simulation tools which provide meshing, planes, and hand tracking capabilities.
Thanks for the great video! I have a question: if I want to get information about the room in the simulator, like a table or a sofa, what do I do? I want to place my model on a table, or wherever I want it to be.
That's a great question. Unfortunately plane detection and meshing don't work with the simulator; this is something you can implement, but you would need a physical device to test it. Another option would be to use the Unity XR Simulation tools, which support planes and meshing.
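Once you are on a physical device (or using XR Simulation), the usual AR Foundation approach for placing a model on a table is to raycast against detected planes. A rough sketch, assuming an ARRaycastManager is present in the scene and using mouse/tap input as a stand-in for your actual interaction:

```csharp
// Sketch: place a prefab on a detected plane (e.g. a table surface).
// Requires AR Foundation with an ARRaycastManager in the scene; planes are
// only detected on device or with XR Simulation, not in the visionOS simulator.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class PlaceOnPlane : MonoBehaviour
{
    public ARRaycastManager raycastManager; // assign in the Inspector
    public GameObject modelPrefab;          // the model to place
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        // Stand-in trigger: a click/tap at the screen center.
        if (!Input.GetMouseButtonDown(0)) return;

        var screenCenter = new Vector2(Screen.width / 2f, Screen.height / 2f);
        if (raycastManager.Raycast(screenCenter, hits, TrackableType.PlaneWithinPolygon))
        {
            // hits[0] is the closest plane hit along the ray.
            var pose = hits[0].pose;
            Instantiate(modelPrefab, pose.position, pose.rotation);
        }
    }
}
```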
Hey! I have this running on the vision pro and it works with eye tracking but the pinch gestures are not recognized. Any tips as to how to fix this?
Awesome, try this:
- Go to File > Build Settings > Player Settings
- Then XR Plug-in Management > Apple visionOS
- Verify that "Hands Tracking Usage Description" is set
- Close the Play to Device app and re-open it on your physical device
- Then hit Play in Unity again
Let me know if that works; I just tested it in a test project and it is working with my setup.
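If the setting keeps getting lost, one way to catch it early is a post-build check on the generated Info.plist. This is only a sketch: the key name NSHandsTrackingUsageDescription is an assumption on my part, so verify it against the plist Unity actually generates for your project before relying on it:

```csharp
// Editor-only sanity check: warns after a build if the hands-tracking usage
// string is missing from the generated Info.plist.
#if UNITY_EDITOR
using System.IO;
using UnityEditor;
using UnityEditor.Callbacks;
using UnityEditor.iOS.Xcode; // requires the iOS build support module
using UnityEngine;

public static class HandsTrackingPlistCheck
{
    [PostProcessBuild]
    public static void OnPostprocessBuild(BuildTarget target, string pathToBuiltProject)
    {
        string plistPath = Path.Combine(pathToBuiltProject, "Info.plist");
        if (!File.Exists(plistPath)) return;

        var plist = new PlistDocument();
        plist.ReadFromFile(plistPath);

        // Assumed key name; confirm it in your generated plist.
        if (!plist.root.values.ContainsKey("NSHandsTrackingUsageDescription"))
            Debug.LogWarning("Hands Tracking Usage Description is not set; " +
                             "hand input will not work on device.");
    }
}
#endif
```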
@@dilmerv Thank you so much! It ended up working randomly anyway; I didn't really change anything, but these were very important steps! :)
What are the steps leading up to 18:31? It seems like the beginning steps got cut off. Right now my "XR Environment" menu is grayed out and I cannot click anything.
Hey, thanks for your question. Go to Player Settings and, under the standalone platform, enable XR Simulation. I am not close to my computer, but that should make the menu become enabled.
Thank you! @@dilmerv
Thanks for the great video! I have a question: can I use Play to Device with App Mode set to Virtual Reality?
Thank you for the great video. Can you make a tutorial on running the MRTK3 sample scenes on the simulator with hand and gaze interaction?
You are very welcome, I will look into it, thanks for your feedback!
Thanks for your great video, it helped me a lot! Can I open multiple bounded volumes in the Shared Space from one Unity app, like in Swift?
I don't believe that's possible. I think in Swift you can open one at a time but not multiple at a time. The sim can open multiple, but each volume is its own application.
Does the Play to Device feature work in unbounded scenes? When I run the sample unbounded scenes, they give an error about hand tracking. If I build it first and then run it on the Apple Vision Pro with Xcode, it works.
Honestly, the latest Play to Device version is super buggy; I had so many problems. When I recorded this video everything was working well, but it seems that newer updates from Unity introduced a lot of issues. I will check and let you know, but I did get the same error you are getting very often.
@@dilmerv Thanks, do you know which version I should go back to that was the least buggy?
I tried to install this version of PolySpatial, but it says it requires a Unity Pro account. I'm not ready to pay $2k to test it out.
Why are there no shadows in the visionOS simulator? In the Unity Editor you can see that the boxes cast a shadow, but it's missing in visionOS.
Great question! I am going to assume that it is to optimize the data transfer, but shadows are currently not showing in the sim.
@@dilmerv Thanks for replying. Btw, are you also going to make a video about creating full VR experiences for visionOS using Unity? I'm particularly interested in how smooth locomotion would work with just hand tracking.
I have had success using Play to Device on the simulator, but when I try it on the actual device, Unity just spins. I have had no success. Any tips?
I have been using their latest version and it looks like there are many issues with it. I get the same experience, and looking through the Unity forums it sounds like this is a well-known issue.
I would still need a Unity Pro license, right?
Yes, that’s correct Sahith. Good question!
Thanks for another great video!
Thanks Thomp I appreciate your feedback!
I am not able to figure out why the lighting is missing in the simulator and in the build as well. Could you let us know the fix?
That's something I experienced quite often. I fixed it by erasing all the content from the sim (there is an option in the sim toolbar to do so) and then creating a new build. Also, try selecting a scene that has daylight. Let me know your results, thanks.
Thanks for the video. I have one question related to HoloLens: I created a Unity application with MRTK, and I'm finding a memory leak issue. When I open the file picker, memory increases and is not released. The same thing happens when I open the scanner to scan (the scanner is in a separate scene and loaded as an additive scene): memory increases but is not released. Can you tell me if there is any way to handle this? Thank you.
Hey guys, I got it working up to here. However, I'm only able to control the simulator through the Unity Editor; I am not able to control anything from the simulator itself. Any fix for this?
Thanks for your comment, what version of Unity PolySpatial, Xcode, and visionOS are you currently using?
Thanks for the amazing video! Can you tell me how much time it took for you to build in Xcode? (I have an M1 MacBook Air and it took very long.)
Thanks for your feedback. I think it takes about 1-2 minutes to build just the first time; after that I just do a build/replace and it is faster, since Unity caches a lot from the previous build.
@@dilmerv Which Apple silicon Mac did you use?
First comment 😊
Sweet thanks for watching it!
🥽🎮❤️
Thank you 🙏