Hello, thanks for the awesome video! Well paced! But I am a bit confused about the hands situation: on how to combine the ray interactors with the hands we created previously. Or should we use the new Left hand-ray and duplicat for right hand? Thanks!
If you're using Velocity or Kinematic tracking, it takes a frame to apply the physics. Try switching to Instantaneous tracking to see if you get the same results. It could be a different issue.
What if I want to handle all trigger events, not just ones that interact with virtual objects (e.g. when I want to submit the current controller position to a script on trigger press)? So far I wasn't able to figure that one out with the XRI Toolkit.
You'd want to access the Input Actions directly. I'd recommend looking into some general Input Manager tutorials. They should point you in the right direction.
I'm having trouble with the basic interactions at 22:00; the hand is far from the face, and has inverted controls. I followed all of the steps exactly leading up to this point, and the only difference between this video and my project is the version: Yours is 2021.2.13f1 while mine is 2021.3.5f1. Do you know any solutions to this? Thank you.
The tracking can get a bit wonky if it's out of sync for a moment. It should auto-correct, but I'd restart your hardware first. Then, check that the prefab for the hand is being childed to the controller at a zero'd out position. So, if you run the project and select the hand, its position should be (0, 0, 0).
How are you supposed to rebind the teleport key? I've gone through the scripts and the settings numerous times, and I just can't find the place to rebind the teleport button. Who decided that the GRIP should be the default key for teleporting things? Moreover, why is the velocity tracking movement type so jittery? It's super springy, stuttery and it takes like half a button press for your controllers to actually grab them.
I do have a hands project that I may make a video on. I'm still kinda waiting on some of XR Toolkit's future features. But, I may just go ahead and do it.
Great video, Andrew! Quick question about the UI, I've been trying to use a canvas on "Screen Space - Overlay" mode, but it doesn't render on my HMD. Do you have any idea why this might happen? Do you know any workaround for this? I am using a Quest 2. Thanks!
Hey Mauricio! Of course, you'd want to use Screen Space - Camera. This is a blend of Overlay and World Space; just make sure you drop in the Camera from the Rig/Origin for the Canvas to render properly.
Amazing tutorial, they've been updating this a lot which is great and at the same time bothersome. Your last Offset Interactable (update) video is sadly out of date now due to deprecation of the BaseInteractor, I've been working on this for a few days and can't get my head around the new interactor. Would highly appreciate an update on the offset interactable objects :) Love your vids, keep up the great work
Thanks, Spicy! Yeah, I think I had released that video the same week the Attach Transform logic was changed. I'll double-check and see if I can get it working.
Hello, great videos so far! Looking for some help on this from you or the community, as I get 18 minutes in and my Index controllers are not being tracked properly. I have made sure: 1. XR Origin has the Input Action Manager component, and the asset set is the "XRI Default Input Actions" that came with the Starter Assets for the XR Interaction Toolkit. 2. The controllers have the XR Controller (Action-based) component, configured using the Starter Assets, as well as the Model Prefab being set. 3. The prefab doesn't have a collider, but the left/right controllers do. 4. The Player settings have "Active Input Handling" set to "Both". SteamVR is set as the OpenXR runtime to be used in play mode. The headset does track in game, and hands do work in SteamVR Home. The only difference to the video is that the XR Interaction Toolkit updated to 2.0.1. Any suggestions are much appreciated.
I had someone in my Discord have a similar issue. And, I even had them send their project for me to test and it worked fine on my end. So, there may be some weird issue going around. Do you know what version of SteamVR and Unity you're using?
@@VRwithAndrew Unity 2021.2.18f1 (the available 2021 version in Unity Hub), and SteamVR 1.21.12 (built 3/11/22). I can try the SteamVR beta release to see if that does the trick. At first I thought it was my project, as it's been a testing ground for a few things now, but I tested a fresh project with these steps and it also has the controllers not tracking. Not a direct comparison, but in another project (same versions of OpenXR and Unity) the demo Hurricane VR rig from that asset works without problems. I'll be throwing that asset into this project to see if it works or not, or if it can lead to some clues about the setup, as I know it has a nice setup GUI button. I'll circle back if I find a solution, thanks for the reply!
@@samnowak2445 I did not. I wasn't able to figure out why they are not being tracked. I decided to just go with VRIF (Virtual Reality Interaction Framework) instead, as I think it will be best to let someone else handle the framework updates when (not if) new hardware or VR/XR/AR-related API changes occur. It's half off currently with the sale, and well worth it considering how much it has to offer. If you want physics-related interactions that can't be faked with coded movement, HurricaneVR/HexaBody would be the best alternative. I haven't checked, but if you are dead set on getting a non-asset solution, try out Andrew's Discord as it may have a solution.
I had the same issue, and I found that I had Oculus selected in the XR Plug-in Management, so I switched it to OpenXR and it worked for me. Hope that helps.
Hey Andrew, thank you so much for this tutorial, it really helped me to understand much better how to set up interactables and locomotion. I've been searching for a way to snap turn without a fixed debounce time, so that every time I turn my joystick to the side it only turns once, regardless of how much time the joystick is tilted. But then, if I release the joystick to the center I want to be able to snap turn immediately. I have already turned to code to try to implement this using the Hold interaction on the Snap Turn action and checking the canceled phase but I am having trouble getting it to work. All apps I've used have snap turn implemented this way, am I complicating a simple thing or do all apps implement a custom snap turn action?
This is something I wasn't a huge fan of, either! I like being able to snap as quickly as I can move the joystick above/below the threshold. I looked into this briefly a while ago, and the issue comes from the Snap Turn Provider's built-in debounce timer. I don't think this was always the case, but you should be able to set the value to 0 in the Inspector.
The issue with this is that with a 0 in the debounce time it will continue turning as long as the joystick is above the threshold. I got the behavior I wanted by adding 'if (input.magnitude == 0f) m_TimeStarted = 0f;' to SnapTurnProviderBase.cs after 'var input = ReadInput();'. Had to move the package from Library/PackageCache to Packages, though...
Hi Andrew! Thank you so much for all of your helpful videos! I am currently working on a project using XRI Toolkit 2.0, and while all of the default functionality is working well, it is not recognizing my SteamVR tracker pucks (specifically the Tundra Tracker) as a tracked device. It used to work with Toolkit 1.0 and OpenVR. Have you tried getting it to work with SteamVR trackers, or do I need to make a new input action reference for trackers instead? Thanks!
Unfortunately, at least Vive Trackers are not supported to run in OpenXR. It's something a lot of people have asked about, and I've even asked Unity about it. I believe it's currently on the hardware manufacturer to get OpenXR working.
Hello Andrew, thank you for these! I have struggled with this problem for some days: how can I get the response of, for example, only the right-hand controller, so that when I press the trigger I switch to a new scene?
Hey, thanks for the vid, it helped a lot :D I do have a question: how do you make an interactor that can move and rotate objects with two hands? I've seen your vid about them, but it's a bit scarce on info about how to do it (it just shows us how to change their color really :/) And I've tried it myself, but the best I got is moving and rotating with only one hand. Any ideas? Thanks in advance :)
I'm guessing that it's still somewhat in the works, since it takes a few extra steps to get things like two-handed grabbables working. I'll be doing a video on that once it's appropriate. However, here's a brief rundown of how I've experimented with it: extend GrabInteractable and add the MultiSelect property, then extend the Interaction Manager and override the ResolveExistingSelect function so it just returns true. This will kinda get you started, but it may be best to wait.
Thanks for the response :) I got a weird result, as the cube just rotates weirdly (kinda randomly) when I try to use the second hand, and it doesn't affect the object if I try to move or rotate it. Maybe because I need to wait, no? Or should it work?
@@diogooliveira5461 I would wait. Even for this implementation, you'd need to manually call Pickup/Drop based on how many Interactors are trying to grab the Interactable.
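For anyone landing on this thread later: newer XRI releases (2.0+) expose a Select Mode on interactables, which may replace the MultiSelect/ResolveExistingSelect workaround described above. A minimal sketch under that assumption; the `TwoHandGrabbable` name is just illustrative and this is not Andrew's implementation:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch (assumes XRI 2.0+): XRBaseInteractable has a
// Select Mode that allows multiple interactors to select at once.
public class TwoHandGrabbable : XRGrabInteractable
{
    protected override void Awake()
    {
        base.Awake();
        // Allow a second hand to select without kicking the first one off.
        selectMode = InteractableSelectMode.Multiple;
    }

    protected override void OnSelectEntered(SelectEnterEventArgs args)
    {
        base.OnSelectEntered(args);
        Debug.Log($"Held by {interactorsSelecting.Count} interactor(s)");
    }
}
```

Even with Multiple select mode, you still have to decide how the object follows two attach points; blending position and rotation between both hands remains custom work, which matches Andrew's "it may be best to wait" advice.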
Hi, thanks a lot for the video. It helped me a lot. However, one question came up, is it possible to use a Rigid Body instead of a Character Controller?
Hi Andrew, thanks for this. I didn't update my VR project to the latest XR Toolkit release for a long while, and this video really helped. I have a question about grab interactables: I have an object I can pick up which I previously fit into the right or left hand with a different attach transform for each hand, which I controlled from code. With the current release that piece is broken; with default settings the object doesn't seem to use the attach transform at all, but there is an "attach point compatibility mode" which is marked as obsolete. Even with that it's not perfect (works only after the 1st grab). Do you have any insight about this? I imagine this must be a fairly common problem; perhaps my approach is wrong?
For anyone's info, I did find a way to do this, bypassing the attach transform idea: just modifying the position directly depending on which hand grabbed it.
TYSM, I figured a few things out before but got stuck thinking... Am I doing this right? Unity has so many settings and packages... I might learn C# and start working on my VR game this summer! Heard C# is a lot like Java.
You're welcome! Yeah, it's a bit confusing at first. A lot of what I know has come through trial-and-error and just exploring. It certainly takes time, but it's been a rewarding process.
Hello, I didn't really find a clear answer for this, but if you have multiple grab interactables stacked on top of each other and you want to highlight the one you are going to grab when hovering that object. How would you approach this? I didn't really find a variable that would clearly show what object the interactor plans on grabbing.
A newer version of XRI has object filtering, but I haven't explored it really. XRI primarily uses distance to find the nearest Interactable that the Interactor is hovering. You may be able to invoke this functionality manually on an Interactor to find the closest each time Hover Enter/Exit is triggered.
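A rough sketch of that idea, assuming XRI 2.x's `GetValidTargets`; the component name is hypothetical, and the "index 0 is the grab candidate" assumption should be verified against your XRI version:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical helper: ask an interactor which interactable it would pick,
// so the top object of a stack can be highlighted while hovering.
public class ClosestTargetReporter : MonoBehaviour
{
    [SerializeField] private XRBaseInteractor interactor;
    private readonly List<IXRInteractable> _targets = new List<IXRInteractable>();

    void Update()
    {
        // Fills the list with this interactor's current valid targets.
        interactor.GetValidTargets(_targets);
        if (_targets.Count > 0)
        {
            // XRI sorts valid targets (typically by distance),
            // so the first entry should be the grab candidate.
            Debug.Log($"Would grab: {_targets[0].transform.name}");
        }
    }
}
```

From there, highlighting is a matter of comparing `_targets[0]` against the previous frame's candidate and swapping materials on change.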
I have a question about the hands. When I start the project my hands just go flying everywhere and I do not know what the problem is. Is there anything I need to fix?
Check if they have rigidbodies or colliders on them; that can cause some weird behavior. Or, make sure they're zero'd out in position when attached to the controller. If they're offset at all in the prefab, it'll give some weird results as well.
Hey Andrew, AMAZING tutorial as always! I hope you can help me for a sec. I followed your tutorial and got everything working (with a Valve Index, Unity 2021.3.11f, latest SteamVR). I repeated the tutorial with my class's Valve Index (same model as mine at home, same Steam and Unity versions) and suddenly the grab interaction won't work. My controllers are tracking fine, and teleport with ray, continuous move, and turn are all working, but the grab is just not happening. Any idea why that is? Has that happened to you before?
So, it sounds like the input is working fine. You may want to check if you're getting any Warnings in the Console. The collider used for the Direct Interactor may not be a trigger, or the object you're trying to grab doesn't have a rigidbody, or possibly isn't the correct type of Interactable.
@@VRwithAndrew thanks for the quick answer! So the object to grab is from the XR preset (the grabbable cube) so I guess I need to check the Direct Interactor! Thank you so much!
Hi! This is probably a newbie question, but I'm looking into starting to develop VR projects and wondering if you can build for WebXR with the XR Toolkit. Is it the same, or do the toolkits work with each other?
Hey man! Just wanna let you know the XR Origin (formerly XR Rig) doesn't work properly with the XR Device Simulator. Is there any solution? Previously the XR Rig worked fine.
Thank you for this great tutorial. I have tried setting up an action-based rig and it works correctly, except when I added the OpenVR Loader for a Steam build; now nothing is working anymore. Is there a way to use an action-based XR rig with the OpenVR Loader?
@@VRwithAndrew Thank you, I found it and it's a great tutorial. The only reason I wanted to check the OpenVR Loader is because the SteamVR microtransaction window doesn't appear when using OpenXR, even though everything else, including the action-based rig, works correctly on the Steam build as long as OpenVR Loader is unchecked and OpenXR is checked. Is there a way to get the in-app purchase screen and Steam menu button to work with the app without having to check the OpenVR Loader?
@@VRwithAndrew Thank you, I appreciate it. I do have the files from SteamVR and I have Steamworks. But when I call the Steamworks microtransaction, the purchase window appears on the desktop rather than in VR if OpenXR is checked while the OpenVR Loader is not, and the native SteamVR menu doesn't open when clicking the menu button on the left controller. Everything else, including the XR Interaction Toolkit, works perfectly. That leads me to believe a simple tweak might fix this, and in that case I wouldn't have to do a hack to create a SteamVR build.
Hi, I am following the tutorial so far, but when I try testing it says the actions.json file was not found. I was wondering if it's a version problem. Would appreciate any help!
@@VRwithAndrew But also, another interesting thing that happens is I can't see the plane with the headset; I can only see the horizon, and it moves with my headset. Know what I mean? I tried adjusting the position of the plane, but it keeps reverting back for some reason.
@@riamatapurkar6844 Hmmm, that usually happens if something is wrong with your tracking. Or, there's a camera in the scene that is rendering to the screen rather than the headset. I'd try resetting the hardware, and making a new scene with the simple XR Rig.
I am currently doing the Unity Create with VR pathway, using a Valve Index; my problem is, that "Select" seems to be called the second I just brush against the grip sensor of the Index controllers, instead of using the grip force and a threshold like in any game I have seen so far. Is this standard behaviour for the OpenXR toolkit and if so, how do I change that?
@@VRwithAndrew The Input Manager seems to be disabled when using the XR Interaction Toolkit. I have been digging around for a few hours now, and it seems that the problem is with SteamVR itself. The problem is that the Select Action is called by gripPressed, which according to the Unity OpenXR documentation is defined by the device manufacturer. So I went to the button mapping screen of SteamVR while running my project, and I found out that they indeed mapped the "grip" value (which determines how much you wrapped your hand around the controller) to trigger "gripPressed" when it exceeds 25%.

You could make your own binding in SteamVR, but that would mean that every player of the game would need to do that themselves (since without the SteamVR plugin, you cannot ship your custom binding with the game), and it doesn't even seem possible to bind "gripPressed" to use "gripForce" (so that you actually need to put pressure on the controller). So I went to make a custom binding in the Input Action Asset (which Unity recommends anyway, and explicitly states NOT to use the standard "XR Controller") and used "gripForce" to trigger the Select Action. This somewhat worked but left me with a few other problems:
1. The standard haptic feedback from the SteamVR binding still triggers.
2. You cannot set different grab and release thresholds, which can lead to "grabbing jitter".
3. Grabbing stuff still doesn't feel right like it does natively with the SteamVR Plugin.

I am kinda starting to get frustrated to the point of giving up on OpenXR and just grinding along with the SteamVR Plugin, even though Valve stated that they shifted development to OpenXR and won't work on the SteamVR Plugin anymore (which I fear could lead to problems with current or upcoming Unity versions).
Quest 2 user - Not sure if this helps anyone: followed along in 2022, and this guide still works and is still relevant. I made crucial mistakes in the beginning. I was working in a project that had both the Oculus Integration packages and the XR Toolkit, and in XR Plug-in Management I clicked both OpenXR and Oculus, and I think I caused conflicts; I couldn't get the controllers to track or show any change in the transform in Scene view. I created a new project, clicked only the OpenXR stuff, imported only the XR Interaction Toolkit, and it's working a dream as per this guide.
It's very helpful. I only wish it dived into a little bit of coding so we could have gotten used to that, but in itself it's a very good and useful video. I especially appreciated the little break times; I wouldn't have taken them otherwise, and I felt they were conducive to staying focused and learning from your video.
Cheers!
Great tutorial! I can't believe how far the XR Toolkit has come since your first tutorials on it.
It's finally not a "Pre-Release" tagged Alpha
Now it's more like a late Beta
Thanks, Pico! Happy to see you're still around. :)
Invaluable tutorial here! Really easy to follow, and goes to the perfect degree of in-depth-ness(?). The stuff you cover here has me really excited about the future of XR development -- we're really getting somewhere!
Much appreciated, Aidan!
Great tutorial! Please don't stop . You have a great flow, easy to understand and humble approach. Subbed!
Much appreciated!
Most excellent. I've spent the last 3 days looking for up-to-date information on the XR Tool Kit and this was great. Thank you very much.
Happy to help, George!
Hi Andrew, Great video, as usual - thank you.
Though you can tell it was published on April 1st (Prima Aprilis / April Fool's Day).
Great overview of all major elements of the XR Interaction Toolkit.
In combination with all your previous series on this toolkit - this makes a great resource for the developers.
Much appreciated, Marcin! Ahaha, yeahhh, when I started uploading on the first of the month, I didn't consider April Fools' Day.
This is the VR tutorial you have been looking for.
That's the plan.
This is the clearest XR origin setup tutorial I have ever seen. Thank you so much!
You're quite welcome!
Thanks a lot Andrew, it's well paced and easy to comprehend.
Also thanks Andrew Bro, you've given the perfect gift idea for my bro
Thanks, Reho! It's been a while since I've done a traditional tutorial. So, glad you found it helpful.
Thanks !
Great! Easy to get it all because of your simple way of explaining. Keep it up. :)
Honestly By FAR my favourite tutorial so far. Amazing job keep it up my dude
Thanks, Jam!
I love how calm you explain. Thank you for this. Greetings from Colombia. :)
Oh thanks, it's just how I talk. :)
Amazing! Never knew about the presets for the XR Controller. You are a superstar!
Happy to help, Charlie!
excellent explanation, learned something new 🔥🔥🔥
Happy to help! 🔥
Hey Andrew! Thank you for this well-structured and easy-to-digest crash course. Been wanting to delve into VR dev for over a year now but I found it hard and intimidating.
Currently halfway through this course (really appreciate the breaks! writing this comment while taking a break), and I'm all excited about VR dev again!!
Yeah, it can certainly be a difficult path to get started on. But, happy to hear you're giving it a shot!
Thanks for the video Andrew. I believe the XR Interaction Toolkit is no longer in prerelease since March, so you should simply find it within the list of Unity Registry packages. I think the reason why you couldn't find it in your search is simply because you typed "xr toolkit" instead of "xr interaction toolkit".
I just double-checked this within version 2021.2.13 used for the video. Unfortunately, the Toolkit is not displayed even when explicitly searching for "XR Interaction Toolkit." But that's a good catch! I didn't realize what I had typed was invalid. So, good thing you pointed that out for when it gets added.
If you get an alert saying "Interaction layer 31 is not set to 'Teleport'", skip ahead to the teleportation section and do that first.
Such a great tutorial- patient, clear and wisely put, thanks so much :-)
Much appreciated, Eleanor!
Beverage time convinced me to subscribe
Everyone needs a good beverage time!
Good stuff! Definitely keen to hop back into VR and follow along with this.
Much appreciated, Jacques!
Very, very helpful, thanks a lot!
I've been hesitant about switching from the SteamVR SDK because I lose finger tracking for the Index, but it's about time I upgrade. Great video!
True! I'm hoping it eventually becomes available.
Subscribed specifically for beverage time
It's the best time
Thank you Andrew! This is really good!
Happy to help!
this is good stuff!!! Please continue!!!!
Much more to come!
Brilliant tutorial!! 👏
This was excellent
Thank you, Max
You. Are. A. Legend. Thank you very much!
Happy to help, Gamovore!
Would love to see you do a video or part of a video on adding a jump to the xr player using this new system!
Excellent crash course. thanks
You're welcome
I needed to download and install the Steam VR package to get my controllers to work.
I followed your instructions EXACTLY up to 12:30. I can't hook up my Oculus headset to my computer for testing because my computer is too weak. However, I can build and run the test. Instead of launching the app in VR like I expected, it opened the app as a window, which has never happened before.
Every time I press play, my hands are not attached; they just stay where they are and I can't see them. Does anyone know how to fix this, please?
Very nice thanks =)
You're welcome!
Great stuff!
Thanks, Proto!
So my Oculus hands are stuck at one specific point. I can point the lasers around, but I can't move the hands. Any ideas how to fix that? Thanx
Great tutorial, although I wish someone would cover how to capture inputs in code. I don't have any use for the built in actions other than tracking the controller position. The old system was much cleaner in my opinion. I could easily check for buttons or combinations of buttons and it worked across multiple devices. It was easy enough to check for CommonUsages.primaryButton and you could simply attach an event handler. This new system is super convoluted, despite how flexible it may be.
EDIT: For anyone interested, you can do the following to capture inputs in code...
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class ActionResponder : MonoBehaviour
{
    private ActionBasedController _controller;

    void Start()
    {
        // The comment box stripped the angle brackets; the generic type is needed here.
        _controller = GetComponent<ActionBasedController>();
    }

    void Update()
    {
        // Check whether the controller's Select action is currently held.
        bool isPressed = _controller.selectAction.action.IsPressed();
        if (isPressed)
        {
            // React to the press here.
        }
    }
}
I have videos on both the Input System and how it relates to VR Input, and on creating custom Input Actions and how to read their values.
Thank you!
You're welcome!
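For readers who want the action-reference route from those videos rather than going through the controller component, a minimal sketch assuming Unity's Input System package; the component name is hypothetical, and the action path is just an example from the XRI Default Input Actions asset:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical example: read any Input Action directly, without an interactor.
// Assign an action (e.g. the right hand's Activate/trigger action from the
// XRI Default Input Actions asset) to this reference in the Inspector.
public class TriggerReader : MonoBehaviour
{
    [SerializeField] private InputActionReference triggerAction;

    void OnEnable()
    {
        triggerAction.action.Enable();
        triggerAction.action.performed += OnTrigger;
    }

    void OnDisable()
    {
        triggerAction.action.performed -= OnTrigger;
    }

    private void OnTrigger(InputAction.CallbackContext ctx)
    {
        // For a trigger bound as an axis, this reads how far it is pulled.
        Debug.Log($"Trigger value: {ctx.ReadValue<float>()}");
    }
}
```

The event-driven `performed` callback avoids polling in `Update()`, which is one of the advantages of the new system the original comment found convoluted.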
What settings should you choose if you wanted to target the theoretical apple hmd which doesn’t support OpenXR? Thanks for the great video, always appreciate your work
I'd say stick with OpenXR; they'll definitely have support for OpenXR, or else they'll be missing out on a lot.
Either way, you'll see an update at launch or soon after with Apple's HMD supported.
We're still not fully sure if it'll be gaming- or productivity-focused, or how its XR implementation will work with tracking and such, but it's good you're already thinking about it.
@@StiekemeHenk Agree with Stiekeme on this one!
Hi Andrew, thanks for the great great video! By the way, in this tutorial, the interaction is only enabled by the controller, not the bare hands? What can I refer if I want to interact with the virtual objects by my bare hands in HTC Vive Focus3?
This is currently only for the controller, I know Unity is working on some built-in hand tracking support. But, I don't know what hardware it'll support.
How do I add animated hands for the Index? I don't want to use silly spheres.
I'm struggling to wrap my head around something here. For some reason they decided that, by default, teleport is hooked up to the "Select" action, which is grip. If I change this to, let's say, a primary button in the actions for 'Select', it works. But that also affects grabbing objects; both use the 'Select' action. Is there a simple way that I'm not grasping to assign _any_ custom action to, say, the teleportation? None of the components, as far as I can tell, have any way of customizing what function is hooked up to which action.
Do I have to dive into custom scripts here? Or could I use two different ray interactors, one for grabbing, one for teleporting? Then how would that be assigned if so..
When you feel you're so close, but yet so far away :p
Hi Andrew!
I'm using a Vive; the headset tracking is working well, but it can't track my hand grips. How can I solve this problem?
I'm quite sure my settings are the same as yours, thanks a lot!
It could be a number of things, make sure you have the Vive Controller Profile added, the Input Action Manager is set up in your scene, and the Actions are set up on each controller. If all that fails, restart SteamVR.
Hi, when I press play, the XR Origin setups that I do are deleted. What am I doing wrong?
very useful thank you
You're welcome
I'm having an issue where, when I go to the Grab Interactable and then Interactable Events, only 4 things pop up.
Nevermind, I was on the Socket Interactor.
I'm currently experiencing non-working controllers while using OpenXR + Unity 2020.2.31f1 + the newest SteamVR client, which came from beta and overrode a working one on the Steam Store. My controllers get stuck on the ground in every project where I use Valve hardware. Funny enough, Oculus Quest 2 controllers work for the SteamVR or OculusXR runtime.
Hey, I followed all the steps, but when I hit Play all I get is a message that says "Unable to start Oculus XR Plugin" and nothing works. Can anyone help?
When selecting the error in the Console does it show any additional info?
Great video, Thanks! Don't suppose you know if it is possible with the XR toolkit to get a pose for the lighthouses (TrackingReferences)? I need to display them but can't seem to get a pose for them to come through.
Hey Joe! Not with OpenXR, I believe that's on HTC to implement at the moment. There's some stuff in the Input Actions for Tracked Devices like Lighthouses. However, I don't know exactly what they're leftover from.
very very good
Many many thanks!
It won't show XR Origin (Action-based).
Hello, thanks for the awesome video! Well paced!
But I am a bit confused about the hands situation: how to combine the ray interactors with the hands we created previously. Or should we use the new left-hand ray and duplicate it for the right hand?
Thanks!
You can, but I'd first recommend looking into my video "Setting up Multiple Interactors for Unity XR".
HELP!
When I grab objects, they seem to have a delay when I move my hand. I have tried fixing it but found no solution.
If you're using Velocity or Kinematic tracking, it takes a frame to apply the physics. Try switching to Instantaneous tracking to see if you get the same results. It could be a different issue.
What if I want to handle all trigger events, not just ones that interact with virtual objects (e.g. when I want to submit the current controller position to a script on trigger press)? So far I wasn't able to figure that one out with the XRI Toolkit.
You'd want to access the Input Actions directly. I'd recommend looking into some general Input Manager tutorials. They should point you in the right direction.
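To give you an idea, here's a minimal sketch of reading an Input Action directly, independent of any Interactor or Interactable. It assumes Unity's Input System package; the component and field names here are placeholders, not part of the toolkit:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch: responds to a trigger action on its own,
// without going through the XRI interaction pipeline.
public class TriggerPoseReporter : MonoBehaviour
{
    // Assign an action bound to the controller trigger
    // (e.g. one from the XRI Default Input Actions asset).
    [SerializeField] private InputActionReference triggerAction;

    private void OnEnable()
    {
        triggerAction.action.Enable();
        triggerAction.action.performed += OnTriggerPerformed;
    }

    private void OnDisable()
    {
        triggerAction.action.performed -= OnTriggerPerformed;
    }

    private void OnTriggerPerformed(InputAction.CallbackContext context)
    {
        // Assumes this component sits on the controller GameObject,
        // so transform.position is the controller's current position.
        Debug.Log($"Trigger pressed at {transform.position}");
    }
}
```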
thanks
you're welcome
I'm having trouble with the basic interactions at 22:00; the hand is far from the face, and has inverted controls. I followed all of the steps exactly leading up to this point, and the only difference between this video and my project is the version: Yours is 2021.2.13f1 while mine is 2021.3.5f1. Do you know any solutions to this? Thank you.
The tracking can get a bit wonky if it's out of sync for a moment. It should auto-correct, but I'd restart your hardware first. Then, check to see that the prefab for the hand is being parented to the controller at a zeroed-out position. So, if you run the project and select the hand, its position should be (0, 0, 0).
@@VRwithAndrew That completely fixed it! Thanks for the quick reply. This was a great tutorial as well, thank you for your help!
Everything was going great until 14:21. There's no Samples folder in mine, and no left/right presets show up when I hit those sliders.
Did you Download and Import the samples from the Package Manager?
Nicee thanks :)
You're welcome :)
Lets goooooooooo!
GOOO!!!
Can you please help me learn this and teach me how to make a character jump? I'll pay, I'm so lost and it makes me so sad
How are you supposed to rebind the teleport key? I've gone through the scripts and the settings numerous times, and I just can't find the place to rebind the teleport button. Who decided that the GRIP should be the default key for teleporting things? Moreover, why is the velocity tracking movement type so jittery? It's super springy, stuttery and it takes like half a button press for your controllers to actually grab them.
Hey, did you find out if there is a way to use another input for teleport? Grip-to-teleport is not the nicest binding :-P
This was super helpful thank you so much. Any plans for setting up Hands/Hand tracking in XR TK 2.0?
I do have a hands project that I may make a video on. I'm still kinda waiting on some of XR Toolkit's future features. But, I may just go ahead and do it.
@@VRwithAndrew very cool that would be amazing ❤️
Great video, Andrew! Quick question about the UI, I've been trying to use a canvas on "Screen Space - Overlay" mode, but it doesn't render on my HMD. Do you have any idea why this might happen? Do you know any workaround for this? I am using a Quest 2. Thanks!
Hey Mauricio! Of course, you'd want to use Screen Space - Camera. This is a blend of Overlay and World Space; just make sure you drop in the Camera from the Rig/Origin for the Canvas to render properly.
Amazing tutorial, they've been updating this a lot which is great and at the same time bothersome.
Your last Offset Interactable (update) video is sadly out of date now due to deprecation of the BaseInteractor, I've been working on this for a few days and can't get my head around the new interactor.
Would highly appreciate an update on the offset interactable objects :)
Love your vids, keep up the great work
Thanks, Spicy! Yeah, I think I had released that video the same week the Attach Transform logic was changed. I'll double-check and see if I can get it working.
@@VRwithAndrew Cheers!
@@thatkarairamen This worked for me in the most recent version.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class OffsetInteractable : XRGrabInteractable
{
    protected override void OnSelectEntering(SelectEnterEventArgs args)
    {
        base.OnSelectEntering(args);
        MatchAttachmentPoints(args.interactorObject);
    }

    protected void MatchAttachmentPoints(IXRInteractor interactor)
    {
        // Only adjust the attach point for the first interactor grabbing this object.
        if (IsFirstSelecting(interactor))
        {
            bool isDirect = interactor is XRDirectInteractor;
            attachTransform.position = isDirect ? interactor.GetAttachTransform(this).position : transform.position;
            attachTransform.rotation = isDirect ? interactor.GetAttachTransform(this).rotation : transform.rotation;
        }
    }

    private bool IsFirstSelecting(IXRInteractor interactor)
    {
        return interactor == firstInteractorSelecting;
    }
}
@@VRwithAndrew Thanks a mil!
Hello,
Great videos so far! Looking for some help on this from you or the community, as I'm 18 minutes in and my Index controllers are not being tracked properly. I have made sure: 1. XR Origin has the Input Action Manager component, and the asset set is the "XRI Default Input Actions" that came with the Starter Assets for the XR Interaction Toolkit. 2. The controllers have the XR Controller (Action-based) component, configured using the Starter Assets, with the model prefab set. 3. The prefab doesn't have a collider, but the left/right controllers do. 4. The Player settings have "Active Input Handling" set to "Both". SteamVR is set as the Play Mode OpenXR runtime to be used. The headset does track in game, and the hands do work in SteamVR Home. The only difference from the video is that the XR Interaction Toolkit updated to 2.0.1. Any suggestions are much appreciated.
I had someone in my Discord have a similar issue. And, I even had them send their project for me to test and it worked fine on my end. So, there may be some weird issue going around. Do you know what version of SteamVR and Unity you're using?
@@VRwithAndrew Unity 2021.2.18f1 (the available 2021 version in Unity Hub), and SteamVR 1.21.12 (built 3/11/22). I can try the SteamVR beta release to see if that does the trick. At first I thought it was my project, as it's been a testing ground for a few things now, but I tested a fresh project with these steps and its controllers also aren't tracking. Not a direct comparison, but in another project (same versions of OpenXR and Unity) the demo Hurricane VR rig from that asset works without problems. I'll be throwing that asset into this project to see whether it works, or whether it can lead to some clues about the setup, as I know it has a nice setup GUI button. I'll circle back if I find a solution, thanks for the reply!
Hey @@stonegoats4537 did you find a solution to this? I am experiencing the same issue.
@@samnowak2445 I did not. I wasn't able to figure out why they are not being tracked. I decided to just go with VRIF (Virtual Reality Interaction Framework) instead, as I think it's best to let someone else handle the framework updates when (not if) new hardware or VR/XR/AR-related API changes occur. It's half off currently with the sale, and so worth it considering how much it has to offer. If you want physics-based interactions that can't be faked with coded movement, HurricaneVR/HexaBody would be the best alternative. I haven't checked, but if you are dead set on a non-asset solution, try out Andrew's Discord, as it may have a solution.
I had the same issue, and I found that I had Oculus selected in the XR Plug-in Management, so I switched it to OpenXR and it worked for me. Hope that helps.
Hey Andrew, thank you so much for this tutorial, it really helped me to understand much better how to set up interactables and locomotion.
I've been searching for a way to snap turn without a fixed debounce time, so that every time I turn my joystick to the side it only turns once, regardless of how much time the joystick is tilted. But then, if I release the joystick to the center I want to be able to snap turn immediately. I have already turned to code to try to implement this using the Hold interaction on the Snap Turn action and checking the canceled phase but I am having trouble getting it to work.
All apps I've used have snap turn implemented this way, am I complicating a simple thing or do all apps implement a custom snap turn action?
This is something I wasn't a huge fan of, either! I like being able to snap as quickly as I can move the joystick above/below the threshold.
I looked into this briefly a while ago, and the issue comes from the Snap Turn Provider's built-in debounce timer.
I don't think this was always the case, but you should be able to set the value to 0 in the Inspector.
The issue with this is that with a 0 debounce time it will continue turning as long as the joystick is above the threshold. I got the behavior I wanted by adding 'if(input.magnitude == 0f) m_TimeStarted = 0f;' to SnapTurnProviderBase.cs after 'var input = ReadInput();'. Had to move the package from Library/PackageCache to Packages, though...
Hi Andrew! Thank you so much for all of your helpful videos! I am currently working on a project using XRI Toolkit 2.0, and while all of the default functionality is working well, it is not recognizing my SteamVR tracker pucks (specifically the Tundra Tracker) as a tracked device. It used to work with Toolkit 1.0 and OpenVR. Have you tried getting it to work with SteamVR trackers, or do I need to make a new input action reference for trackers instead? Thanks!
Unfortunately, at least Vive Trackers are not supported to run in OpenXR. It's something a lot of people have asked about, and I've even asked Unity about it. I believe it's currently on the hardware manufacturer to get OpenXR working.
Hello Andrew, thank you for these! I have struggled with this problem for some days: how can I get, for example, a response from only the right-hand controller, so that when I press the trigger I switch to a new scene?
I'd probably create a new action, then use the input events from it to call the SceneManager functionality.
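Roughly along these lines — a sketch, assuming you've set up an action bound only to the right-hand trigger (the action reference and scene name below are placeholders):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.SceneManagement;

// Hypothetical sketch: loads a new scene when a right-hand trigger action fires.
public class TriggerSceneSwitcher : MonoBehaviour
{
    // Bind this to a right-hand-only trigger action in your Input Action asset.
    [SerializeField] private InputActionReference rightTriggerAction;
    [SerializeField] private string sceneName = "NextScene"; // placeholder scene name

    private void OnEnable()
    {
        rightTriggerAction.action.Enable();
        rightTriggerAction.action.performed += OnTriggerPerformed;
    }

    private void OnDisable()
    {
        rightTriggerAction.action.performed -= OnTriggerPerformed;
    }

    private void OnTriggerPerformed(InputAction.CallbackContext context)
    {
        // The target scene must be added to Build Settings.
        SceneManager.LoadScene(sceneName);
    }
}
```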
@@VRwithAndrew Thank you! I actually managed to do this! I will share my method later; it's a little different from your way!
Can Passthrough be used here?
Does anyone know which asset those blue hands come from?
You'll find the projects used in the Description under Resources. The hands are from the Escape Room project.
Hey thanks for the vid, it helped a lot :D
I do have a question: how do you make an interactor that can move and rotate objects with two hands?
I've seen your vid about them, but it's a bit scarce on info on how to do it (it just shows us how to change their color really :/). I've tried it myself, but the best I got is moving and rotating with only one hand. Any ideas?
Thanks in advance :)
I'm guessing that it's still somewhat in the works, since it takes a few extra steps to get things like two-handed grabbables working. I'll be doing a video on that once it's appropriate.
However, here's a brief rundown of how I've experimented with it: extend GrabInteractable and add the MultiSelect property, then extend the Interaction Manager and override the ResolveExistingSelect function so it just returns true.
This will kinda get you started, but it may be best to wait.
Thanks for the response :)
I got a weird result: the cube just rotates weirdly (kind of randomly) when I try to use the second hand, and it doesn't affect the object if I try to move or rotate it.
Maybe because I need to wait, no? Or should it work?
@@diogooliveira5461 I would wait. Even for this implementation, you'd need to manually call Pickup/Drop based on how many Interactors are trying to grab the Interactable.
@@VRwithAndrew Yeah i will wait for now :)
Getting intense with Andrew...
Hi, thanks a lot for the video. It helped me a lot. However, one question came up, is it possible to use a Rigid Body instead of a Character Controller?
Yep! You'd just have to build a lot of it yourself though.
Hi Andrew, thanks for this. I didn't update my VR project to the latest XR toolkit release for a long while and this video really helped.
I have a question about grab interactables, I have a object I can pick up which I previously fit into right or left hand with a different attach transform for each hand which I controlled from code. With the current release that piece is broken, with default settings the object doesn't seem to use the attach transform at all, but there is an "attach point compatibility mode" which is marked as obsolete. Even with that it's not perfect (works only after 1st grab). Do you have any insight about this, I imagine this must be a fairly common problem, perhaps my approach is wrong?
For anyone's info, I did find a way to do this: bypassing the attach transform idea and just modifying the position directly depending on which hand grabbed it.
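Something like this rough sketch, in case it helps anyone (not necessarily the exact code I used — the per-hand offsets and the name-based handedness check are assumptions you'd adapt to your own rig):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: repositions the grabbed object per hand on grab,
// instead of relying on XRI attach transforms.
public class HandedGrabOffset : XRGrabInteractable
{
    // Assumed per-hand offsets, expressed in the hand's local space.
    [SerializeField] private Vector3 leftHandLocalOffset;
    [SerializeField] private Vector3 rightHandLocalOffset;

    protected override void OnSelectEntered(SelectEnterEventArgs args)
    {
        base.OnSelectEntered(args);

        Transform hand = args.interactorObject.GetAttachTransform(this);

        // Crude handedness check by object name; a real project would
        // use a dedicated marker component instead.
        bool isLeft = hand.name.Contains("Left") || hand.root.name.Contains("Left");
        Vector3 offset = isLeft ? leftHandLocalOffset : rightHandLocalOffset;

        // Place the object relative to whichever hand grabbed it.
        transform.SetPositionAndRotation(hand.TransformPoint(offset), hand.rotation);
    }
}
```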
TYSM, I figured a few things out before but got stuck thinking... Am I doing this right?
Unity has so many settings and packages...
I might learn C# and start working on my VR game this summer! Heard C# is a lot like Java.
Especially the socket is a good one to understand 😂 sheesh this makes it so much easier!
No way, smooth locomotion!
You're welcome! Yeah, it's a bit confusing at first. A lot of what I know has come through trial-and-error and just exploring. It certainly takes time, but it's been a rewarding process.
Hello, I didn't really find a clear answer for this, but if you have multiple grab interactables stacked on top of each other and you want to highlight the one you are going to grab when hovering that object. How would you approach this? I didn't really find a variable that would clearly show what object the interactor plans on grabbing.
A newer version of XRI has object filtering, but I haven't explored it really. XRI primarily uses distance to find the nearest Interactable that the Interactor is hovering. You may be able to invoke this functionality manually on an Interactor to find the closest each time Hover Enter/Exit is triggered.
I have a question about the hands. When I start the project my hands just go flying everywhere and I do not know what the problem is. Is there anything I need to fix?
Check if they have rigidbodies or colliders on them; that can cause some weird behavior. Or, make sure they're zeroed out in position when attached to the controller. If they're offset at all in the prefab, it'll give some weird results as well.
@@VRwithAndrew Unfortunately it still does not work. Everything else works fine but the tracking of the controllers does not seem to be working.
What is the point of making your own controllers as opposed to just using the ones OpenXR comes with?
I'm not sure what you mean. OpenXR doesn't include controllers.
I heard versions as Virgins and died laughing 0:28
Ahaha, glad that I could give you a laugh! Even if it was by accident.
hey Andrew, AMAZING tutorial as always! I hope you can help me for a sec.. I followed your tutorial and got everything working (with Valve Index, Unity 2021.3.11f, latest SteamVR).
I repeated the tutorial with my class's Valve Index (same model as mine at home, same Steam and Unity versions), and suddenly the grab interaction won't work. My controllers are tracking fine, and teleport with ray, continuous move, and turn are all working, but the grab is just not happening. Any idea why that is? Has that happened to you before?
So, it sounds like the input is working fine. You may want to check if you're getting any warnings in the Console. The collider used for the Direct Interactor may not be a trigger, the object you're trying to grab may not have a rigidbody, or it may not have the correct type of Interactable.
@@VRwithAndrew thanks for the quick answer! So the object to grab is from the XR preset (the grabbable cube) so I guess I need to check the Direct Interactor! Thank you so much!
Hi! This is probably a newbie question, but I'm looking into starting to develop VR projects and wondering if you can build for WebXR with the XR toolkit. Is it the same, or do the toolkits work with each other?
If you have a method for building for WebXR, it's a possibility. But, I can't say for sure.
@@VRwithAndrew thank you for the answer!
Hey man! Just wanted to let you know XR Origin (formerly XR Rig) doesn't work properly with the XR Device Simulator. Is there any solution? Previously, XR Rig worked fine.
Honestly, I've largely stayed away from the Simulator since it was first released, as it still needs a lot of work.
Thank you for this great tutorial. I have tried setting up the action-based rig and it works correctly, except when I added the OpenVR Loader for a Steam build, nothing works anymore. Is there a way to use the action-based XR rig with the OpenVR Loader?
Yeah, you'd just need to write a custom controller. If you look up SteamVR Inputs for XR Toolkit, I have a video on how to do it.
@@VRwithAndrew Thank you, I found it and it's a great tutorial.
The only reason I wanted to check OpenVR Loader is that the SteamVR microtransaction window doesn't appear when using OpenXR, even though everything else, including the action-based rig, works correctly on the Steam build as long as OpenVR Loader is unchecked and OpenXR is checked. Is there a way to get the in-app purchase screen and the Steam menu button to work without having to check OpenVR Loader?
@@Faris-1900 I'm not sure! I feel like it would be possible to manually invoke that stuff if you have the files from SteamVR.
@@VRwithAndrew Thank you, I appreciate it. I do have the files from SteamVR and I have Steamworks. But when I call the Steamworks microtransaction, the purchase window appears on the desktop rather than in VR if OpenXR is checked while OpenVR Loader is not, and the native SteamVR menu doesn't open when clicking the menu button on the left controller. Everything else, including the XR Interaction Toolkit, works perfectly. That leads me to believe a simple tweak might fix this, and in that case I wouldn't have to do a hack to create a SteamVR build.
Hi, I am following the tutorial, but when I try testing it says the actions.json file was not found. I was wondering if it's a version problem. Would appreciate any help!
Hmmm, I'd try re-importing the Sample Files that contain the actions and try re-applying them.
@@VRwithAndrew hey thanks for your reply! I'll try that and check. The tutorial is awesome so far :D
@@riamatapurkar6844 Thank you, Ria! Happy to help!
@@VRwithAndrew But also, another interesting thing that happens is I can't see the plane with the headset; I can only see the horizon, and it moves with my headset. Know what I mean? I tried adjusting the position of the plane, but it keeps reverting back for some reason.
@@riamatapurkar6844 Hmmm, that usually happens if something is wrong with your tracking. Or, there's a camera in the scene that is rendering to the screen rather than the headset. I'd try resetting the hardware, and making a new scene with the simple XR Rig.
Will XR work on a WebGL build?
Not out of the box, you'd need some additional witchcraft from the internet.
I am currently doing the Unity Create with VR pathway, using a Valve Index. My problem is that "Select" seems to be triggered the second I just brush against the grip sensor of the Index controllers, instead of using the grip force and a threshold like in any game I have seen so far.
Is this standard behaviour for the OpenXR toolkit, and if so, how do I change it?
If you go into the Input Manager, there should be a Press Threshold for that Input Action. That may solve your problem.
@@VRwithAndrew The Input Manager seems to be disabled when using the XR Interaction Toolkit.
I have been digging around for a few hours now, and it seems the problem is with SteamVR itself. The Select action is triggered by gripPressed, which, according to the Unity OpenXR documentation, is defined by the device manufacturer.
So I went to the button-mapping screen of SteamVR while running my project, and I found out that they indeed mapped the "grip" value (which measures how much you've wrapped your hand around the controller) to trigger "gripPressed" when it exceeds 25%.
You could make your own binding in SteamVR, but that would mean every player of the game would need to do that themselves (since, without the SteamVR plugin, you cannot ship your custom binding with the game), and it doesn't even seem possible to bind "gripPressed" to use "gripForce" (so that you actually need to put pressure on the controller).
So I went to make a custom binding in the Input Action Asset (which Unity recommends anyway, and explicitly states NOT to use the standard "XR Controller") and used "gripForce" to trigger the Select action. This somewhat worked but left me with a few other problems:
1. The standard haptic feedback from the SteamVR binding is still triggering
2. You cannot set different grab and release thresholds, which can lead to "grabbing jitter"
3. Grabbing stuff still doesn't feel right like it does natively with the SteamVR Plugin
I am kinda starting to get frustrated to the point of giving up on OpenXR and just grinding along with the SteamVR Plugin, even though Valve stated that they've shifted development to OpenXR and won't work on the SteamVR Plugin anymore (which I fear could lead to problems with current or upcoming Unity versions).
Apparently they split the hand control Default Input Actions out into three different maps for each hand - imgur.com/gallery/1n5Q9NR
Ah! I haven't seen that yet.