Dude.... Your beard is so majestic and amazing
Ahaha, thanks homie, I grow it myself.
This is really great to see, super helpful to be taken through the input actions. Also love the little popup video of yourself in the static moments, great idea! Keep them coming, Thank you!
Much appreciated, thank you!
Every time I hit a snag, Andrew releases some amazing video and I'm able to learn so much. Thank you for doing this.
Great to hear!
you are the only person who has actually helped me understand how to use the new input system. thanks!
You're very welcome!
Really underrated video, super clear explaining! Great job!
Much appreciated!
Incredibly helpful, thanks Andrew!
THANK YOU.
THANK YOU THANK YOU THANK YOU!!!!
I've never used Unity before, but I'm using it to prototype an idea I have for my VR headset. This was exactly what I wanted to do (run scripts based on inputs), and I finally found your video that explained exactly that.
*Well, not /exactly/, but 95% is close enough! I'm using the sticks primarily, and float2 is a new datatype for me. (I come from an embedded real-time hardware background.)
The documentation is unclear because it often assumes background knowledge from several other bodies of documentation, and I couldn't hold all that in my head. I'm struggling to understand the necessary link between dropdowns in the Unity Editor and getting code to run... and then actually writing the code to do what I want!
This video (and the two you referenced at the beginning): Super helpful. Thank you so much!
Don't worry, the Input System documentation goes over my head at times as well.
Thank you sooo much! I was looking for this for many months.
Happy to help, Abdul!
This was really helpful for our university project, thanks a lot man
Happy to help, Benjamin!
What a nice tutorial video! Really detailed but easy to understand!
And the tempo and voice really put me at ease while learning, so I don't get annoyed.
I've got more to learn. Hope to see more videos like this.
Thank you, Andrew.
You're very welcome Ryan, thanks for the kind words. :)
Thank you so much. I've been tearing my hair out for a while trying to do this and now it's working.
Happy to help! I don't want you losing all your hair and looking like me.
This video is awesome. Super simple to get to grips with. Shoutout to you my friend :)
Happy to help!
Thanks so much for this video!! Quite difficult to find videos on this right now.
Happy to help, Victor!
thanks! I used this to give the player footstep sounds! very useful
Happy to help!
What a king! 👑 Thanks Andrew.
You're welcome, thanks for the crown!
How topical! I'm just writing a research paper on accessibility in VR and was doing this earlier. Looking forward to this, thanks for sharing.
Glad it was helpful!
@@VRwithAndrew All of your work is. I get through a video a day to see what I've missed. Keep it up, dude, you're great.
THANK YOU Sooo much!! It helps me a lot👍Your explanation is detailed, so even beginners can easily follow it!
Happy to help, Jihyeon!
Thanks mate, I finally got it working. It is so much more complicated than the previous system, hehe.
You'll get the hang of it! They do need to put a bit more effort into phasing out some of the inputs. It takes a bit too much trial and error at the moment.
Thanks, Andrew, this is beneficial information.
Glad you think so!
Your video really helps me a lot. All your tutorial videos are awesome!!!
Really, thank you 💖
Happy to help!
I HATE U!!! THIS IS AMAZING!!! I wanted this months ago, and you're the first person to teach me this 🎉🎉🎉🎉🎉
Love you too, Jose :)
This is exactly what I was looking for, thank you so much
You're very welcome!
you are a saint and a scholar
I do what I can
Thanks for your help !
Thanks Andrew!!!
You're very welcome!
Nice! I really want to figure out how I can use a custom pose with XR Hands to trigger a custom input action for things like this.
Great tutorial as usual!
Much appreciated, Tim!
Golden Content!
Much appreciated!
Thank you so much for this.
Happy to help!
Great Tutorial!
Glad you think so!
I love you so much you are my saviour !
Happy to help, Victor!
I followed all the steps for the primary button input action, but nothing is triggering (print statement inside Toggle isn't going off). Are there any additional settings that I should be looking out for?
Things I've done so far:
- Set input handling to New and Both in the Player Settings.
- Added all controller profiles in the XR Plug-in Management settings.
- Added action.Enable() and action.Disable() in OnEnable() & OnDisable(), respectively.
Double-check in the OpenXR settings that you have the right controller profiles added.
@@VRwithAndrew Yes, they’re all there.
I doubt this is still relevant to you, but I just had the same issue. The fix was to change the button binding path (03:37) from "primaryButton" to "menu". I also added both left and right xr controller bindings to the same action.
Great help, thank you!
You're quite welcome!
Hello, Andrew! A few questions:
1. Do you use OpenXR for the Vive controllers in this video? I've tried the HTC Vive with OpenXR, and it was nothing but a bunch of errors at launch in the Unity console. Oculus and Samsung HMDs were working fine with the native Unity plugins/checkboxes.
2. So what you've shown lets us use any button on our specific controller for our VR purposes. Is this your workaround for the pretty limited number of buttons on XR Toolkit controllers? If I remember correctly, the XR Toolkit controller only has eyes, axis, grip, trigger, UI press, and haptics by default. If we want full functionality for more buttons, would you suggest inheriting from the XR controller class and adding new button actions? Or is it better not to bother with it and just subscribe to action events?
I just used the Index controllers for this video. But I've mapped Vive/Index controllers for input in a single project.
It depends on the context; the method I show here is fine if you're just looking for some basic functionality. If you're trying to add an additional Activate or something to an Interactable, that would need a custom controller. I've done it for another project, but it's not super intuitive.
Very good tutorial video.
I'm glad you managed to finish it.
I'm starting to like this new input system. I have a setup where I mock the VR controller so that I don't have to put on the headset for every simple change. With this input system, I can just map the already existing Right/Left controller actions (for example, Activate) to additional keyboard inputs without writing any additional "VR mock" code.
Now I only have to figure out how to map the stick axis to 4 buttons (Up, Right, Down, Left). I could do it in code and just check the axis position, but I assume there is already a way to do this in the input system editor, to handle it as a proper button instead of an axis vector.
Yeah, you can use Sectors on a Vector2. I couldn't find a way of doing the same with a touchpad, so I ended up making my own thing.
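For anyone who wants the code-based fallback mentioned above, here's a minimal sketch that polls the stick and treats each direction as a button once it passes a threshold. It assumes an InputActionReference bound to a Value > Vector2 stick action and that the action is enabled elsewhere (e.g., by the Input Action Manager); the field name and threshold are hypothetical.

using UnityEngine;
using UnityEngine.InputSystem;

public class StickToButtons : MonoBehaviour
{
    public InputActionReference stickAction; // Value > Vector2 action bound to the thumbstick
    [Range(0f, 1f)] public float threshold = 0.5f;

    private void Update()
    {
        Vector2 axis = stickAction.action.ReadValue<Vector2>();

        // Treat each axis direction as a button once it passes the threshold.
        if (axis.y > threshold) Debug.Log("Up");
        else if (axis.y < -threshold) Debug.Log("Down");

        if (axis.x > threshold) Debug.Log("Right");
        else if (axis.x < -threshold) Debug.Log("Left");
    }
}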
Thanks for this useful tutorial!!!
Happy to help!
Great video, clear and well paced. One thing I'm unclear on: if you have multiple Action Maps defined and they have overlapping Actions, what happens? Does the first one fire but not the rest, or do they all fire? If all, is there a sequence to the executions? If you have time to answer, thanks.
They would all fire. I've sometimes used inputs that are specifically tied to hand/finger animations. I'm not sure about the sequence of execution, though.
Heya, I've just watched through the whole video and it's cool to see how powerful this can be when it's applied correctly.
I want to ask: you seem to be using Index controllers, so do you know of any way to get the finger-tracking data with the Index?
I can't figure out a way to have VR hands with the correct hand positions with OpenXR. Ideally, in the future, it would be cool to be able to bind certain things to certain hand gestures.
Yeah, unfortunately, that's only available through SteamVR/OpenVR at the moment. That's as of my most recent knowledge. I do hope it gets added in the future.
Does a value *have* to change for it to run "performed"? I mean, even though a pressed button has a value of 1 the whole time, as long as it is 1, isn't it still being pressed, and thus performed? I would have expected it to be called "changed" if they meant it that way.
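For reference, the Input System splits an action's lifecycle into started, performed, and canceled phases, so "performed" generally marks an interaction completing (e.g., a press threshold being crossed) rather than firing continuously while the value stays at 1. A minimal sketch for logging the phases, assuming an InputActionReference field (name hypothetical) and that the action is enabled elsewhere:

using UnityEngine;
using UnityEngine.InputSystem;

public class PhaseLogger : MonoBehaviour
{
    public InputActionReference button;

    private void OnEnable()
    {
        button.action.started += OnStarted;
        button.action.performed += OnPerformed;
        button.action.canceled += OnCanceled;
    }

    private void OnDisable()
    {
        button.action.started -= OnStarted;
        button.action.performed -= OnPerformed;
        button.action.canceled -= OnCanceled;
    }

    private void OnStarted(InputAction.CallbackContext ctx) => Debug.Log("started: press began");
    private void OnPerformed(InputAction.CallbackContext ctx) => Debug.Log("performed: interaction completed");
    private void OnCanceled(InputAction.CallbackContext ctx) => Debug.Log("canceled: released");
}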
Hey, I'm looking to hold down a button on my controller (maybe the grip or index button) to drag around a plane object, which stays where it is when I let go... like dragging a window around in Windows.
Do you have any recommendations for that? Thanks
You can use a Grab Interactable with the gravity turned off, and the throw functionality. You should be able to press and hold to grab/drop the object.
Thank you so much !
Happy to help!
Each controller has an Action Map named XRI LeftHand and XRI RightHand. Each Action Map has a list of Actions, like Select, Move or Activate.
Do you know how to get the Move of one of these controllers? Can you point me in the right direction with docs or something, please?
I tried obtaining the XRController or ActionBasedController components from the GameObject but there is no Move property or action or whatever it is named.
Thank you! Great channel by the way!
You wouldn't be able to get it through the controller easily. It would probably be more straightforward to reference the Input Action Asset directly.
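A minimal sketch of that approach, using the map and action names from the comment above (adjust them to match your own asset; the field name is hypothetical):

using UnityEngine;
using UnityEngine.InputSystem;

public class MoveReader : MonoBehaviour
{
    public InputActionAsset inputActions; // assign your Input Action Asset here

    private InputAction moveAction;

    private void Awake()
    {
        // Look the action up by its map and action names; these must match the asset exactly.
        moveAction = inputActions.FindActionMap("XRI LeftHand").FindAction("Move");
        moveAction.Enable();
    }

    private void Update()
    {
        Vector2 move = moveAction.ReadValue<Vector2>();
        // ... use the value ...
    }
}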
A stupid question: does this work on the Oculus Quest as well? I'm thinking OpenXR is Valve's API?
The nomenclature is a little confusing. Valve's API is OpenVR, not to be confused with OpenXR, which is the Khronos Group's standard for VR and AR. The goal of OpenXR is to let developers target many devices. OpenXR is supported on both the Quest and the Index, on many other devices, and probably many more to come. That's my understanding of it all; I'm not very experienced.
This is true, and OpenVR now also follows the OpenXR standard, from my understanding.
@@VRwithAndrew & @Tri Stan, thank you guys for clearing that up :)
thx Andrew!!
You're welcome!!
Thanks for this, Andrew. Your videos are a big help and time saver! 👍 Maybe you could do a deep dive into the "Interactions" area of the XR Toolkit. I'm absolutely not sure if I'm using it correctly, as I can only make one Interaction work at a time for a given Action (e.g., I can't combine "Tap" with "Multi Tap" or "Press" Interactions on one button). The official documentation is a bit confusing, and I haven't found anything else to solve my problems yet. 🤪 Best wishes for your channel!
Thanks! I'll see if I can look into it. A lot of it is just trial and error, and going through some of the code in the package.
Great vid, Andrew. Are you going to be doing one about getting thumb touch & finger tracking working on OpenXR?
Yeah, if it becomes available; I'm not sure about the status of it right now.
@@VRwithAndrew Actually, after writing that, I saw that apparently, according to Unity, thumb touch should work but not the other fingers (not sure about the index finger). Unity told me Valve would have to release an extension for this to work.
If I wanted to update my scripts but change as few lines as possible... what would be the quickest way, using the "new input system", to get the status of the Left Controller Trigger Button, etc.? I have an entire system that needs updating now; it updates the status of buttons in one function and fires events (very complex but very fast), and I would prefer to keep this over Unity's system, as theirs seems to bog my system down a bit more.
There are a lot of different ways of doing it. You may want to look at this Quick Start guide to see what option may work best.
docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/QuickStartGuide.html
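As a low-churn option along the lines of that guide, you can poll an action's current value each frame from your existing update function instead of wiring up events. A minimal sketch, assuming an action bound to the left trigger and enabled elsewhere (field name and threshold hypothetical):

using UnityEngine;
using UnityEngine.InputSystem;

public class TriggerStatus : MonoBehaviour
{
    public InputActionReference leftTrigger; // Value > float action bound to the left controller trigger

    private void Update()
    {
        // Poll the current trigger value (0..1) and derive a pressed state.
        float value = leftTrigger.action.ReadValue<float>();
        bool pressed = value > 0.5f;
    }
}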
@@VRwithAndrew Any chance you are building a VR platform to sell on the Unity Asset Store? xD
BTW, good advice going out in the Unity Discord :) I'm sure a lot of people appreciate it.
@@RickMcMichael Maybe at some point! I do have a Patreon where I've been keeping some projects updated. But I have a lot of other projects I'd like to explore until the XR Toolkit releases officially.
Hi Andrew, would you be able to do a video on getting the Vive trackers working in this new system? I am really struggling to get the trackers working.
I've seen this request a lot, and I still need to look into it. It may be more beneficial than I expect. However, I only have one tracker so I probably can't go into too much detail on it.
@@VRwithAndrew Hi, and thanks for the reply. Even getting one working would be highly beneficial, and I would appreciate it so much.
great video, and beard 😊
Haha, thanks Paul :)
How would you adjust the script so that you can only toggle or change color while the object is being hit by the ray, so that not every action causes the change?
That's when you would need to go through the Inputs on the Controller. It isn't super easy to make a custom controller. But there may be a simpler alternative: have the script check for a target interactor and whether it's hovering over or selecting an object, then do the additional functionality.
These videos on the XR Toolkit are perfect! As a newbie to Unity, it's really hard to figure this out from the docs (which are sometimes obsolete), so thank you, you've saved heaps of time. One question though: I have a frustrating behaviour where, when the game starts, most of the time the HMD looks at a fixed point with two lasers coming out of each eye. It doesn't matter where I move my head, the visual is identical, and not tracking. It's like it doesn't reliably start. When I take off the headset and look at the PC window, it is usually semi-functional, in that the headset still points at a fixed location, but at least the controllers and their laser pointers work. Occasionally it does work correctly from the start. Does this sound like I've set up something wrong? Any tips on what to look at?
I should add that it's an HP Reverb on Windows Mixed Reality. It works fine in all other VR apps.
@@mrmorphic Yeah, unfortunately I don't have much firsthand experience with WMR. I can only say to double-check your Unity settings and try giving OpenXR or the standalone WMR settings a shot.
@@VRwithAndrew All good. Playing around with it, it seems to be related to the input system. In WMR you can toggle inputs between desktop and VR, and it works once I start playing and toggle it. I'll see if there is a way to force it.
Thank You so much!!!!!!
Happy to help!
Many thanks
You're welcome!
Nice Tutorial
Thanks, homie
Thank you!!
You're welcome!
Thanks alot!
Happy to help!
My CommonUsages doesn't look at all similar to yours. Do you know why this might be?
I'm not sure, I don't know if there have been any big changes to the Input System. What version of Unity are you using?
Hey, it seems like it doesn't fully work for me. I wrote the toggle script and mapped it to the left controller's primary button. In play mode, whenever I press it, my game object disappears, but it won't go back to the active state again. It stays disabled. How do I fix it? I did everything like you did in the video :/
You may just need to double-check the toggle code so that the bool is flipping properly. Check that the exclamation point is being used on the isActive line.
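A minimal sketch of what that flip should look like, assuming a bool field, a target GameObject, and an action enabled elsewhere (names hypothetical):

using UnityEngine;
using UnityEngine.InputSystem;

public class ToggleObject : MonoBehaviour
{
    public InputActionReference toggleAction;
    public GameObject targetObject;

    private bool isActive = true;

    private void OnEnable() => toggleAction.action.performed += Toggle;
    private void OnDisable() => toggleAction.action.performed -= Toggle;

    private void Toggle(InputAction.CallbackContext context)
    {
        // The exclamation point flips the bool on every press.
        isActive = !isActive;
        targetObject.SetActive(isActive);
    }
}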
The way you implemented the color-changing functionality, it's like when you press the trigger to its maximum value the color changes, and as soon as you release it, it changes back. I want to do something where, when I press the trigger, the color changes according to the trigger value, but it shouldn't revert when I release the trigger. Can anyone help me with that?
I'd probably have a variable in the script for holding onto the max amount the trigger has been pulled. So, when the trigger is pressed, I'd check to see if the trigger value we're getting is greater than the max value. Then, update the material and update the max amount variable.
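A minimal sketch of that idea, assuming a Value > float trigger action enabled elsewhere and a renderer to tint (field names and colors hypothetical):

using UnityEngine;
using UnityEngine.InputSystem;

public class TriggerColor : MonoBehaviour
{
    public InputActionReference triggerAction; // Value > float action bound to the trigger
    public Renderer targetRenderer;
    public Color startColor = Color.white;
    public Color endColor = Color.red;

    private float maxTriggerValue;

    private void Update()
    {
        float value = triggerAction.action.ReadValue<float>();

        // Only ratchet the value upward; never revert when the trigger is released.
        if (value > maxTriggerValue)
        {
            maxTriggerValue = value;
            targetRenderer.material.color = Color.Lerp(startColor, endColor, maxTriggerValue);
        }
    }
}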
I'm confused about the joystick controller.. how do I detect it?
You'd need to set up an action of type Value > Vector2. You can see an example of this within the Default Actions if you look for the "Turn" action.
The code was a bit too much but I'm starting to get the hang of it. Could you put the output before the start of the video/coding? It made more sense when I realised what you were implementing.
Gotcha! I used to have comments outlining the code before actually doing the implementation. But, I seem to forget something when I do change up the video style.
Heyy!! Thank you for this tutorial! :) I have an issue with it :( On Link, everything works properly, but once built for the Oculus Quest 2, the controllers (only the buttons) do not respond :( Do you know how I could fix it?
In the XR Plugin Manager, there should be a tab for your Android build. Make sure you've set that up as well.
Thanks
You're welcome
That was helpful! May I ask how old you are?
Ahaha, I'm 31.
Hey, can I have the link for this project? Thanks, great WORK ^_^
How do I deal with holding buttons?
You can try looking into isPressed, or in the Input Action Manager, you can add an Interaction called "Hold" to an Action.
@@VRwithAndrew I've tried adding the Hold interaction but it seems to not work. Here's the script I'm using:

using System.Collections;
using System.Collections.Generic;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.InputSystem;
using UnityEngine;

public class EchoMovement : MonoBehaviour
{
    public Rigidbody playerRigidbody;
    public InputActionReference boost;
    public ActionBasedController controllerRight;
    public ActionBasedController controllerLeft;
    public float boostForce;

    private void Awake()
    {
        boost.action.performed += BoostRight;
    }

    private void OnDestroy()
    {
        boost.action.performed -= BoostRight;
    }

    private void BoostRight(InputAction.CallbackContext context)
    {
        // ReadValue needs its value type; the rotation action returns a Quaternion.
        Vector3 targetDir = controllerRight.rotationAction.action.ReadValue<Quaternion>() * transform.forward;
        playerRigidbody.AddForce(targetDir * boostForce * Time.deltaTime);
    }
}

The Hold interaction is on the action, not the keybind, and it is set to Button with Initial State Check off.
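If the goal is a continuous boost while the button is held, one alternative to the Hold interaction (which fires performed once after the hold time) is polling the action each physics step. A sketch of a replacement method using the same fields as above, assuming Input System 1.1+ where InputAction.IsPressed() is available:

private void FixedUpdate()
{
    // IsPressed() stays true for as long as the button is held down.
    if (boost.action.IsPressed())
    {
        Vector3 targetDir = controllerRight.rotationAction.action.ReadValue<Quaternion>() * transform.forward;
        playerRigidbody.AddForce(targetDir * boostForce);
    }
}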
🖐️Can you make a video tutorial on 'using hover' in the next video?💯
📍The reason: we cannot run mouse code in VR.
🥽You can also give a hint.
Ahaha, probably need a bit more info on what you mean by "Hover".
@@VRwithAndrew :))) There are many tutorial videos on YouTube. Almost all of them are mouse-operated code. The same does not work with the VR touch controllers. :) Or I could not get it to run. :)))
Nicee
Very nice!
Thank you for this video.
I am a beginner, and I just can't toggle a particle system on and off in VR. Could you or somebody else help me, please?
I am about to go insane.
Early gang
Gang
Thanks Andrew!!
Happy to help!