I may sound completely silly when I ask this, but how come you can't have it assume that close to zero movement EQUALS zero movement? As in, configure a bit of a dead zone, like < 1cm per second = 0m/s. That would never work for good head tracking in VR but it would stop this drift surely. Ultimately it remains unfit for VR but it is better than nothing at all, isn't it?
The basic property required of positional tracking is that if you start at point X, then move away to point Y, and back to point X, the tracker knows that you're back at the original point. Now say you implement a dead zone of 1cm/s. You start at X, move quickly one foot to the right, and then slowly back to the left, at less than 1cm/s. So on the return trip your motion is ignored by the dead zone, and even though you're back at X, the tracker thinks you're still one foot to the right. That's the fundamental problem.
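To make that failure mode concrete, here's a minimal simulation sketch (my own illustration with made-up numbers, not code from the video): a fast move to the right followed by a slow return below the dead zone leaves the tracker permanently offset.

```python
import numpy as np

# Hypothetical dead zone: any speed below 1 cm/s is treated as zero.
DEAD_ZONE = 0.01  # m/s

dt = 0.01
t_out = np.arange(0.0, 1.0, dt)    # 1 s moving right at 30 cm/s
t_back = np.arange(0.0, 60.0, dt)  # 60 s creeping back at 0.5 cm/s
velocity = np.concatenate([np.full(t_out.size, 0.30),
                           np.full(t_back.size, -0.005)])

true_pos = np.cumsum(velocity) * dt
# Apply the dead zone: slow motion is discarded entirely.
filtered = np.where(np.abs(velocity) < DEAD_ZONE, 0.0, velocity)
tracked_pos = np.cumsum(filtered) * dt

print(f"true final position:    {true_pos[-1]:+.3f} m")    # back near 0
print(f"tracked final position: {tracked_pos[-1]:+.3f} m")  # stuck ~0.30 m off
```

The user is physically back at the start, but the tracker thinks they are still 30 cm to the right, exactly as described above.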
You can run a band-pass filter, cutting the high-frequency noise and also the very low-frequency drift. Working with high-quality IMUs specifically for position has worked well in the past, and in my experience a good Kalman filter works great along with band-pass filters.
Aircraft and submarines have used precise IMUs for decades. The old ones used mechanical gyroscopes and the new ones use ring laser gyros. How can it work for aircraft, but not for this?
The positional drift of a high-quality inertial navigation system as used in ships or aircraft is on the order of 0.6 nautical miles per hour -- according to Wikipedia, so make of that what you will. But 0.6 nm/h is roughly one foot per second, which is pretty similar to what we're seeing here.
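For what it's worth, the unit conversion checks out; a quick sanity check:

```python
# Convert the Wikipedia figure of 0.6 nautical miles/hour to m/s and ft/s.
NM_IN_METERS = 1852.0
FT_IN_METERS = 0.3048

drift_m_per_s = 0.6 * NM_IN_METERS / 3600.0
drift_ft_per_s = drift_m_per_s / FT_IN_METERS
print(f"{drift_m_per_s:.3f} m/s = {drift_ft_per_s:.2f} ft/s")  # ~0.309 m/s, ~1.01 ft/s
```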
okreylos Good point. 0.6 nm/hr isn't too bad for an aircraft, but for something sitting in front of a TV it wouldn't work. Aircraft can "realign" their INU in flight using GPS or ground-based radio navigation aids, which is the equivalent of what you are doing with an external 3D reference.
Did you try subtracting the mean, to compensate for noise + gravity? I can see that it's actually working really well. Maybe it just needs some good DSP. There is also the possibility of training a model to compensate for the drift, by training it against an external reference, like a camera. Which is exactly what these guys have done, and they got really exciting results - ua-cam.com/video/IroLp5VOPDE/v-deo.html (I've also been trying to do this for a long time)
I have a couple of questions, as I notice the drift error appears to follow a specific linear trajectory. First, did you ensure the accelerometer was calibrated before you began using it? These units are not always calibrated in manufacture, that is, the physical alignment of the sensor axes at the lowest level, both relative to each other (orthogonality) and to the housing (the gravity vector lying perfectly on the -z axis when the unit is sitting on a level surface). Without having done this, you would expect a constant source of error that is not the result of noise, yes? Also, how have you set up the position calculation: does it integrate the total historical signal, or is it iterative over a very small recent window? In any noisy system, windup will occur if all of the past samples are being integrated twice. You will still have windup error, but it should be minimized relative to the sample window size? I know that in theory dead reckoning with an IMU alone will always have errors without an external reference, but as you say, you spent little on this. Could the results not be considerably improved with some effort? I am sitting down to start implementing this myself currently. One idea I am toying with is to set a trigger level near the noise threshold so that zero velocity is maintained for readings that are near zero. This would introduce discrete errors at starts and stops, or for extremely slow and sustained motion, but overall an improvement, and likely suitable for many applications? Furthermore, retroactive extrapolation could be used to 'fill in' the threshold dead zone using the velocity upon crossing the threshold boundary... Also, another idea: create a frame of reference based on the assumption that the device never leaves a particular workspace, for example a user sitting in a chair. Any arbitrary 'global' reference point could be set, and the entire derived trajectory could be transformed to never stray too far from the origin.
Somewhat of a crude KF process model making use of some known assumptions about the application, similar to using footfalls during walking to set zeroed points. For an application using a VR headset that involves the user sitting, you at least know the user's butt will never move, therefore the entire trajectory should lie within some general area. Any detected drift towards the boundaries could be evaluated, and perhaps the magnitude could be distributed and subtracted across the whole historical signal? Sort of a 'counter drift' to drive back towards zero. Approaches like this could never work on a boat or a plane, and to a large degree not in a car (there is no environmental current, and the wheels cannot traverse laterally in this case), but I would think that in many human applications, human behavior may be predictable to some degree, varying from application to application of course. Also, as another person mentioned, two IMUs where the fixed distance between them is known could be used to compensate between units. I would think that should be a de facto standard in the construction of any device like a VR headset; they are small and cheap enough to warrant the extra hardware given the potential improvements in application?
Thanks for the demonstration and explanation! But isn't the problem still "just" a practical problem of increasing precision? With a threshold value for velocity, and maybe filtering out movements that follow a straight line (~= drift), you could remove this drift problem. Then the only problem remaining would be precision. If you increase precision / update speed enough (maybe by a factor of 10, maybe by just combining more IMU chips in different orientations), then it could be usable. You would still get the same errors of course, but they could be small enough not to affect the experience.
If you have n IMUs (n being something large like 10 or 20), would their drift almost cancel out, or would they all drift in the same direction? I have a feeling they would cancel out, as this is random error and not systematic error.
@@danteregianifreitas6461 50 separate IMUs, or the same one sampled 50 times every 10 ms? The moment-to-moment drift of a specific IMU seems to be consistent-ish, but it does seem plausible that many separate IMUs could average out to a usable level.
Could you go more into depth about how you created the orientation tracker, or link me to some resources? I understand that you integrate the gyroscope readings and combine them with the down vector from the accelerometer and the readings from the magnetometer, but did you use a specific algorithm such as the Madgwick filter or the Kalman filter? If not do you know where I could find resources on the specific formula you used? I'm trying to implement a similar tracker with my own IMU. I know this video is a bit old, but I hope you still see this
The orientation tracking algorithm in my software has been changing over the years, and I don't remember exactly what it was when I recorded this ad-hoc video. The current version is an improved Madgwick filter (I simplified the way he integrated magnetometer readings), but I think the video is still using the original version. Either way is a good starting point. Orientation-only tracking or positional dead reckoning (which doesn't work, as shown here) does not need the full power of the Kalman filter. That only comes into play when you want to fuse IMU predictions and corrections from an absolute 3-DOF or 6-DOF tracking system: ua-cam.com/video/-nsylEpgVek/v-deo.html
@@okreylos Ah okay, that linked video was super informative, thank you so much. So I have been trying to implement a Madgwick filter, and it's approximately correct, but I get a large amount of drift: a full rotation leads to about 20 degrees of heading drift. I am using a cheap Arduino that comes with an IMU sensor (LSM9DS1). Do you think there is an error in my implementation, or could it be that I need a better sensor? Thanks so much.
The noise seems to be accumulating in one direction. If you could compensate for it, maybe it could work. You could calibrate the compensation by measuring the noise at a static position.
The drift looks like a constant acceleration, as with gravity, and the real motion like a variable acceleration. Like gravity, the two motions seem to just add together.
I want to make an indoor positioning program for an autonomous robot (on a 2D map, as if you were seeing it from above). Do you think it'll have the same kinds of problems?
I want to develop a motion capture system with an IMU. I want to know which algorithm is better for sensor fusion: AHRS or EKF? One more thing: I also want to know which software is required for displaying the result on a PC or laptop. Is there any specific software, or which software did you use for displaying the results?
Awesome results, man, thanks for saving my time and many other people's time too. Sometimes we need proof that things don't work rather than proof that they do. Thumbs up.
Is it possible to correct the drift of the IMU by setting an external magnetic sensor as the reference point (zero) of the magnetometer in the IMU? For example, when tracking the position of the wrist, by having the magnetic field source next to the magnetometer of the IMU, the zero or reference point would always stay at the position of the wrist.
Excuse my ignorance, but would this not work if you wrote in a starting threshold value that is higher than the drift value? The drift would be below the threshold and therefore would not register as movement, but when really moving your head you would be above the threshold, and therefore the movement would be registered.
It would work, but one of the reasons for positional tracking is to prevent motion sickness, and therefore tracking small movements like swaying back and forth slightly is key. So it wouldn't be practical unless you had a high motion sickness tolerance.
Due to integration, very small accelerations add up to large displacements over time. If you threshold away the small ones, you create large offsets in position after a while.
***** This is noise that is inherent to integrating any variable to obtain a result value. Because you are not using fixed coordinates to compare against (i.e., a fixed camera), you are only measuring ACCELERATION in any given direction. So ACCELERATION needs to be integrated twice to get POSITION, and hence the conundrum. Edit: It needs to be noted that there are no means available today, or dare I say in the future, to directly measure POSITION or VELOCITY without reference frames. You can blame Einstein for that. lol. I think you are talking about "measurement noise", which is also inherent to any measurement. That noise, however, is not the problem we are facing here. They have that down to a pretty good state by using higher-frequency sampling and interpolation.
A few things: the rotation doesn't drift because you can always tell down with gravity, and the magnetometer helps too. As for linear drift, you always seem to drift to the right, indicating that something isn't calibrated properly. When properly calibrated, you drift in random directions.
How come one of the videos on Madgwick's channel using his sensor fusion algorithm shows positional tracking using just one IMU? Am I missing something?
Okay, I'm late to the party on a 1-year-old comment that's comparing this 5-year-old video with a mention of a 10-year-old video... but: * SebMadgwick is assuming that the footsteps he's recording cover a uniform distance, so the graphs are probably incorrect. * Also, he probably has drift in his sensors as well, but the drift is probably negligible in the visualization, or calibrated out. * It looks like the footsteps are assumed to be forward in Y. Basically he created a pedometer with a gyro attached for direction... as far as I can tell. It's still impressive, but it won't be accurate enough for real-world use.
What about two IMU sensors? Is there a possibility to measure the distance between those two sensors? That could be useful for motion tracking. If you could get the position of all IMUs referred to one IMU (for example, one at the foot), you would still have drift, but the motion would be correct.
Only if the two IMUs are connected by a rigid body, or a linkage of rigid bodies with one IMU per link. That way you can deduce spatial relationships from the IMUs' orientations. That's why IMU-based skeletal tracking can estimate the spatial relationships of limbs, but not the position of the entire body relative to the ground.
okreylos So, are you saying that if you had a headset with an accelerometer and gyroscope on either side, you could get the amount the user moves right, but the results would only be relative to where they were when the program started reading the data?
***** No, that doesn't work. What I mean is that if you place an IMU on your upper arm, and one on your lower arm, then you can track the position of your wrist relative to your shoulder.
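A minimal sketch of that idea, under assumed segment lengths and simplified planar rotations of my own choosing (not the author's code): each IMU supplies its segment's orientation, and chaining the segments gives the wrist position relative to the shoulder.

```python
import numpy as np

def rot_z(deg):
    """Rotation about the z axis (motion simplified to a plane for this sketch)."""
    r = np.radians(deg)
    return np.array([[np.cos(r), -np.sin(r), 0.0],
                     [np.sin(r),  np.cos(r), 0.0],
                     [0.0,        0.0,       1.0]])

# Assumed segment lengths in meters -- known because the linkage is rigid.
UPPER_ARM = np.array([0.30, 0.0, 0.0])
FOREARM   = np.array([0.25, 0.0, 0.0])

def wrist_relative_to_shoulder(R_upper, R_lower):
    # Chain the segments: rotate each bone by its IMU's orientation and add.
    return R_upper @ UPPER_ARM + R_lower @ FOREARM

# Arm straight out: wrist is 0.55 m from the shoulder along x.
straight = wrist_relative_to_shoulder(rot_z(0), rot_z(0))
# Elbow bent 90 degrees: the forearm now points along y.
bent = wrist_relative_to_shoulder(rot_z(0), rot_z(90))
print(straight)  # [0.55 0.   0.  ]
print(bent)      # ~ [0.30 0.25 0.  ]
```

Note that nothing here gives the shoulder's position in the room, which is exactly the limitation being discussed.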
Do you get any reduction in the drift if you combine the measurements of more than one IMU connected by a rigid body, or would there be just as many jitters getting combined as there would be canceling out on average? If that does reduce drift, how many would you need to combine to get less than a millimeter of drift within 1 hour?
I used Runge-Kutta and it doesn't drift. It only moves a little bit sometimes, but comes back again. However, I have problems with roll/pitch/yaw, but it might just be my problem, because I'm pretty much a novice. I used the MPU-9250 breakout board from SparkFun.
Looks like an oldie, so just curious if you fixed it, or are still looking. For the latter, my 2 cents: the position cannot be determined with an accelerometer because uniform motion cannot be detected. A submarine has means to determine its speed, therefore it can blindly calculate the position change from the last known position; they mostly go straight. In your case, you have no means to detect constant speed; you can only detect acceleration, from which you may be able to calculate distance traveled, but when acceleration levels down to noise you can't tell you're drifting. Best would be to assume that a gaming headset doesn't move away in the room, only calculate the attitude using the accelerometers, and forget the drift.
I don't know anything about anything, but couldn't a built-in compass deliver a fixed-direction vector (even if it happens to point towards some overpowering stationary magnetic object rather than the geographic pole) which the others could update and calibrate against?
Thank you very much for this video. I think I have a firm grasp of the problem now. Alas, the conclusion is, "problem not solvable without an external reference frame". Bummer :( There goes the standalone VR glasses dream. You can always integrate cameras into the Rift itself to detect external markers, but that's very impractical at the moment.
What if you added information from just one camera (most people have a webcam) tracking the user, could you get enough information for this to be useful. Or do you need a better method for this to be useful at all?
Yes, single-camera tracking can be done. If you do camera tracking, merging in data from the IMU allows you to reduce the total latency of the tracker. It's what Oculus are doing with the DK2.
okreylos Great video, the guys at AntVR should have watched that before claiming full positional tracking :) I just posted it on MTBS3D, and I understand AntVR are already shipping. Let's see what they deliver... I'd say Oculus should put a camera on the ceiling to have full 360-degree tracking, or at least have that as an option. The current implementation is only good for "Wheelchair VR" :)
konstantin lozev I made this video in response to AntVR's claims (which are of course completely bogus). What they call "one-step positional tracking," I call "virtual emetics."
This is an excellent analysis. Have you thought about non-obvious ways to overcome this? For example: is there a "virtual" external reference frame that could be used?
+Jason Leung This isn't using Oculus' SDK at all. It's using a sensor fusion algorithm I made up. I've recently done a lot more research into this, and it turns out that my algorithm is on the same level as Madgwick's well-known fusion algorithm quality-wise, but requires about twice as many ALU operations per iteration. I've ditched it in favor of an improved version of Madgwick's algorithm since I made this video.
If you have an external system that can measure the IMU's absolute position in space at regular intervals, then you can correct drift. That's precisely how HTC Vive's and Oculus Rift's tracking systems work, using lasers and cameras, respectively.
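As a toy illustration of that correction idea (a hypothetical sketch, not how Vive or Rift actually implement their fusion): dead-reckon continuously, and whenever an absolute fix arrives, blend a fraction of the residual back into the estimate so accumulated drift stays bounded.

```python
# Hypothetical drift correction: nudge the dead-reckoned estimate toward an
# absolute (camera- or lighthouse-style) fix each time one arrives.
def correct(estimated_pos, absolute_fix, gain=0.2):
    # Blend a fraction of the residual back in; repeated fixes bound the drift.
    return estimated_pos + gain * (absolute_fix - estimated_pos)

pos = 1.5    # drifted dead-reckoned estimate (m)
truth = 1.0  # what the external system reports
for _ in range(20):
    pos = correct(pos, truth)
print(round(pos, 4))  # converges toward 1.0
```

The gain trades smoothness against correction speed; real systems do this with a Kalman-style filter rather than a fixed blend.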
Hey, I have a question that I figured I would ask you, since you gave me such a great answer to another question. I have developed a 3-DOF positional tracking algorithm for my project using the Madgwick filter, and it works pretty nicely, with very little drift, but I have an issue where the positions of things in my 3D simulation depend on the position of the IMU when the algorithm starts running. I want everything to be in the same place every time, so that things in the virtual world always line up with the same places in the real world. I'm pretty new to 3D math and not super comfortable with quaternions, so I convert the quaternion values into pitch, yaw, and roll before passing them from my sensor into my 3D simulation and applying them to my camera. I've heard that pitch/yaw/roll is generally expressed as a rate of change rather than an absolute orientation, and I thought that if I used the quaternion directly rather than converting to Euler angles, I could get an absolute orientation that wouldn't depend on the starting position, but after doing so my tracking is super off. I think there's probably an error somewhere, in unit conversions or the coordinate axes being flipped or something, that I can fix by doing some research on quaternions and 3D rotation conventions, but I'm not sure if I'm on the right track, and whether using quaternions instead of Euler angles will accomplish what I want. Do you have any tips, suggestions, or resources on how to accomplish this? Thanks!
The gyro and accelerometer are going to give you your position in the universe, but the earth is spinning and rotating around the sun, the sun spins in the galaxy, and the galaxy moves around (even if we don't feel it, we are moving very fast in all kinds of directions). If you don't compensate for those displacements, of course the readings will drift. Did you try to place an IMU in a fixed reference location (like your desk) and subtract its apparent drift from your moving IMU?
On an ideal gyro, the problem is not the gyro drifting, but the earth drifting in angle and location in the universe while the gyro reference remains fixed. This can be compensated either by computing your absolute position in the universe (pretty easy for the earth's rotation on itself and around the sun; more difficult for the position in the Milky Way and the Milky Way in the universe), or by having a reference gyro at a fixed location and subtracting its apparent "drift" from your moving gyro. But I'll agree with you: cheap gyros are far from ideal.
So, even with sensor fusion using an accelerometer, gyroscope, and magnetometer, it still wouldn't be enough? What if we wanted to track an iPhone's position and we were using the camera sensor? Would tracking pixels do any good, using even more sensor fusion to compensate for drift and get accurate displacement from a set origin?
Are the movements around the sun even that significant? You have a rotation rate of ~1°/day, and the acceleration shouldn't even be measurable, I think. The next bigger movements (e.g. in the Milky Way) should be even less significant. For most applications it should suffice to consider the rotation of the earth itself. Everything else is less than the drift of even the best sensors today, or at least of these MEMS sensors. Also, you can simulate these sensors with all noise effects in a perfectly stationary environment, and they will still drift.
+Nathan Imig I tried Euler and Verlet integration. Both gave very similar results. The fast drifting problem is not due to integration error build-up, but due to errors in estimating the direction of gravity based on noisy sensor readings.
+okreylos So I don't get how your reference frame is stable but your gravity vector is noisy; that shouldn't be. I assume you're using Euler angles, filtered through fusion, to get the orientation of the gravity vector, but then you used raw acceleration to take out the gravity vector?
+okreylos If you're not filtering the accelerometer somehow before removing gravity, you're going to just reintroduce the noise (this was the same noise that was taken away with the Euler angles by fusing the angles from the accelerometer and the angles from the gyro). If you could isolate the noise from the accelerometer, it may be possible to run it through a low-pass filter to remove more noise.
+okreylos Oh, and I'm not sure you can really say the integration technique isn't introducing a reasonable amount of error. It will always introduce error, especially on impulsive movements.
+okreylos If it were true noise, it would not always be lagging the distances... i.e., draw a circular path and see if it always spirals inward, always spirals outward, spirals both inward and outward, or stays circular (that would be the easiest test).
Thank you, Oskar, for this great video. Even though I'm disappointed, you saved me a lot of hours hassling around with IMUs :). I would like to track multiple drones cheaply. How come these guys have such good results? 3D Tracking with IMU
Since their IMU is foot-mounted, they can detect the impact when the foot hits the ground, and reset all velocities to zero at that point. That way drift can't build up like it does here. They only need to semi-reliably track displacement during the gait phase when the foot is off the ground, and that's pretty short during sustained walking.
This is not a problem that can be solved, not without an external reference frame. This is a dead end. End of story, I'm afraid. If you want to understand why exactly, you should have a look at my previous comment. But this video did a pretty good job of explaining what's going on.
You mean x(n+1) = 2*x(n) - x(n-1) + a(n)*dt^2? The a(n)*dt^2 term is the result of double integration: v = Int a dt; x = Int v dt. Notice how the factor after a(n) is dt^2, not just dt.
Correct, but dt^2 can be computed only once, since it is assumed to be constant. Much simpler, and faster to compute: you can just precompute dt^2 = dt*dt.
I guess I did not clearly spell out my point. The dt^2 indicates that Verlet integration is, in fact, double integration. Whether dt^2 can be pre-computed or not is irrelevant. The main reason for drift is that the acceleration function a(t) is affected by sensor noise (and orientation mis-estimation, but let's ignore that for a second). If you integrate only once, as when getting orientation from angular velocity measured by the gyroscopes, sensor noise becomes a random walk, which means drift accumulates linearly with time (twice the time span, twice the drift). Acceleration, on the other hand, is integrated twice (doesn't matter if via Verlet integration or Euler integration or Runge-Kutta integration), which means noise first becomes a random walk, and then an integrated random walk, which accumulates quadratically with time, i.e., twice the time span, four times the drift. That's the issue.
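This difference can be seen numerically; here's an illustrative sketch (arbitrary noise level and seed, not real IMU data) comparing how drift grows after single versus double integration of white noise:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n, trials = 0.01, 20_000, 200   # 200 s at 100 Hz, 200 Monte Carlo runs
drift_once, drift_twice = [], []
for _ in range(trials):
    noise = rng.normal(0.0, 0.01, size=n)  # white sensor noise
    once = np.cumsum(noise) * dt           # single integration (gyro-style)
    twice = np.cumsum(once) * dt           # double integration (accelerometer-style)
    drift_once.append((abs(once[n // 2 - 1]), abs(once[-1])))
    drift_twice.append((abs(twice[n // 2 - 1]), abs(twice[-1])))

o = np.mean(drift_once, axis=0)    # mean |drift| at time T and 2T
t2 = np.mean(drift_twice, axis=0)
print("single integration, T vs 2T:", o, "growth:", o[1] / o[0])
print("double integration, T vs 2T:", t2, "growth:", t2[1] / t2[0])
```

The doubly integrated signal's drift grows markedly faster with elapsed time than the singly integrated one, which is the point being made above.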
Yeah, I just saw that today. I'm curious why I haven't found videos of other people using your software for that. I found a video doing something similar, also today, at the channel Convrge Team. Exciting stuff! :D
blaxalb I don't currently have good tutorials on how to set up a good multi-site capture and communication system. That's probably the main reason why there aren't any takers. Convrge are doing very similar things to me, but I'm not sure if they rolled their own software, or are using mine for the backend. Looks exactly like mine, but that doesn't mean anything.
okreylos Makes sense. BTW, I found the Convrge video through a comment on a recent reddit post by rogeressig reposting your 3-Kinect video. The comment said they're planning to release a download in 2 months, so it seems they're working on a version; don't know how much they may have gotten from you, though :)
You wrote in your description: "When tracking position based on accelerometer data using dead reckoning, drift accumulates quadratically, meaning that the speed of drift increases proportionally to time passed."
"Well, an accelerometer doesn't drift" And that's not what I said. What I said is that the difference between actual position and estimated position grows quadratically over time, due to double integration of a noisy signal. "Did you apply an offset in order to compensate for drift?" But you said accelerometers don't drift? I know what you mean. Yes, of course there is offset compensation, established during an initial calibration phase described here: doc-ok.org/?p=639 "filtering the accelerometer-values with a low-pass filter might help" That is not an appropriate approach for position dead reckoning. "Are you aware that tilting the platform will distort the accelerometer reading (because of the earth's grafity)." Yes, that is exactly the main reason why this does not work.
Thanks for your answer. However, accelerometer data is in most cases filtered inside the IMU, and usually such filters are programmable. Could you somehow fix the problem shown in your video? I wonder how the distortion of the accelerometer reading caused by the earth's gravity could be removed. Is this caused by misalignment of the accelerometer die due to manufacturing tolerances? I have read that very expensive MEMS IMUs are much better suited for dead reckoning (price around $3000). This one might just work perfectly for dead reckoning: www.analog.com/media/en/technical-documentation/data-sheets/ADIS16490.pdf They are called "tactical grade" IMUs.
The problem is that accelerometers measure gravity in local IMU space, so in order to remove gravity, you need to know the IMU's instantaneous orientation. But IMU orientation itself is only an estimate, reconstructed from gyroscope measurements and drift-corrected by the accelerometers. I am curious how much difference there would be between the IMUs used in current VR devices, and "tactical-grade" IMUs. The latter might buy you some more time before drift gets too large, but they don't fix the fundamental issue.
So I have a question. It sounds like pure IMU-based positional tracking does work, but the rapid drift makes it unusable. Couldn't you compensate for the drift by adding an extremely basic, low-CPU-load Computer Vision based tracker? Just the bare-minimum quality to undo the drift? John Carmack seems to be convinced that inside-out tracking is an extremely difficult problem that will take years to figure out, if it's ever figured out. But maybe he's trying to do it purely with Computer Vision? Maybe IMU + very basic CV to correct drift is the right solution?
That's exactly how Vive's Lighthouse and Oculus' Constellation, and Microsoft Hololens' inside-out tracking system work: IMU with dead reckoning for low-latency pose estimation, and an optical system for drift control. Getting a vision-based tracker to the accuracy and robustness required for VR tracking is very difficult, even when using IMU tracking to help.
I did combine the Kinect v1 with some old Augmented Reality Glasses (Vuzix 920AR) and got nice results, you can check the video on my channel :) (just keep in mind that the IMU in the glasses got damaged before doing this experiment, so I had to use the IMU in my phone and send its data to my laptop over wifi, in the previous demo the glasses IMU was working but I wasn't using the Kinect)
Turns out it's less double integration and more gravity that's at fault. See this related video: ua-cam.com/video/-nsylEpgVek/v-deo.html If you did it in free fall or outer space, it wouldn't be too bad.
Hi okreylos, are you interested in working as a freelancer on an IMU positioning project? We struggle with this issue and are looking for experienced candidates to solve it.
+boggers I don't believe it until there's a physical implementation, shown to work. The main source of drift is not random accelerometer error, but mis-estimation of gravity due to quick rotations or gyro drift. I'd guess that the guy's multi-IMU simulation does not take that into account, and therefore does not reflect reality.
+okreylos Thanks very much for your input. (I'm that guy btw) You're right, I definitely need to build it to see if the error from real sensors can be squashed as easily, I expect it will be harder, but possible. Talking to some people now about building a prototype.
+okreylos Why do you say that the main source of drift is not random accelerometer error? If it were so, it shouldn't happen if you simply put the Rift on a plane and moved it along that, but this happens even when the headset is staying put. Or do those gyro errors happen randomly even when stable and get bunched up in the... uhh, Kalman filter, I guess? Also, it just got me thinking: if it were about truly random accelerometer errors, they should be uniformly distributed, thus making errors cancel out over a long enough time frame... or at least not compound to infinity. It seems that is not happening. Anyway, trickier than it looks. Thanks for the video.
Vlad Radulescu If accelerometer error were normally distributed zero-mean error, it would not cancel out. Accelerometer values are (doubly) integrated over time. Integrating normal random noise doesn't result in zero, but in a random walk, which can attain any value given enough time. Integrate the random walk one more time, and you get constant-velocity drift. But the main source of error is gravity. Sensor orientation is also only an estimate, but it has to be used to cancel out the constant acceleration from gravity. A small mis-estimate of orientation will result in cancellation failure, which results in spurious accelerations in the horizontal and vertical. Those lead to increasing-velocity drift.
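A back-of-the-envelope example of that gravity leakage (my numbers, purely illustrative): even a 1-degree orientation error leaks a slice of gravity into the horizontal channel, and double integration turns that constant spurious acceleration into quadratic position drift.

```python
import math

g = 9.81                 # m/s^2
tilt_error_deg = 1.0     # assumed orientation mis-estimate

# Horizontal component of gravity that fails to cancel:
spurious_accel = g * math.sin(math.radians(tilt_error_deg))  # ~0.17 m/s^2

# Constant acceleration integrates twice into d = 0.5 * a * t^2:
for t in (1.0, 5.0, 10.0):
    drift = 0.5 * spurious_accel * t * t
    print(f"after {t:4.1f} s: {drift:6.2f} m of drift")
```

After only 10 seconds, a 1-degree error already accounts for several meters of position drift, far more than the sensor's own noise floor.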
+okreylos What I found when simulating random error from multiple sources (via software only) was that averaging the results only slightly slowed the drift acceleration. However, deliberately choosing the result with the most deceleration / least acceleration becomes rock solid with around 5 inputs. I suspect that if the sensors were mounted at different angles, it would be good enough for many mobile tracking purposes. I believe Apple has already acquired an extremely similar patent, though, so I stopped pursuing the idea.
So this guy seems to make it work... ua-cam.com/video/ymuhJ6pt52o/v-deo.html but I don't know if this is also going to drift after a certain time... also, I am not sure if there is maybe a good reason why he is doing this on the roof (GPS, altimeter, etc.). So would this also work indoors?
Short answer: yes, if you have the money. Long answer: unless you work for a government or are Elon Musk, you do not have the money to buy sensors that have insanely low drift, so you'll end up using cheap IMUs, which have huge drift!
Kalman filtering doesn't help? Also, as you can see in the following video, you are actually not right; you can use an IMU for positioning: ua-cam.com/video/6ijArKE8vKU/v-deo.html
+Игровой канал RobosergTV Kalman filtering will probably reduce the amount of drift, but not get rid of the problem to a satisfying degree. Regarding the video you linked: IMUs can be used for positional tracking in some constrained circumstances. In the video, the trick is that the system can detect (via impact) when one of the user's feet hits the ground, and can then assume that that foot does not move until it lifts off again. This allows the integrator to reset the foot's velocity to zero, which eliminates accumulated drift. The system in the video still drifts, but only during the (short) time that one of the user's feet is in the air. This approach does not work for generalized tracking, and specifically not for head tracking.
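[Editor's note] The foot-impact trick described here, resetting velocity to zero whenever the foot is known to be planted, is called a zero-velocity update (ZUPT). A minimal sketch, with made-up numbers and a ground-contact flag assumed to come from impact detection:

```python
def integrate_with_zupt(samples, dt=0.01):
    """Dead-reckon position from (acceleration, foot_on_ground) samples.

    Whenever the foot is detected on the ground, velocity is reset to
    zero (a zero-velocity update), so integration drift can only
    accumulate while the foot is in the air. Sketch only; a real
    system detects ground contact from the impact spike in the data.
    """
    v = x = 0.0
    for a, on_ground in samples:
        if on_ground:
            v = 0.0          # ZUPT: a planted foot cannot move
        else:
            v += a * dt
        x += v * dt
    return x

# Foot planted: even a constant 0.5 m/s^2 sensor bias causes no drift,
# because velocity is clamped to zero the whole time.
planted = [(0.5, True)] * 1000
print(integrate_with_zupt(planted))  # 0.0
```

This also makes clear why the trick does not transfer to head tracking: a head has no periodic, detectable "known to be stationary" moments.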
+RobosergTV ➤ Игровой канал The nice thing about pedestrian tracking is that you have a "stop" point within a fraction of a second. So just as okreylos was resetting his demo's position every few seconds to show a "move to the left" or a "move to the right," a step can be treated the same way. I think of each step as being relatively linked to each preceding step. Linking all the steps together gives you a much more accurate view of the overall process, but this would not be possible without step detection. Ironically enough, I had an unrelated conversation this morning with the author of that video where he backs up okreylos' point that an accelerometer odometer is not currently possible. You can see the discussion in the comments here: electronics.stackexchange.com/questions/156192/accelerometer-double-integration-error?noredirect=1#comment511923_156192
The video you linked says that assumptions are made regarding gait. This could be the distance traveled by the foot during a single gait cycle, which means that there will probably be error in the position value.
Would adding another IMU, separated by a certain distance, help? Since there are two frames of reference, there should be more information to work with. Or is it still impossible?
I may sound completely silly when I ask this, but how come you can't have it assume that close to zero movement EQUALS zero movement? As in, configure a bit of a dead zone, like < 1cm per second = 0m/s. That would never work for good head tracking in VR but it would stop this drift surely.
Ultimately it remains unfit for VR but it is better than nothing at all, isn't it?
The basic property required of positional tracking is that if you start at point X, then move away to point Y, and back to point X, the tracker knows that you're back at the original point.
Now say you implement a dead zone of 1cm/s. You start at X, move quickly one foot to the right, and then slowly back to the left, at less than 1cm/s. So on the return trip your motion is ignored by the dead zone, and even though you're back at X, the tracker thinks you're still one foot to the right. That's the fundamental problem.
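[Editor's note] The failure mode described above is easy to reproduce numerically. A minimal sketch using the exact scenario from the example (a hypothetical 1 cm/s dead zone, a fast move right, a slow return); the sample rate and speeds are illustration values:

```python
def track_with_deadzone(velocities, dt=0.01, deadzone=0.01):
    """Integrate a true velocity signal, ignoring anything slower
    than `deadzone` (1 cm/s here). Illustrative constants only."""
    x = 0.0
    for v in velocities:
        if abs(v) >= deadzone:
            x += v * dt
        # else: motion below the dead zone is treated as standing still
    return x

# Move ~0.3 m right quickly (1 m/s), then return slowly (5 mm/s):
fast_right = [1.0] * 30          # 0.3 s at 1 m/s   -> +0.30 m
slow_left  = [-0.005] * 6000     # 60 s at 5 mm/s   -> -0.30 m in reality
x = track_with_deadzone(fast_right + slow_left)
print(x)  # tracker still believes ~ +0.30 m, though we are back at the start
```

The slow return trip falls entirely below the dead zone, so the tracker permanently loses the 0.30 m, just as the comment describes.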
This man has been planning Half Life Alyx at 304mph for 6 years now.
@@samb7291 prepare for unforseen consequences... well... he did.
You can run a band-pass filter, with the high-frequency noise being cut and the very low-frequency drift being cut as well. Working with high-quality IMUs specifically for position has worked well in the past, and in my experience a good Kalman filter works great along with band-pass filters.
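[Editor's note] For readers curious what such filtering does, here is a minimal first-order high-pass sketch (pure Python, illustrative constants, not a claim about any particular product). It does suppress slow drift, but note it suppresses slow genuine motion equally, since the filter has no way to tell the two apart; that is the crux of the disagreement in this thread.

```python
def highpass(signal, dt=0.01, tau=1.0):
    """First-order high-pass filter (cutoff around 1/(2*pi*tau) Hz).

    Slow trends in the input (such as constant-velocity drift) are
    driven toward zero; fast changes pass through mostly unchanged.
    """
    alpha = tau / (tau + dt)
    out = []
    y = 0.0
    prev = signal[0]
    for x in signal:
        y = alpha * (y + x - prev)  # standard RC high-pass recurrence
        prev = x
        out.append(y)
    return out

# A constant-velocity drift ramp is bounded by the filter instead of
# growing without limit:
ramp = [0.01 * i for i in range(2000)]
filtered = highpass(ramp)
print(ramp[-1], filtered[-1])
```

If the "ramp" had been real slow head motion rather than drift, the filter would have removed it just the same, which is why this alone cannot rescue positional dead reckoning.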
I had been wondering about this for a while, thanks for answering the question! I didn't realise it would drift so quickly :/
Aircraft and submarines have used precise IMUs for decades. The old ones used mechanical gyroscopes and the new ones use ring laser gyros. How can it work for aircraft, but not for this?
The positional drift of a high-quality inertial navigation system as used in ships or aircraft is on the order of 0.6 nautical miles per hour -- according to Wikipedia, so make of that what you will. But 0.6 nm/h is about one foot per second, which is pretty similar to what we're seeing here.
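[Editor's note] The quoted conversion is easy to sanity-check; only standard unit constants are involved:

```python
# Sanity-check the quoted figure: 0.6 nautical miles per hour in ft/s.
NM_IN_METERS = 1852.0   # definition of the nautical mile
M_IN_FEET = 3.28084     # meters to feet

drift_m_per_s = 0.6 * NM_IN_METERS / 3600.0
drift_ft_per_s = drift_m_per_s * M_IN_FEET
print(round(drift_ft_per_s, 2))  # ~1.01 ft/s
```

So "0.6 nm/h is about one foot per second" checks out almost exactly.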
okreylos Good point. 0.6 nm/h isn't too bad for an aircraft, but for something sitting in front of a TV it wouldn't work. Aircraft can "realign" their INU in flight using GPS or ground-based radio navigation aids, which is the equivalent of what you are doing with an external 3D reference.
Fascinating stuff man, keep up the good work.
Did you try subtracting from the mean, to compensate for noise + gravity??
I can see that it's actually working really great. Maybe it just needs some good DSP.
There is also the possibility of training a model to compensate for the drift, by training it using an external reference, like a camera. Which is exactly what these guys have done, and they got really exciting results -
ua-cam.com/video/IroLp5VOPDE/v-deo.html
(I've also been trying to do this for a long time)
I have a couple questions here as I notice the drift error appears to follow a specific linear trajectory.
First, did you ensure the accelerometer was calibrated before you began using it? These units are not always calibrated in manufacture, that is, the physical alignment of the sensor axes at the lowest level, both relative to each other (orthogonality) and to the housing (the gravitational vector lying perfectly on the -z axis when the unit is sitting on a level surface). Without having done this, you would expect a constant source of error that is not the result of noise, yes?
Also, how have you set up the position calculation: does it integrate the total historical signal, or is it iterative over a very small recent window? In any noisy system, windup will occur if all of the past samples are being integrated twice. You will still have windup error, but it should be minimized relative to the sample window size?
I know that, in theory, dead reckoning alone with an IMU will always have errors without an external reference, but as you say, you spent little on this. Could the results not be considerably improved with some effort?
I am sitting down to start implementing this myself currently. One idea I am toying with is to set a trigger level near the noise threshold, so that zero velocity is maintained for readings that are near zero. This would introduce discrete errors at starts and stops, or for extremely slow and sustained motion. But it is an improvement overall, and likely suitable for many applications? Furthermore, retroactive extrapolation may also be used to 'fill in' the threshold dead zone, using the velocity upon crossing the threshold boundary...
Also, another idea I have: to create a frame of reference based on the assumption that the device never leaves a particular workspace. For example, a user sitting in a chair. Any arbitrary 'global' reference point could be set, and the entire derived trajectory could be transformed to never stray too far from the origin.
Somewhat of a crude KF process model making use of some known assumptions about the application, similar to using footfalls during walking to set zeroed points. For an application using a VR headset that involves the user sitting, you at least know the user's butt will never move. Therefore the entire trajectory should lie within some general area. Any detected drifting toward the boundaries could be evaluated, and perhaps the magnitude could be distributed and subtracted across the whole historical signal? Sort of a 'counter-drift' to drive back toward zero.
Approaches like this could never work on a boat or a plane, and to a large degree not in a car (there is no environmental current, and the wheels cannot traverse laterally in this case), but I would think that in many human applications, human behavior may be to some degree predictable, varying from application to application of course.
Also, as another person mentioned, two IMUs where the fixed distance between them is known could be used to compensate between units. I would think that should be a de facto standard in the construction of any device like a VR headset; they are small and cheap enough to warrant the extra hardware, given the potential improvements in application?
Thanks for the demonstration and explanation!
But isn't the problem still "just" a practical problem of increasing precision? Obviously, with a threshold value for velocity, and maybe filtering out movements that pass along a straight line (~= drift), you could remove this drift problem. Then the only problem remaining would be precision. If you increase precision / update speed enough (maybe by a factor of 10, maybe by just combining more IMU chips in different orientations), then it could be usable. You would still get the same errors, of course, but they could be small enough not to affect the experience.
If you have n IMUs (n being something large like 10 or 20), would their drift almost cancel out, or would they drift in the same direction? I have a feeling they would cancel out, as this is random error and not systematic error.
I don't think so. I wrote code with a sampling rate of 50 samples every 10 milliseconds and it still drifts off.
@@danteregianifreitas6461 50 separate IMUs, or the same one sampled 50 times every 10 ms? The moment-to-moment drift of a specific IMU seems to be consistent-ish, but it does seem plausible that many separate IMUs could average out to a usable level.
Well, even then, with more IMUs, and the relative positions of them, you could maybe try some other math tricks, given that you have more information.
Could you go more into depth about how you created the orientation tracker, or link me to some resources? I understand that you integrate the gyroscope readings and combine them with the down vector from the accelerometer and the readings from the magnetometer, but did you use a specific algorithm such as the Madgwick filter or the Kalman filter? If not do you know where I could find resources on the specific formula you used? I'm trying to implement a similar tracker with my own IMU. I know this video is a bit old, but I hope you still see this
The orientation tracking algorithm in my software has been changing over the years, and I don't remember exactly what it was when I recorded this ad-hoc video.
The current version is an improved Madgwick filter (I simplified the way he integrated magnetometer readings), but I think the video is still using the original version. Either way is a good starting point. Orientation-only tracking or positional dead reckoning (which doesn't work, as shown here) does not need the full power of the Kalman filter. That only comes into play when you want to fuse IMU predictions and corrections from an absolute 3-DOF or 6-DOF tracking system: ua-cam.com/video/-nsylEpgVek/v-deo.html
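[Editor's note] As a hedged illustration of the general gyro-plus-accelerometer fusion idea mentioned here, below is a basic single-axis complementary filter. This is not Madgwick's actual algorithm or okreylos's code; the gain, axes, and numbers are all illustrative assumptions, but the structure (gyro supplies fast dynamics, gravity direction slowly corrects drift) is the same idea.

```python
import math

def complementary_pitch(gyro_rates, accels, dt=0.01, k=0.02):
    """Fuse a gyro pitch rate (rad/s) with accelerometer (ax, az) tilt.

    The gyro integral supplies the fast dynamics; the accelerometer's
    gravity direction slowly pulls the estimate back, bounding gyro
    drift. Gain k and axis conventions are made-up illustration values.
    """
    pitch = 0.0
    for w, (ax, az) in zip(gyro_rates, accels):
        pitch += w * dt                            # integrate gyro rate
        accel_pitch = math.atan2(ax, az)           # tilt seen by gravity
        pitch = (1 - k) * pitch + k * accel_pitch  # slow correction
    return pitch

# A biased gyro (0.05 rad/s) on a motionless IMU: the estimate stays
# bounded because gravity keeps reporting "pitch = 0", whereas naive
# integration would reach 0.05 * 50 s = 2.5 rad over these samples.
n = 5000
est = complementary_pitch([0.05] * n, [(0.0, 1.0)] * n)
print(est)
```

This also shows why orientation tracking works where position tracking fails: gravity provides a permanent absolute reference for tilt, but there is no equivalent reference for position.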
@@okreylos Ah okay. That linked video was super informative, thank you so much. So I have been trying to implement a madgwick filter and it's approximately correct, but I get a large amount of drift, like a full rotation leads to about 20 degrees of heading drift. I am using a cheap arduino that comes with an IMU sensor (LSM9DS1), do you think there is an error in my implementation, or could it be that I need a better sensor? Thanks so much
@@ColeLashley I don't know the specs for that sensor, but 20° error per full rotation is too much not to be a bug.
Thank you for this clear, brilliant explanation.
had been wondering about this, thanks for the explanation
The noise seems to be accumulating in one direction. If you could compensate for it, maybe it could work. You can calibrate the compensation by measuring the noise at the static position.
The drift looks like a constant acceleration as in gravity, and the real motion like a variable acceleration. Like gravity the two motions seem to just add together.
Think of it as compounding reverb.
I want to make an indoor positioning program for an autonomous robot (on a 2D map, as if you were seeing it from above). Do you think it'll have the same kinds of problems?
I want to develop a motion capture system with an IMU. I want to know which algorithm is better for sensor fusion: AHRS or EKF? One more thing: I also want to know what software is required for displaying the results on a PC or laptop. Is there any specific software, and which software did you use for displaying the results?
Thank you so much
Awesome results, man. Thanks for saving my time and many other people's time too. Sometimes we need to prove that things do not work rather than prove that they do. Thumbs up.
Abubaker Mahmoud u full o shit
Is it possible to correct the drift from the IMU by setting an external magnetic sensor as the reference point (zero) of the magnetometer in the IMU. For example, tracking the position of the wrist, by having the magnetic field next to the magnetometer of the IMU, the zero or reference point will always stay at the position of the wrist.
Excuse my ignorance but would this not work if you wrote a starting threshold value that is higher than the drift value? So the drift is below the threshold value and therefore would not register as a movement but when really moving your head you would be above the threshold and therefore the movement would be registered?
It would work, but one of the reasons for positional tracking is to prevent motion sickness, and therefore tracking small movements like swaying back and forth slightly is key. So it wouldn't be practical unless you had a high motion sickness tolerance.
Due to integration, very small accelerations add up to large displacements over time. If you threshold away the small ones, you create large offsets in position after a while.
***** mhm..even your 5 senses have those same issues
***** This is noise that is inherent to integrating any variable to obtain a result value. Because you are not using fixed coordinates to compare (i.e. Fixed Camera) you are only measuring ACCELERATION in any given direction. So ACCELERATION needs to be integrated twice to get POSITION and hence the conundrum.
Edit: It needs to be noted that there are no possible means, as of today or dare I say in the future, to directly measure POSITION or VELOCITY without reference frames. You can blame Einstein for that. lol.
I think you are talking about "measurement noise" which is also inherent to any measurement. This noise however is not the problem we are facing here. They have that down to a pretty good state by using higher frequency sampling and interpolation.
A few things: the rotation doesn't drift because you can always tell which way is down using gravity, and the magnetometer helps too. As for linear drift, you always seem to drift to the right, indicating that something isn't calibrated properly. When properly calibrated, you drift in random directions.
Could be simple roundoff error, which would drift in a consistent direction, by random amounts.
If you have a phone camera that can track positions but slowly and/or imprecisely, can this be used to make it smoother?
Yes, see this video for how and why that would work: ua-cam.com/video/-nsylEpgVek/v-deo.html
@@okreylos he used the GPS :|
How come one of the videos on Madgwick's channel, using his sensor fusion algorithm, shows positional tracking using just one IMU? Am I missing something?
Yes, you are missing something. But without knowing to which video you are referring, I can't tell you what that is.
Okay I'm late to the party on a 1 year old comment that's comparing this 5 year old video with a mention of a 10 year old video... but:
* SebMadgwick is assuming that the footsteps he's recording cover a uniform distance. So the graphs are probably incorrect.
* Also, he probably has drift in his sensors as well, but the drift is probably negligible in the visualization, or calibrated out.
* It looks like the footsteps are assumed to be forward along the Y axis.
Basically, he created a pedometer with a gyro attached for direction, as far as I can tell. It's still impressive, but it won't be accurate enough for the real world.
What about two IMU sensors: is there a possibility to measure the distance between those two sensors? That could be useful for motion tracking. If you could get the positions of all IMUs relative to one IMU (for example, one at the foot), you would still have drift, but the motion would be correct.
Only if the two IMUs are connected by a rigid body, or a linkage of rigid bodies with one IMU per link. That way you can deduce spatial relationships from the IMUs' orientations. That's why IMU-based skeletal tracking can estimate the spatial relationships of limbs, but not the position of the entire body relative to the ground.
okreylos So, are you saying that if you had a headset with an accelerometer and gyroscope on either side, you could get the amount the user moves right, but the results would only be relative to where they were when the program started reading the data?
***** No, that doesn't work. What I mean is that if you place an IMU on your upper arm, and one on your lower arm, then you can track the position of your wrist relative to your shoulder.
okreylos Ah... Thanks for clearing that up.
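[Editor's note] The upper-arm / lower-arm example above can be sketched with simple planar forward kinematics. Segment lengths and angle conventions are made-up illustration values; the point is that chaining rigid segments turns two IMU *orientations* into a relative *position*, with no acceleration integration at all.

```python
import math

def wrist_position(upper_angle, lower_angle, upper_len=0.3, lower_len=0.25):
    """2D forward kinematics: wrist position relative to the shoulder.

    Each IMU reports its segment's absolute orientation (radians, in
    a shared world frame). Because the segments are rigid and linked
    at the elbow, summing the segment vectors yields the wrist's
    position relative to the shoulder, drift-free.
    """
    ex = upper_len * math.cos(upper_angle) + lower_len * math.cos(lower_angle)
    ey = upper_len * math.sin(upper_angle) + lower_len * math.sin(lower_angle)
    return ex, ey

# Arm hanging straight down (both segments at -90 degrees):
x, y = wrist_position(-math.pi / 2, -math.pi / 2)
print(round(x, 6), round(y, 6))  # ~0.0, -0.55
```

Note what is missing: nothing anchors the shoulder itself to the world, which is exactly why IMU skeletons track limb poses but not the body's position in the room.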
Do you get any reduction in the drift if you combine the measurements of more than one IMU connected by a rigid body, or would there be just as many jitters getting combined as there would be canceling out on average? If that does reduce drift, how many would you need to combine to get less than a millimeter of drift within 1 hour?
I used Runge-Kutta and it doesn't drift. It only moves a little bit sometimes, but comes back again. However, I have problems with roll/pitch/yaw, but it might just be my problem because I'm pretty novice. I used the MPU-9250 breakout board from SparkFun.
Hello, why did you choose Runge-Kutta? Isn't that for solving differential equations?
@@hocuspocus1126 6 years later, I have no idea what Runge-Kutta is or why I chose it.
Looks like an oldie, so I'm just curious whether you fixed it, or are still looking.
For the latter, my 2 cents:
The position cannot be determined with an accelerometer because uniform motion cannot be detected.
A submarine has means to determine its speed, so it can blindly calculate the position change from the last known fix; they mostly go straight.
In your case, you have no means to detect constant speed; you can only detect acceleration, from which you may be able to calculate distance traveled, but when acceleration drops down to the noise level you can't tell the drift apart.
Best would be to accept that a gaming headset doesn't move far around the room, and only calculate the attitude using the accelerometers; forget drift.
You need to somehow get the device to recognize some CONTROL POINTS in 3d space to LOCK or constrain to work within.
I don't know anything about anything, but couldn't a built-in compass deliver a fixed-position vector (even if it happens to point towards some overpowering stationary magnetic object than the geographic pole) which the others could update and calibrate against?
I wonder if you can use 2 IMU's to correct the position tracking of each other
Thank You Very much for this video. I think I have a firm grasp of the problem now.
Alas the conclusion is, "problem not solvable without an external reference frame".
Bummer :(
There goes the stand alone VR glasses dream. You can always integrate cameras to the Rift itself to detect external pointers, but that's very impractical at the moment.
And a decade later, it's becoming the true answer to everything.
Can you help with a project that requires two 9-axis IMUs?
What if you added information from just one camera (most people have a webcam) tracking the user, could you get enough information for this to be useful. Or do you need a better method for this to be useful at all?
Yes, single-camera tracking can be done. If you do camera tracking, merging in data from the IMU allows you to reduce the total latency of the tracker. It's what Oculus are doing with the DK2.
okreylos
Great video, the guys at AntVR should have watched it before claiming full positional tracking :) I just posted it on MTBS3D. I understand AntVR are already shipping. Let's see what they deliver...
I'd say Oculus should put a camera on the ceiling to have full 360 degree tracking, or at least have that as an option. The current implementation is only good for "Wheelchair VR" :)
konstantin lozev I made this video in response to AntVR's claims (which are of course completely bogus). What they call "one-step positional tracking," I call "virtual emetics."
This is a excellent analysis. Have you thought about non-obvious ways to overcome this? Example: Is there a "virtual" external reference frame that could be used?
So I understand that absolute position tracking is impossible with an IMU. Can we use pedestrian walking for sensor orientation?
@okreylos What kind of sensor fusion algorithms are you using to obtain orientation? Or is that mostly done by the Oculus Rift's SDK?
+Jason Leung This isn't using Oculus' SDK at all. It's using a sensor fusion algorithm I made up. I've recently done a lot more research into this, and it turns out that my algorithm is on the same level as Madgwick's well-known fusion algorithm quality-wise, but requires about twice as many ALU operations per iteration. I've ditched it in favor of an improved version of Madgwick's algorithm since I made this video.
How do you get an XYZ position from the output of a 9-DOF IMU?
What if we can correct the drift error after some time?
If you have an external system that can measure the IMU's absolute position in space at regular intervals, then you can correct drift. That's precisely how HTC Vive's and Oculus Rift's tracking systems work, using lasers and cameras, respectively.
Helpful illustration, Thanks! :)
So can you do a video explaining Positional tracking and the use of the Oculus Quest? Do you still think this video stands?
I kinda did: ua-cam.com/video/-nsylEpgVek/v-deo.html
Cool vid. Is there an approximate scale for the arrows on the screen?
But what sensor is used for HTC Vive controllers, like tanis ...?
Hi, I'm wondering if you can share your source code for pure imu-based positional tracking, though it will drift? Thanks in advance.
Hey, I have a question that I figured I would ask you if you are available, since you gave me such a great answer to another question that I had.
So I have developed a 3DOF positional tracking algorithm for my project using the Madgwick filter, and it works pretty nicely, with very little drift, but I have an issue where the positions of things in my 3D simulation depends on the position of the imu when the algorithm starts running, and I want to make it so that everything is in the same place every time, so that things will always be in the same direction, ie things in the virtual world will always line up with the same places in the real world.
I'm pretty new to 3D math and I'm not super comfortable with quaternions, so I convert the quaternion values into pitch, yaw, and roll before passing them from my sensor into my 3D simulation and applying them to my camera. I've heard that pitch/yaw/roll is generally expressed as a rate of change rather than an absolute orientation, and I thought that if I used the quaternion directly rather than converting to Euler angles, I could get an absolute orientation that wouldn't depend on the starting position. But after doing so, my position tracking is way off. I think there's probably an error somewhere, in terms of unit conversion or the coordinate planes being backwards, that I can fix by doing some research on quaternions and 3D rotation conventions, but I'm not sure if I'm on the right track and whether using quaternions instead of Euler angles will accomplish what I want.
Do you have any tips, suggestions or resources on how to accomplish this? Thanks!
Gyro and accelerometer are going to give you your position in the universe, but the Earth is spinning and rotating around the Sun, the Sun spins within the galaxy, and the galaxy moves around (even if we don't feel it, we are moving very fast in all kinds of directions). If you don't compensate for those displacements, of course the readings will drift. Did you try placing an IMU at a fixed reference location (like your desk) and subtracting its apparent drift from your moving IMU?
How many samples would be needed to get an accurate reading to compensate for drift?
On an ideal gyro, the problem is not the gyro drifting but the Earth drifting in angle and location in the universe while the gyro reference remains fixed. This can be compensated either by computing your absolute position in the universe (pretty easy for the Earth's rotation on itself and around the Sun, more difficult for the position within the Milky Way and the Milky Way within the universe), or by having a reference gyro at a fixed location and subtracting its apparent "drift" from your moving gyro. But I'll agree with you: the cheap gyros are far from ideal.
So, even with sensor fusion using an accelerometer, gyroscope, and magnetometer, it still wouldn't be enough?
What if we wanted to track an iPhone's position and we were using the camera sensor? Would tracking pixels do any good, with even more sensor fusion to compensate for drift and get accurate displacement from a set origin?
Are the movements around the Sun even that significant? You have a rotation rate of ~1°/day, and the acceleration shouldn't even be measurable, I think. The next bigger movements (e.g. within the Milky Way) should be even less significant. For most applications it should suffice to consider the rotation of the Earth itself. Everything else is less than the drift of even the best sensors today, or at least of these MEMS sensors. Also, you can simulate these sensors with all their noise effects in a perfectly stationary environment, and they will still drift.
Great video, thanks for the info.
It's 2021, and I'm wondering whether IMU-based positional tracking is still a no-go!?
Yes.
Get a decent IMU and add either a Kalman filter or a band-pass filter, and it is just fine.
What integration technique did you use?
+Nathan Imig I tried Euler and Verlet integration. Both gave very similar results. The fast drifting problem is not due to integration error build-up, but due to errors in estimating the direction of gravity based on noisy sensor readings.
+okreylos So I don't know how your reference frame is stable but your gravity vector is noisy; that shouldn't be. I assume you're using Euler angles, filtered from the fusion, to get the orientation of the gravity vector, but then you used raw acceleration to subtract the gravity vector out?
+okreylos If you're not filtering the accelerometer somehow before removing gravity, you're going to just reintroduce the noise again (the same noise that was taken away with the Euler angles by fusing the angles from the accelerometer and the angles from the gyro). If you could isolate the noise from the accelerometer, it may be possible to run it through a type of low-pass filter to remove more noise.
+okreylos Oh, and I'm not sure you can really say the integration technique isn't introducing a reasonable amount of error. It will always introduce error, especially on impulsive movements.
+okreylos If it were true noise, it would not always lag the distances, i.e. draw a circular path and see if it always spirals inward, always spirals outward, spirals both inward and outward, or stays circular (this would be the easiest method).
How do I get position from roll/pitch/yaw? I use a GY-951 AHRS.
You'll need to integrate linear acceleration twice to get position.
Thank you, Oskar, for this great video. Even though I'm disappointed, you saved me a lot of hours hassling around with an IMU :). I would like to track multiple drones cheaply. How come these guys get such good results? 3D Tracking with IMU
Since their IMU is foot-mounted, they can detect the impact when the foot hits the ground, and reset all velocities to zero at that point. That way drift can't build up like it does here. They only need to semi-reliably track displacement during the gait phase when the foot is off the ground, and that's pretty short during sustained walking.
I see, that's smart. Thanks for the advice!
What about this video? It only uses IMU with no external references. Same people btw:
ua-cam.com/video/SI1w9uaBw6Q/v-deo.html
This is so informative, thanks!
Also check this one: ua-cam.com/video/-nsylEpgVek/v-deo.html , it explains the theory and how practical IMU-based tracking works.
@@okreylos Excellent, thanks for sharing.
So the problem is noise combined with numerical inaccuracies?
This is not a problem that can be solved, not without an external reference frame. This is a dead end. End of story, I'm afraid.
If you want to understand why exactly, you should have a look at my previous comment. But this video did a pretty good job of explaining what's going on.
Can you send me the code that you used for the demo?
What sensor do you use? What's the method?
So...
What about the Perception Neuron?
Thecoolestnerdguy Read my article on PrioVR (doc-ok.org/?p=1003), Perception Neuron works the same way.
Oh, thanks! ;)
Could you tell us how you coded this 😁
Try Verlet Integration. You never need to do double integration
You mean x(n+1) = 2*x(n) - x(n-1) + a(n)*dt^2?
The a(n)*dt^2 term is the result of double integration: v = Int a dt; x = Int v dt. Notice how the factor after a(n) is dt^2, not just dt.
Correct, but dt^2 can be computed only once, since dt is assumed to be constant. Much simpler and faster to compute: you can just precompute it as dt^2 = dt*dt.
I guess I did not clearly spell out my point. The dt^2 indicates that Verlet integration is, in fact, double integration. Whether dt^2 can be pre-computed or not is irrelevant. The main reason for drift is that the acceleration function a(t) is affected by sensor noise (and orientation mis-estimation, but let's ignore that for a second).
If you integrate only once, as when getting orientation from angular velocity measured by the gyroscopes, sensor noise becomes a random walk, which means drift accumulates linearly with time (twice the time span, twice the drift).
Acceleration, on the other hand, is integrated twice (doesn't matter if via Verlet integration or Euler integration or Runge-Kutta integration), which means noise first becomes a random walk, and then an integrated random walk, which accumulates quadratically with time, i.e., twice the time span, four times the drift. That's the issue.
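[Editor's note] The point that the integration scheme doesn't matter can be checked directly: semi-implicit Euler and the Verlet recurrence quoted above produce the same trajectory from the same noisy input. A minimal sketch with illustrative constants:

```python
import random

def euler(accels, dt=0.01):
    """Semi-implicit Euler: integrate acceleration to velocity, then
    velocity to position."""
    v = x = 0.0
    for a in accels:
        v += a * dt
        x += v * dt
    return x

def verlet(accels, dt=0.01):
    """Verlet: x(n+1) = 2*x(n) - x(n-1) + a(n)*dt^2, starting at rest."""
    x_prev, x = 0.0, 0.0
    for a in accels:
        x_prev, x = x, 2 * x - x_prev + a * dt * dt
    return x

rng = random.Random(0)
noise = [rng.gauss(0.0, 0.1) for _ in range(10000)]
# Both schemes are double integrations of the same signal, so both
# accumulate the same drift (identical up to float rounding):
print(euler(noise), verlet(noise))
```

Algebraically, the two recurrences are equivalent (substitute v into the Euler position update and you recover the Verlet formula), which is why swapping integrators cannot fix drift caused by sensor noise and gravity mis-cancellation.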
okreylos , when are you gonna make a multiplayer version of your three kinects video?
blaxalb Yeah, I got delayed. But there is a glimpse of the multi-person system in the recent video published by The Verge, t.co/URcEP8llIv
Yeah, I just saw that today. I'm curious why I haven't found videos of other people using your software for that. I also found a video today doing something similar, on the channel Convrge Team. Exciting stuff! :D
blaxalb I don't currently have good tutorials on how to set up a good multi-site capture and communication system. That's probably the main reason why there aren't any takers. Convrge are doing very similar things to me, but I'm not sure if they rolled their own software, or are using mine for the backend. Looks exactly like mine, but that doesn't mean anything.
okreylos Makes sense. BTW, I found the Convrge video through a comment on a recent Reddit post by rogeressig reposting your 3-Kinect video. The comment said they're planning to release a download in 2 months, so it seems like they're working on a version; I don't know how much they may have gotten from you, though :)
Is this still not possible?
Do you provide source code?
Whatever you say. Imma make a game in unity that uses the BARE MINIMUM controls.
Now can AntVR, with what seems like fake positional tracking, just go away... unless I'm wrong, it just seems like a broken bootleg.
You wrote in your description: "When tracking position based on accelerometer data using dead reckoning, drift accumulates quadratically, meaning that the speed of drift increases proportionally to time passed."
"Well, an accelerometer doesn't drift"
And that's not what I said. What I said is that the difference between actual position and estimated position grows quadratically over time, due to double integration of a noisy signal.
"Did you apply an offset in order to compensate for drift?"
But you said accelerometers don't drift? I know what you mean. Yes, of course there is offset compensation, established during an initial calibration phase described here: doc-ok.org/?p=639
"filtering the accelerometer-values with a low-pass filter might help"
That is not an appropriate approach for position dead reckoning.
"Are you aware that tilting the platform will distort the accelerometer reading (because of the earth's gravity)?"
Yes, that is exactly the main reason why this does not work.
Thanks for your answer. However, accelerometer data is in most cases filtered inside the IMU, and such filters are usually programmable.
Could you somehow fix the problem shown in your video?
I wonder how the distortion of the accelerometer-reading caused by the earth's gravity could be removed. Is this caused by the misalignment of the accelerometer-die due to manufacturing tolerances?
I have read that very expensive MEMS IMUs are much better suited for dead reckoning (price around $3000). This one might just work perfectly for dead reckoning: www.analog.com/media/en/technical-documentation/data-sheets/ADIS16490.pdf They are called "tactical grade" IMUs.
The problem is that accelerometers measure gravity in local IMU space, so in order to remove gravity, you need to know the IMU's instantaneous orientation. But IMU orientation itself is only an estimate, reconstructed from gyroscope measurements and drift-corrected by the accelerometers.
I am curious how much difference there would be between the IMUs used in current VR devices, and "tactical-grade" IMUs. The latter might buy you some more time before drift gets too large, but they don't fix the fundamental issue.
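To put rough numbers on that gravity-cancellation failure (pure back-of-the-envelope arithmetic, no real IMU data):

```python
import math

G = 9.81  # standard gravity, m/s^2

def residual_accel(tilt_error_deg):
    """Spurious horizontal acceleration left over when gravity is
    subtracted using an orientation estimate off by this angle."""
    return G * math.sin(math.radians(tilt_error_deg))

def position_error(tilt_error_deg, seconds):
    """Doubly integrated: 0.5 * a * t^2."""
    return 0.5 * residual_accel(tilt_error_deg) * seconds ** 2

# A mere 1-degree orientation error leaves ~0.17 m/s^2 of phantom
# acceleration, i.e. ~8.5 cm of position error after one second
# and roughly 2 m after five seconds.
print(position_error(1.0, 1.0))
print(position_error(1.0, 5.0))
```

This is why gravity dominates over raw accelerometer noise: even an excellent orientation estimate leaks a fraction of 9.81 m/s² into the position integrator.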
So I have a question. It sounds like pure IMU-based positional tracking does work, but the rapid drift makes it unusable. Couldn't you compensate for the drift by adding an extremely basic, low-CPU-load Computer Vision based tracker? Just the bare-minimum quality to undo the drift? John Carmack seems to be convinced that inside-out tracking is an extremely difficult problem that will take years to figure out, if it's ever figured out. But maybe he's trying to do it purely with Computer Vision? Maybe IMU + very basic CV to correct drift is the right solution?
That's exactly how Vive's Lighthouse and Oculus' Constellation, and Microsoft Hololens' inside-out tracking system work: IMU with dead reckoning for low-latency pose estimation, and an optical system for drift control.
Getting a vision-based tracker to the accuracy and robustness required for VR tracking is very difficult, even when using IMU tracking to help.
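A toy sketch of that division of labor (all rates and gains here are made-up numbers, not what any shipping tracker uses): the IMU integrates every sample for low latency, while occasional optical position fixes pull the estimate back toward truth and keep the drift bounded.

```python
IMU_DT = 0.001     # 1 kHz IMU samples (assumed)
ALPHA = 0.05       # blend factor toward each optical fix (made up)
FIX_PERIOD = 16    # optical fix every 16 IMU samples, ~60 Hz (assumed)

def fuse(accels, optical_pos):
    """accels: per-sample 1-D accelerations; optical_pos(i) returns a
    measured position at sample i, or None when no fix is available."""
    vel = pos = 0.0
    trajectory = []
    for i, a in enumerate(accels):
        vel += a * IMU_DT   # dead reckoning between fixes
        pos += vel * IMU_DT
        fix = optical_pos(i)
        if fix is not None:
            err = fix - pos
            pos += ALPHA * err                          # nudge position toward fix
            vel += ALPHA * err / (FIX_PERIOD * IMU_DT)  # bleed off velocity error
        trajectory.append(pos)
    return trajectory

# A stationary headset with a biased accelerometer: uncorrected, it
# drifts ~0.25 m in 5 s; with ~60 Hz fixes the error stays tiny.
biased = [0.02] * 5000
drifted = fuse(biased, lambda i: None)
corrected = fuse(biased, lambda i: 0.0 if i % FIX_PERIOD == 0 else None)
```

The key point is that the optical fixes don't need to be fast, only absolute: they anchor the integrator so its error can no longer grow without bound.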
If you combined Oculus Rift (or even just a regular HMD) with Xbox One's Kinect 2 (or Kinect For Windows v2) it would be beyond awesome!
I did combine the Kinect v1 with some old augmented reality glasses (Vuzix 920AR) and got nice results; you can check the video on my channel :) (Just keep in mind that the IMU in the glasses got damaged before this experiment, so I had to use the IMU in my phone and send its data to my laptop over WiFi. In the previous demo the glasses' IMU was working, but I wasn't using the Kinect.)
Double integration will not work unfortunately. I also found out the hard way by spending a lot of time on it.
Turns out it's less double integration and more gravity that's at fault. See this related video: ua-cam.com/video/-nsylEpgVek/v-deo.html If you did it in free fall or outer space, it wouldn't be too bad.
It's drifting harder than my drift while playing osu!
there we go
Hi okreylos, are you interested in working as a freelancer on an IMU positioning project? We are struggling with this issue and are looking for experienced candidates to solve it.
Please email me at kreylos@cs.ucdavis.edu
@@okreylos Check your email.
It's a lot better nowadays with more modern units.
So AntVR, stop thinking you can do positional tracking.
@okreylos would love to hear your thoughts on a multiple IMU solution: www.mtbs3d.com/phpBB/viewtopic.php?f=120&t=21592
+boggers I don't believe it until there's a physical implementation, shown to work. The main source of drift is not random accelerometer error, but mis-estimation of gravity due to quick rotations or gyro drift. I'd guess that the guy's multi-IMU simulation does not take that into account, and therefore does not reflect reality.
+okreylos Thanks very much for your input. (I'm that guy btw) You're right, I definitely need to build it to see if the error from real sensors can be squashed as easily, I expect it will be harder, but possible. Talking to some people now about building a prototype.
+okreylos Why do you say that the main source of drift is not random accelerometer error? If it were, it shouldn't happen if you simply put the Rift on a plane and move it along that, but this happens even if the headset is staying put. Or do those gyro errors happen randomly even when stable and get bunched up in the... uh, Kalman filter, I guess?
Also, it just got me thinking: if it were about truly random accelerometer errors, they should be symmetrically distributed, thus making errors cancel out over a long enough time frame... or at least not compound to infinity. It seems that is not happening. Anyway, trickier than it looks. Thanks for the video.
Vlad Radulescu If accelerometer error were normally distributed zero-mean noise, it would not cancel out. Accelerometer values are (doubly) integrated over time. Integrating normal random noise doesn't result in zero, but in a random walk, which can attain any value given enough time. Integrate the random walk one more time, and you get constant-velocity drift.
But the main source of error is gravity. Sensor orientation is also only an estimate, but it has to be used to cancel out the constant acceleration from gravity. A small mis-estimate of orientation will result in cancellation failure, which results in spurious accelerations in the horizontal and vertical. Those will lead to increasing-velocity drift.
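The random-walk claim is easy to check numerically (a toy simulation with unit-variance noise steps):

```python
import random
import statistics

random.seed(1)

def random_walk_end(steps, sigma=1.0):
    """Integrate zero-mean Gaussian noise: the endpoint is itself
    Gaussian with standard deviation sigma * sqrt(steps), not zero."""
    return sum(random.gauss(0.0, sigma) for _ in range(steps))

# The average endpoint over many trials is near zero, but the typical
# *distance* from zero grows like sqrt(steps) -- the noise does not
# cancel out.
ends = [random_walk_end(1000) for _ in range(200)]
print(statistics.mean(ends))    # small compared to the spread
print(statistics.pstdev(ends))  # near sqrt(1000) ~ 31.6
```

So zero-mean noise only guarantees the *expected* error is zero; any individual run wanders arbitrarily far, and integrating a second time turns that wander into velocity drift.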
+okreylos What I found when simulating random error from multiple sources (via software only) was that averaging results only slightly slowed the drift acceleration. However, deliberately choosing the result with the most deceleration / least acceleration becomes rock solid with around 5 inputs. I suspect that if sensors were mounted at different angles, it would be good enough for many mobile tracking purposes. I believe Apple has already acquired an extremely similar patent though, so I stopped pursuing the idea.
So this guy seems to make it work...
ua-cam.com/video/ymuhJ6pt52o/v-deo.html
But I don't know if this will also drift after a certain time... and I am not sure if there is maybe a good reason why he is doing this on the roof (GPS, altimeter, etc.). So would this also work indoors?
The title of the video says that the unit is using GPS so there is a position value that can be used as a base for compensating drift.
0:29 0:30
Short answer: yes, if you have the money.
Long answer: unless you work for the government or are Elon Musk, you do not have the money to buy sensors that have extremely low drift,
so you'll end up using cheap IMUs, which have huge drift!
Kalman filtering doesn't help? Also, as you can see in the following video, you are actually not right; you can use an IMU for positioning:
ua-cam.com/video/6ijArKE8vKU/v-deo.html
+Игровой канал RobosergTV Kalman filtering will probably reduce the amount of drift, but not get rid of the problem to a satisfying degree.
Regarding the video you linked: IMUs can be used for positional tracking in some constrained circumstances. In the video, the trick is that the system can detect (via impact) when one of the user's feet hits the ground, and can then assume that that foot does not move until it lifts off again. This allows the integrator to reset the foot's velocity to zero, which eliminates accumulated drift.
The system in the video still drifts, but only during the (short) time that one of the user's feet is in the air.
This approach does not work for generalized tracking, and specifically not for head tracking.
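The zero-velocity-update ("ZUPT") trick described above can be sketched in 1-D (the sample rate and threshold are made up, and gravity-free world-frame accelerations are assumed to be already available):

```python
DT = 0.01               # 100 Hz samples (assumed)
STANCE_THRESHOLD = 0.1  # m/s^2: below this, declare the foot planted (made up)

def track_foot(accels):
    """1-D dead reckoning with zero-velocity updates during stance."""
    vel = pos = 0.0
    trajectory = []
    for a in accels:
        if abs(a) < STANCE_THRESHOLD:
            vel = 0.0       # ZUPT: foot is on the ground, discard drift
        else:
            vel += a * DT
        pos += vel * DT
        trajectory.append(pos)
    return trajectory

# One 'step': accelerate forward, decelerate, then stance. Velocity
# error accumulated during the swing is thrown away the moment the
# foot is detected on the ground.
step = [1.0] * 50 + [-1.0] * 50 + [0.0] * 100
path = track_foot(step)
```

Note this is exactly the dead-zone idea from the top of the thread, but made safe: the reset only fires when an independent physical event (foot impact) guarantees the true velocity is zero, which is why it works for feet and not for heads.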
+RobosergTV ➤ Игровой канал The nice thing about pedestrian tracking is that you have a "stop" point within a fraction of a second. So just as okreylos was resetting his demo's position every few seconds to show a "move to the left" or a "move to the right", a step can be treated the same way. I think of each step as being relatively linked to each preceding step. Linking all the steps together gives you a much more accurate view of the overall process, but this would not be possible if it were not for step detection. Ironically enough, I had an unrelated conversation this morning with the author of that video where he backs up okreylos' point that an accelerometer odometer is not currently possible. You can see the discussion in the comments here: electronics.stackexchange.com/questions/156192/accelerometer-double-integration-error?noredirect=1#comment511923_156192
The video you linked says that assumptions are made regarding gait. This could be the distance traveled by the foot during a single gait cycle, which means there will probably be some error in the position value.
How would you Kalman filter an acceleration value?
The same way you filter everything else with a Kalman filter. Also, you are usually interested in speed and position rather than the acceleration itself.
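For the curious, here is a minimal 1-D sketch (all noise values are made up): the acceleration isn't filtered on its own — it drives the predict step of a position/velocity state, and an external position measurement corrects the state in the update step.

```python
DT = 0.01       # 100 Hz (assumed)
Q_ACCEL = 0.1   # process (acceleration) noise variance (made up)
R_POS = 0.01    # position measurement noise variance (made up)

class Kalman1D:
    """Position/velocity state; acceleration as control input."""
    def __init__(self):
        self.pos = 0.0
        self.vel = 0.0
        # covariance matrix [[p00, p01], [p01, p11]]
        self.p00 = self.p11 = 1.0
        self.p01 = 0.0

    def predict(self, accel):
        # state transition: constant-acceleration kinematics
        self.pos += self.vel * DT + 0.5 * accel * DT * DT
        self.vel += accel * DT
        # P = F P F^T + Q (discrete white-noise acceleration model)
        p00 = self.p00 + 2 * DT * self.p01 + DT * DT * self.p11
        p01 = self.p01 + DT * self.p11
        self.p00 = p00 + Q_ACCEL * DT ** 4 / 4
        self.p01 = p01 + Q_ACCEL * DT ** 3 / 2
        self.p11 += Q_ACCEL * DT ** 2

    def update(self, measured_pos):
        # measurement H = [1, 0]: we observe position only
        s = self.p00 + R_POS
        k0 = self.p00 / s
        k1 = self.p01 / s
        y = measured_pos - self.pos
        self.pos += k0 * y
        self.vel += k1 * y
        p00 = (1 - k0) * self.p00
        p01 = (1 - k0) * self.p01
        self.p11 -= k1 * self.p01
        self.p00, self.p01 = p00, p01
```

Without the `update` step this is just dead reckoning and drifts exactly as in the video; the external position measurements are what bound the error. This is a sketch under stated assumptions, not the filter any shipping headset uses.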
You said "no you can't" but all I heard is "we'll get there"
I'm the 1000th like, yay!
Edit: Nvm someone disliked so I'm the 999th like lol ;p
Would adding another IMU, separated by a certain distance, help? Since there are two frames of reference, there should be more information to work with. Or is it still impossible?
I don't think it's going to work. Two drifting frames of reference will result in a drifting result (just my guess).