TriboTouch: Micro-Patterned Surfaces for Low Latency Touchscreens
- Published April 26, 2022
- Touchscreen tracking latency, often 80ms or more, creates a rubber-banding effect in everyday direct manipulation tasks such as dragging, scrolling, and drawing. This has been shown to decrease system preference, user performance, and the overall realism of these interfaces. In this research, we demonstrate how the addition of a thin, 2D micro-patterned surface with 5-micron-spaced features can be used to reduce motor-visual touchscreen latency. When a finger, stylus, or tangible is translated across this textured surface, frictional forces induce acoustic vibrations that naturally encode sliding velocity. This acoustic signal is sampled at 192kHz using a conventional audio interface pipeline with an average latency of 28ms. When fused with conventional low-speed, but high-spatial-accuracy, 2D touch position data, our machine learning model can make accurate predictions of real-time touch location. Published at CHI 2022.
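The core fusion idea described above — extrapolating slow-but-accurate touch positions forward in time using fast acoustic velocity estimates — can be sketched as simple dead reckoning. This is only an illustrative sketch under stated assumptions: the function name, data layout, and sample rates here are hypothetical, and the actual system uses a learned model rather than plain integration.

```python
def fuse(touch_samples, velocity_samples):
    """Dead-reckon the latest confirmed touch position forward in time.

    touch_samples:    list of (t, x, y) tuples from the touchscreen
                      (high spatial accuracy, high latency)
    velocity_samples: list of (t, vx, vy) tuples from the acoustic channel
                      (low latency, high rate, e.g. ~750 estimates/sec)
    Returns the predicted current (x, y) position.
    """
    # Start from the most recent confirmed touchscreen position.
    t0, x, y = touch_samples[-1]
    prev_t = t0
    # Integrate only velocity estimates newer than that confirmed fix.
    for t, vx, vy in velocity_samples:
        if t <= t0:
            continue
        dt = t - prev_t
        x += vx * dt
        y += vy * dt
        prev_t = t
    return x, y
```

Note that this formulation also answers the "self-correction" question naturally: each time a new touchscreen sample arrives, the dead reckoning restarts from that fresh ground-truth position, so accumulated extrapolation error is bounded by one touch-frame interval.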
Research Team: Craig Shultz, Daewha Kim, Karan Ahuja and Chris Harrison
Citation
Shultz, C., Kim, D., Ahuja, K., and Harrison, C. 2022. TriboTouch: Micro-Patterned Surfaces for Low Latency Touchscreens. To appear in Proceedings of the 40th Annual SIGCHI Conference on Human Factors in Computing Systems (April 30 - May 6, 2022). CHI '22. ACM, New York, NY.
Cool tech demo! Such projects should always include a slide on 10 reasons why they can't be commercialised or adopted by the mass market. 😛 Still waiting on my 3D haptic touch surface displays!
Awesome!
Very impressive. Would be interested to see if it functions with small touch inputs such as styluses.
Yes, it does. You can see the signal of a stylus at around 0:36. Although we didn't discuss this in the published work, it is possible to recognize what type of input is being used (in other words, distinguish finger from tangible from stylus, etc.)
Hi, great stuff! I love how acoustics can be applied here. Is this just projection based on velocity? Can you elaborate on the machine learning part? How do you acquire those training datasets? Thanks in advance!
The acoustic channel gives us really good velocity estimates at 750 FPS, as well as some angle information. You can read the full technical details in the paper linked here: www.figlab.com/research/2022/tribotouch
Hello! I love it! But I have a question: when someone swipes their finger on the screen and traditional tracking lags behind TriboTouch, the user will see what TriboTouch estimated — but when traditional tracking delivers new "real" coordinates, will TriboTouch correct itself?
Interesting. I wonder if it works with multitouch...
It looks like the regular touch input is separate from the acoustic input, so at the very least, you should be able to disable this advanced touch sensing if there are multiple contact points detected.
Sounds good for osu droid
Witchcraft!!!