TriboTouch: Micro-Patterned Surfaces for Low Latency Touchscreens

  • Published 26 Apr 2022
  • Touchscreen tracking latency, often 80 ms or more, creates a rubber-banding effect in everyday direct-manipulation tasks such as dragging, scrolling, and drawing. This has been shown to decrease system preference, user performance, and the overall realism of these interfaces. In this research, we demonstrate how the addition of a thin, 2D micro-patterned surface with 5-micron-spaced features can be used to reduce motor-visual touchscreen latency. When a finger, stylus, or tangible is translated across this textured surface, frictional forces induce acoustic vibrations that naturally encode sliding velocity. This acoustic signal is sampled at 192 kHz using a conventional audio interface pipeline with an average latency of 28 ms. When fused with conventional low-speed but high-spatial-accuracy 2D touch position data, our machine learning model can make accurate real-time predictions of touch location. Published at CHI 2022.
    Research Team: Craig Shultz, Daewha Kim, Karan Ahuja and Chris Harrison
    Citation
    Shultz, C., Kim, D., Ahuja, K. and Harrison, C. 2022. TriboTouch: Micro-Patterned Surfaces for Low Latency Touchscreens. To appear in Proceedings of the 40th Annual SIGCHI Conference on Human Factors in Computing Systems (April 30 - May 6, 2022). CHI '22. ACM, New York, NY.
  • Science & Technology
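The fusion described in the abstract (slow but spatially accurate 2D touch positions combined with fast acoustic velocity estimates) can be sketched as a simple dead-reckoning loop. This is a hypothetical illustration, not the authors' actual model; the 125 Hz touch rate and all function names are assumptions chosen so the 750 FPS velocity rate divides evenly:

```python
# Hypothetical sketch: fuse slow, accurate touch positions with fast
# acoustic velocity estimates by dead reckoning. Not the paper's model;
# the 125 Hz touch rate is an assumption (750 / 125 divides evenly).

TOUCH_HZ = 125    # assumed conventional touch report rate
VEL_HZ = 750      # acoustic velocity estimate rate (from the video)

def fuse(touch_positions, velocities):
    """Predict positions at the velocity rate: re-anchor on each new
    touch report, and integrate velocity between reports."""
    steps = VEL_HZ // TOUCH_HZ        # velocity samples per touch frame
    dt = 1.0 / VEL_HZ
    preds, x = [], touch_positions[0]
    for i, v in enumerate(velocities):
        if i % steps == 0:
            x = touch_positions[i // steps]   # fresh (accurate) position
        else:
            x = x + v * dt                    # extrapolate between reports
        preds.append(x)
    return preds

# Example: finger moving at a constant 100 mm/s.
touch = [t / TOUCH_HZ * 100.0 for t in range(2)]   # positions in mm
vels = [100.0] * 12                                # velocities in mm/s
pred = fuse(touch, vels)                           # 750 Hz position stream
```

Re-anchoring on every touch report keeps the integrated velocity from drifting unboundedly, which is also the intuition behind why the slow channel remains useful despite its latency.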

COMMENTS • 11

  • @amslu
    @amslu 2 years ago +2

    Cool tech demo; such projects should always include a slide on 10 reasons why it can't be commercialised or adopted by the mass market. 😛 Still waiting on my 3D haptic touch surface displays!

  • @Excalibur32
    @Excalibur32 1 year ago +1

    Awesome!

  • @maniacalcactus4705
    @maniacalcactus4705 2 years ago +2

    Very impressive. Would be interested to see if it functions with small touch inputs such as styluses.

    • @FiglabCMU
      @FiglabCMU 2 years ago +1

      Yes, it does. You can see the signal of a stylus at around 0:36. Although we didn't discuss this in the published work, it is possible to recognize what type of input is being used (in other words, distinguish finger from tangible from stylus, etc.)

  • @timng9104
    @timng9104 2 years ago

    Hi, great stuff! I love how acoustics can be applied here. Is this just projection based on velocity? Can you elaborate on the machine learning part? How do you acquire those training datasets? Hehe, thanks in advance!

    • @FiglabCMU
      @FiglabCMU 2 years ago +1

      The acoustic channel gives us really good velocity estimates at 750 FPS, as well as some angle information. You can read the full technical details in the paper linked here: www.figlab.com/research/2022/tribotouch
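As a back-of-the-envelope check on why a 5-micron pattern pairs well with a 192 kHz audio pipeline: if the fundamental tone simply scales with sliding speed over the feature pitch (f = v / pitch), everyday finger speeds land in the audio band. This is an illustrative calculation under that assumption, not code or analysis from the paper:

```python
# Back-of-the-envelope: the tone produced by sliding over features
# spaced 5 microns apart, assuming f = v / pitch. Illustrative only;
# this calculation is not from the paper.

PITCH_M = 5e-6          # feature spacing: 5 microns (from the video)
SAMPLE_RATE = 192_000   # audio sampling rate in Hz (from the video)

def tone_hz(speed_m_s):
    """Fundamental frequency excited by sliding at speed_m_s."""
    return speed_m_s / PITCH_M

def max_trackable_speed():
    """Speed at which the fundamental reaches Nyquist (fs / 2)."""
    return (SAMPLE_RATE / 2) * PITCH_M

f = tone_hz(0.1)               # a 100 mm/s swipe -> roughly 20 kHz
v_max = max_trackable_speed()  # roughly 0.48 m/s before aliasing
```

In practice the frictional signal is broadband rather than a pure tone, so this only bounds the scale of the frequencies involved, but it shows why a standard high-rate audio interface suffices.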

  • @thewonderfultartiflette4733
    @thewonderfultartiflette4733 2 years ago +1

    Hello! I love it! But I have a question: when someone swipes their finger on the screen and traditional tracking lags behind TriboTouch, the user will see what TriboTouch estimated; but once traditional tracking reports new "real" coordinates, will TriboTouch correct itself?

  • @AndrewMeyer
    @AndrewMeyer 2 years ago +2

    Interesting. I wonder if it works with multitouch...

    • @larperdoodle
      @larperdoodle 2 years ago +1

      It looks like the regular touch input is separate from the acoustic input, so at the very least, you should be able to disable this advanced touch sensing if there are multiple contact points detected.

  • @robiimadot.9273
    @robiimadot.9273 1 year ago

    Sounds good for osu droid

  • @ERKNEES2
    @ERKNEES2 22 days ago

    Witchcraft!!!