Retargeted Self-Haptics for Increased Immersion in VR without Hand Instrumentation
- Published 26 Sep 2024
- Future Interfaces Group: figlab.com
- Cathy Fang: cathy-fang.com
- Chris Harrison: chrisharrison.net
Today’s consumer virtual reality (VR) systems offer immersive graphics and audio, but haptic feedback is rudimentary, delivered through controllers with vibration motors, or non-existent (i.e., the hands operate freely in the air). In this paper, we explore an alternative, highly mobile and controller-free approach to haptics, where VR applications utilize the user’s own body to provide physical feedback. To achieve this, we warp (retarget) the locations of a user’s hands such that one hand serves as a physical surface or prop for the other hand. For example, a hand holding a virtual nail can serve as a physical backstop for a hand that is virtually hammering, providing a sense of impact in an airborne and uninstrumented experience. To illustrate this rich design space, we implemented twelve interactive demos across three haptic categories. We conclude with a user study from which we draw design recommendations.
Future Interfaces Group, Carnegie Mellon University
Fang, C. and Harrison, C. 2021. Retargeted Self-Haptics for Increased Immersion in VR without Hand Instrumentation. In Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology (October 10 - 13, 2021). UIST '21. ACM, New York, NY.
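The paper does not publish its warping equations here, but the core idea of retargeting can be sketched with a standard progress-based warp: as the tracked hand travels from its start point toward a physical target (a spot on the user's other hand), an offset is gradually blended in so the *virtual* hand arrives at the virtual target at the exact moment the *real* hands make contact. The function below is a minimal illustration of that idea, not the authors' implementation; all names and the linear blending scheme are assumptions.

```python
import math

def retarget_hand(real_pos, start_pos, physical_target, virtual_target):
    """Warp the tracked hand position so the virtual hand reaches
    virtual_target exactly when the real hand reaches physical_target.

    All arguments are 3D points (x, y, z). This is a generic linear
    retargeting warp, illustrative only -- not the paper's exact method.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    total = dist(physical_target, start_pos)
    # Fraction of the reach completed so far, clamped to [0, 1].
    progress = min(dist(real_pos, start_pos) / total, 1.0) if total > 1e-9 else 1.0
    # Gap between where the hand will really land and where it should appear.
    offset = [v - p for v, p in zip(virtual_target, physical_target)]
    # Blend the offset in proportionally to progress, so the warp is
    # imperceptible at the start and complete at contact.
    return [r + progress * o for r, o in zip(real_pos, offset)]
```

For example, if the real hand will strike the other hand at (1, 0, 0) but the virtual nail sits at (1, 0, 0.5), the rendered hand is displaced upward by half the offset at the midpoint of the reach and by the full offset at contact.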
This is insanely smart
Misaligned hands in VR cause severe motion sickness and a sense of dissociation in my experience. Definitely not worth the added "immersion". Users are much better off using controllers with advanced haptics like tension-triggers and precise haptic vibration motors.
Noted
When they implement loading shotgun shells, it will look a bit sus :)
This is clever
Do you use a proximity sensor for that?