Title
Mise-Unseen: Using Eye Tracking to Hide Virtual Reality Scene Changes in Plain Sight
Abstract
Creating or arranging objects at runtime is needed in many virtual reality applications, but such changes are noticed when they occur inside the user's field of view. We present Mise-Unseen, a software system that applies such scene changes covertly inside the user's field of view. Mise-Unseen leverages gaze tracking to create models of user attention, intention, and spatial memory to determine if and when to inject a change. We present seven applications of Mise-Unseen that unnoticeably modify the scene within view: (i) hiding that task difficulty is adapted to the user, (ii) adapting the experience to the user's preferences, (iii) timing the use of low-fidelity effects, (iv) detecting user choice for passive haptics even when physical props are lacking, (v) sustaining physical locomotion despite a lack of physical space, (vi) reducing motion sickness during virtual locomotion, and (vii) verifying user understanding during story progression. We evaluated Mise-Unseen and our applications in a user study with 15 participants and found that while gaze data indeed supports obfuscating changes inside the field of view, a change is rendered unnoticeable by combining gaze with common masking techniques.
Year
2019
DOI
10.1145/3332165.3347919
Venue
Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology
Keywords
change blindness, eye-tracking, inattentional blindness, staging, virtual reality
Field
Virtual reality, Inattentional blindness, Computer science, Sight, Eye tracking, Human–computer interaction, Change blindness
DocType
Conference
ISBN
978-1-4503-6816-2
Citations
1
PageRank
0.36
References
0
Authors
5
Name | Order | Citations | PageRank
Sebastian Marwecki | 1 | 24 | 2.79
Andrew D. Wilson | 2 | 5065 | 362.19
Eyal Ofek | 3 | 1865 | 106.07
Mar Gonzalez-Franco | 4 | 169 | 20.04
Christian Holz | 5 | 878 | 56.58