Title
Jazzy: Leveraging Virtual Reality Layers for Hand-Eye Coordination in Users with Amblyopia.
Abstract
This paper describes the design and development of Jazzy, a Virtual Reality (VR) application for users with Amblyopia. Jazzy has been designed in collaboration with a target end user of the system from its early stages. Jazzy exploits visual layers to display different content for each eye on the Head Mounted Display (HMD) screens. In this way, the system becomes a controllable tool to stimulate each eye individually. In addition, taking advantage of the HMD's associated controllers, the system is able to track the user in the physical space, enhancing the perceived realism of the VR experience. Users can train hand-eye coordination skills in a more lifelike and engaging manner. Jazzy also provides an interface for caregivers, empowering them with a new support tool that can be used alongside classic therapeutic artifacts.
In this paper we describe the eye-specific parametric visual stimuli and the caregiver's interface for tuning them remotely, enabling plug-and-play activities that can be experienced at home.
Year: 2018
Venue: CHI Extended Abstracts
Field: Virtual reality, Eye–hand coordination, End user, Computer science, Exploit, Plug and play, Human–computer interaction, Optical head-mounted display, Physical space, Multimedia, Visual perception
DocType: Conference
ISBN: 978-1-4503-5621-3
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name               Order  Citations  PageRank
Mario Scrocca      1      1          1.73
Nicola Ruaro       2      0          0.34
Daniele Occhiuto   3      13         2.73
F. Garzotto        4      146        13.68