Abstract |
---|
A unique virtual reality platform for multisensory integration studies is presented. It delivers multimodal sensory stimuli (i.e. auditory, visual, tactile, etc.) while ensuring temporal coherence, a key factor in cross-modal integration. Four infrared cameras track human motion in real time and correspondingly control a virtual avatar. A user-friendly interface allows a wide variety of features to be manipulated (i.e. stimulus type, duration, and distance from the participant's body, as well as avatar gender, height, arm pose, perspective, etc.) and provides quantitative measures of all parameters in real time. The platform was validated on two healthy participants with a reaction time task combining tactile and visual stimuli for the investigation of peripersonal space. Results proved the effectiveness of the proposed platform, showing a significant correlation (p=0.013) between the distance of the participant's hand from the visual stimulus and the reaction time to the tactile stimulus. More participants will be recruited to further investigate the other measures provided by the platform. |
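The distance–reaction-time relationship reported in the abstract is a simple linear correlation; a minimal sketch of how such a Pearson coefficient is computed is shown below. All names and data are invented placeholders for illustration, not the paper's actual measurements or analysis code.

```python
# Hypothetical sketch: Pearson correlation between hand-to-stimulus distance
# and tactile reaction time. The data below are invented, not from the paper.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: distances (cm) of the visual stimulus from the hand,
# and tactile reaction times (ms).
distances = [5, 10, 20, 30, 40, 50]
reaction_times = [310, 320, 335, 350, 355, 370]
print(round(pearson_r(distances, reaction_times), 3))
```

In practice a significance test (e.g. the p=0.013 the authors report) would accompany the coefficient, for instance via `scipy.stats.pearsonr`, which returns both the coefficient and its p-value.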
Year | DOI | Venue |
---|---|---|
2020 | 10.1109/EMBC44109.2020.9176387 | 42ND ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY: ENABLING INNOVATIVE TECHNOLOGIES FOR GLOBAL HEALTHCARE EMBC'20 |
DocType | Volume | ISSN
---|---|---|
Conference | 2020 | 1557-170X |
Citations | PageRank | References
---|---|---|
0 | 0.34 | 0 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
A Noccaro | 1 | 0 | 0.34 |
M Pinardi | 2 | 0 | 0.34 |
Domenico Formica | 3 | 88 | 26.60 |
Giovanni Di Pino | 4 | 25 | 7.95 |