Title
VIMSNet: an effective network for visually induced motion sickness detection
Abstract
Visually induced motion sickness (VIMS) often occurs when people are exposed to virtual reality (VR) environments, so a method is needed that can effectively detect a VR user’s VIMS level from physiological signals. In this study, subjects were visually stimulated with a VR vehicle-driving simulator (VDS) to induce VIMS, and an EEG device with only four electrodes was used to collect EEG data. The data were minimally preprocessed and then fed into our classifier (VIMSNet). VIMSNet contains two modules designed for high-performance classification: a parallel-feature extraction (PFE) module that extracts features at different scales from the EEG data, and a feature attention (FA) module that emphasizes useful features while suppressing less useful ones. Experimental results show that VIMSNet outperforms state-of-the-art (SOTA) classifiers on several classification tasks: for per-subject binary classification, it achieved an average accuracy of 0.967 and a kappa of 0.930; for multiple-subject binary classification, an accuracy of 0.953 and a kappa of 0.903; and for multiple-subject four-level classification, an accuracy of 0.925 and a kappa of 0.874. Our study also indicates that VIMS can be detected accurately with an EEG device that has only a small number of electrodes. Source code is available at https://github.com/threedteam/VIMSNet.
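The abstract names two modules, PFE (parallel multi-scale feature extraction) and FA (attention that reweights features), but gives no architectural details. The sketch below is a plausible PyTorch rendering under stated assumptions, not the authors' implementation: PFE is assumed to be parallel 1-D convolutions with different kernel sizes over the 4-electrode EEG signal, and FA is assumed to be a squeeze-and-excitation-style channel gate; all layer sizes and kernel widths are illustrative.

```python
import torch
import torch.nn as nn


class PFE(nn.Module):
    """Parallel-feature extraction (sketch): parallel 1-D convolutions with
    different kernel sizes capture temporal features at multiple scales;
    branch outputs are concatenated along the channel axis."""

    def __init__(self, in_ch=4, out_ch=16, kernel_sizes=(3, 7, 15)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv1d(in_ch, out_ch, k, padding=k // 2)  # odd k keeps length
             for k in kernel_sizes]
        )

    def forward(self, x):  # x: (batch, electrodes, time)
        return torch.cat([branch(x) for branch in self.branches], dim=1)


class FA(nn.Module):
    """Feature attention (sketch): a squeeze-and-excitation-style gate that
    emphasizes informative channels and suppresses less useful ones."""

    def __init__(self, ch, reduction=4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),          # squeeze: per-channel average
            nn.Flatten(),
            nn.Linear(ch, ch // reduction),   # excitation bottleneck
            nn.ReLU(),
            nn.Linear(ch // reduction, ch),
            nn.Sigmoid(),                     # per-channel weights in (0, 1)
        )

    def forward(self, x):  # x: (batch, channels, time)
        return x * self.gate(x).unsqueeze(-1)


# Toy forward pass: a batch of 8 windows, 4 electrodes, 256 samples each.
x = torch.randn(8, 4, 256)
feats = PFE()(x)      # 3 branches x 16 channels -> (8, 48, 256)
out = FA(48)(feats)   # same shape, channels reweighted by attention
```

A classification head (e.g., pooling plus a linear layer) would follow these modules; the three reported tasks differ only in the number of output classes (2 or 4).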
Year
2022
DOI
10.1007/s11760-022-02164-9
Venue
Signal, Image and Video Processing
Keywords
Virtual reality, EEG, Visually induced motion sickness (VIMS), Classification, VIMS level
DocType
Journal
Volume
16
Issue
8
ISSN
1863-1703
Citations
0
PageRank
0.34
References
9
Authors
6
Name           Order  Citations  PageRank
Ran Liu        1      6          1.58
Cui Shanshan   2      0          0.34
Zhao Yang      3      0          0.34
Xi Chen        4      3337       0.76
Yi Lin         5      0          0.34
Hwang Alex D.  6      0          0.34