Title
A Hybrid Deep Learning Model for Human Activity Recognition Using Multimodal Body Sensing Data
Abstract
Human activity recognition from multimodal body sensor data has proven effective for the care of elderly or physically impaired people in smart healthcare environments. However, traditional machine learning techniques mostly rely on a single sensing modality, which is not sufficient for robust healthcare applications. Researchers have therefore increasingly focused on developing robust machine learning techniques that exploit multimodal body sensor data and support important decision making in smart healthcare. In this paper, we propose an effective multi-sensor framework for human activity recognition using a hybrid deep learning model that combines simple recurrent units (SRUs) with gated recurrent units (GRUs). We use deep SRUs to process the sequences of multimodal input data, exploiting their internal memory states. Moreover, we use deep GRUs to store and learn how much of the past information is passed to the future state, which mitigates accuracy fluctuations and the vanishing gradient problem. The system has been compared against conventional approaches on a publicly available standard dataset, and the experimental results show that the proposed approach outperforms the available state-of-the-art methods.
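The SRU-then-GRU pipeline the abstract describes can be sketched as a single forward pass in NumPy. This is an illustrative sketch, not the authors' implementation: the layer sizes, the 23 sensor channels, and the 12 activity classes are assumptions loosely modeled on MHEALTH-style data, and the SRU highway connection is simplified to reuse the input projection.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SRULayer:
    """Simple recurrent unit (Lei et al. style), unbatched sketch."""
    def __init__(self, d_in, d_out):
        s = 1.0 / np.sqrt(d_in)
        # one fused input projection for candidate, forget gate, reset gate
        self.W = rng.uniform(-s, s, (3 * d_out, d_in))
        self.v_f = np.zeros(d_out); self.b_f = np.zeros(d_out)
        self.v_r = np.zeros(d_out); self.b_r = np.zeros(d_out)
        self.d_out = d_out

    def forward(self, xs):  # xs: (T, d_in) -> (T, d_out)
        d = self.d_out
        c = np.zeros(d)                                   # internal memory state
        hs = []
        for x in xs:
            u = self.W @ x                                # all projections at once
            x_tilde, zf, zr = u[:d], u[d:2 * d], u[2 * d:]
            f = sigmoid(zf + self.v_f * c + self.b_f)     # forget gate
            r = sigmoid(zr + self.v_r * c + self.b_r)     # highway gate
            c = f * c + (1.0 - f) * x_tilde               # light recurrence on c
            h = r * np.tanh(c) + (1.0 - r) * x_tilde      # simplified highway skip
            hs.append(h)
        return np.stack(hs)

class GRULayer:
    """Standard GRU; the update gate decides how much past state flows forward."""
    def __init__(self, d_in, d_out):
        s = 1.0 / np.sqrt(d_in)
        self.Wz = rng.uniform(-s, s, (d_out, d_in)); self.Uz = rng.uniform(-s, s, (d_out, d_out))
        self.Wr = rng.uniform(-s, s, (d_out, d_in)); self.Ur = rng.uniform(-s, s, (d_out, d_out))
        self.Wh = rng.uniform(-s, s, (d_out, d_in)); self.Uh = rng.uniform(-s, s, (d_out, d_out))
        self.d_out = d_out

    def forward(self, xs):  # xs: (T, d_in) -> (d_out,)
        h = np.zeros(self.d_out)
        for x in xs:
            z = sigmoid(self.Wz @ x + self.Uz @ h)        # update gate
            r = sigmoid(self.Wr @ x + self.Ur @ h)        # reset gate
            h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))
            h = (1.0 - z) * h + z * h_tilde               # blend past and candidate
        return h                                          # sequence summary

# Assumed dimensions: 50 time steps, 23 sensor channels, 12 activity classes.
T, d_sensor, d_sru, d_gru, n_classes = 50, 23, 64, 32, 12
x = rng.normal(size=(T, d_sensor))                        # one multimodal window
sru, gru = SRULayer(d_sensor, d_sru), GRULayer(d_sru, d_gru)
W_out = rng.normal(size=(n_classes, d_gru)) * 0.1         # linear classifier head
logits = W_out @ gru.forward(sru.forward(x))
probs = np.exp(logits - logits.max()); probs /= probs.sum()  # softmax over activities
```

In the full model one would stack several SRU and GRU layers and train the whole pipeline end to end; the sketch only shows how a sensor window flows through the two recurrent stages to a class distribution.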
Year: 2019
DOI: 10.1109/ACCESS.2019.2927134
Venue: IEEE ACCESS
Keywords: Multi-modal body sensor data, activity recognition, deep recurrent neural networks (RNNs), simple recurrent unit (SRU), gated recurrent unit (GRU), robust healthcare
DocType: Journal
Volume: 7
ISSN: 2169-3536
Citations: 4
PageRank: 0.39
References: 0
Authors: 4
Name | Order | Citations | PageRank
Gumaei, A. | 1 | 53 | 10.73
Mohammad Mehedi Hassan | 2 | 1320 | 94.80
Abdulhameed Al-elaiwi | 3 | 631 | 47.05
Hussain Alsalman | 4 | 4 | 0.39