Title
Prediction Of Head Movement In 360-Degree Videos Using Attention Model
Abstract
In this paper, we propose a prediction algorithm that combines Long Short-Term Memory (LSTM) with an attention model to predict the vision coordinates of a viewer watching 360-degree videos in a Virtual Reality (VR) or Augmented Reality (AR) system. Predicting the vision coordinates during video streaming is important when the network condition is degraded. However, traditional prediction models such as Moving Average (MA) and Autoregressive Moving Average (ARMA) are linear, so they cannot capture nonlinear relationships. For this reason, machine learning models based on deep learning have recently been used for nonlinear prediction. We use the Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) neural networks, which originate from Recurrent Neural Networks (RNNs), to predict the head position in 360-degree videos, and we adopt an attention model on top of the LSTM to obtain more accurate results. We also compare the performance of the proposed model with that of other machine learning models, such as the Multi-Layer Perceptron (MLP) and RNN, using the root mean squared error (RMSE) between the predicted and real coordinates. We demonstrate that our model predicts the vision coordinates more accurately than the other models on various videos.
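The record contains no code, but two ingredients the abstract names, attention weighting over recent head coordinates and RMSE evaluation of predicted versus real coordinates, can be sketched in a minimal, library-free form. The function names, the fixed coordinate window, and the use of plain scaled dot-product attention are illustrative assumptions; the paper itself attaches the attention layer to an LSTM's hidden states, which is omitted here.

```python
import math

def attention_pool(history, query):
    """Scaled dot-product attention over a window of past (yaw, pitch) coords.

    history: list of coordinate tuples; query: a coordinate tuple.
    Returns an attention-weighted average of the window, a stand-in for the
    context vector an attention layer would compute from LSTM hidden states.
    """
    d = len(query)
    scores = [sum(h[i] * query[i] for i in range(d)) / math.sqrt(d) for h in history]
    m = max(scores)                              # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]          # softmax over the window
    return tuple(sum(w * h[i] for w, h in zip(weights, history)) for i in range(d))

def rmse(pred, real):
    """Root mean squared error between predicted and real coordinate sequences."""
    n = len(pred) * len(pred[0])
    sq = sum((p[i] - r[i]) ** 2 for p, r in zip(pred, real) for i in range(len(p)))
    return math.sqrt(sq / n)

# Example: predict the next head position as the attention-weighted window,
# then score it against the ground-truth coordinate.
window = [(0.10, 0.00), (0.20, 0.05), (0.30, 0.10)]
prediction = attention_pool(window, window[-1])
error = rmse([prediction], [(0.35, 0.12)])
```

The softmax weights sum to one, so the prediction always stays inside the convex hull of the recent coordinates; the real model instead lets the LSTM extrapolate beyond the window.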
Year
2021
DOI
10.3390/s21113678
Venue
SENSORS
Keywords
LSTM, GRU, head movement, time-series prediction, machine learning, attention model
DocType
Journal
Volume
21
Issue
11
ISSN
1424-8220
Citations
0
PageRank
0.34
References
0
Authors
3
Name | Order | Citations | PageRank
Dongwon Lee | 1 | 2407 | 190.05
Minji Choi | 2 | 0 | 0.34
Joohyun Lee | 3 | 4 | 2.79