Title
User-Adaptive Editing for 360° Video Streaming with Deep Reinforcement Learning
Abstract
The development of 360° video streaming remains persistently hindered by the high bandwidth these videos require. Spatially adapting the quality of the sphere to the user's Field of View (FoV) lowers the data rate, but it requires keeping the playback buffer small, predicting the user's motion, or replacing buffered segments to keep their quality up to date with the moving FoV, all three of which are uncertain and risky. We have previously shown that opportunistically regaining control of the FoV with active attention-driving techniques provides additional levers to ease streaming and improve Quality of Experience (QoE). Deep neural networks have recently been shown to achieve the best performance for video streaming adaptation and head motion prediction. This demo takes a step forward in the investigation of deep neural network approaches to building user-adaptive and network-adaptive 360° video streaming systems. We show how snap-changes, an attention-driving technique, can be automatically modulated by the user's motion to improve streaming QoE. Snap-changes are controlled by a deep neural network trained on head motion traces with the deep reinforcement learning algorithm A3C.
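To make the approach described in the abstract concrete, below is a minimal PyTorch sketch of an A3C-style actor-critic: a recurrent encoder over recent head-motion samples feeds an actor head that decides whether to trigger a snap-change and a critic head that provides the value baseline for the advantage term. This is an illustrative sketch, not the authors' implementation; the class name SnapChangePolicy, the input dimensions, the binary action space, and the QoE-derived returns are all assumptions.

# Minimal A3C-style sketch (assumption: not the authors' released code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SnapChangePolicy(nn.Module):
    def __init__(self, motion_dim=3, hidden_dim=64, num_actions=2):
        super().__init__()
        # Recurrent encoder over the recent head-motion trace
        # (e.g., one angle triple per sampled timestep).
        self.rnn = nn.LSTM(motion_dim, hidden_dim, batch_first=True)
        # Actor head: probabilities of {no snap-change, snap-change}.
        self.actor = nn.Linear(hidden_dim, num_actions)
        # Critic head: state-value estimate used by A3C's advantage.
        self.critic = nn.Linear(hidden_dim, 1)

    def forward(self, motion_trace):
        # motion_trace: (batch, time, motion_dim)
        _, (h, _) = self.rnn(motion_trace)
        h = h[-1]  # hidden state of the last LSTM layer
        return F.softmax(self.actor(h), dim=-1), self.critic(h)

def a3c_loss(policy, traces, actions, returns, beta=0.01):
    """A3C loss: policy gradient with advantage baseline,
    value regression, and an entropy bonus for exploration."""
    probs, values = policy(traces)
    values = values.squeeze(-1)
    advantage = returns - values.detach()
    log_probs = torch.log(probs.gather(1, actions.unsqueeze(1)).squeeze(1))
    policy_loss = -(log_probs * advantage).mean()
    value_loss = F.mse_loss(values, returns)
    entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1).mean()
    return policy_loss + 0.5 * value_loss - beta * entropy

# Toy usage: 8 traces of 30 head-motion samples each.
policy = SnapChangePolicy()
traces = torch.randn(8, 30, 3)
actions = torch.randint(0, 2, (8,))
returns = torch.randn(8)  # stand-in for QoE-derived returns (assumption)
loss = a3c_loss(policy, traces, actions, returns)
loss.backward()

In full A3C, several such workers would run asynchronously against copies of the environment and apply their gradients to shared parameters; the loss above is the per-worker update.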
Year
2019
DOI
10.1145/3343031.3350601
Venue
Proceedings of the 27th ACM International Conference on Multimedia
Keywords
360 degree video streaming, deep reinforcement learning, film editing, motion prediction, recurrent neural networks, user attention
Field
Field of view, Computer vision, Computer science, Video streaming, Recurrent neural network, Bandwidth (signal processing), Artificial intelligence, Quality of experience, Artificial neural network, Film editing, Reinforcement learning
DocType
Conference
ISBN
978-1-4503-6889-6
Citations
1
PageRank
0.35
References
0
Authors
4
Name               Order  Citations  PageRank
Lucile Sassatelli  1      91         12.87
Marco Winckler     2      2          1.06
Thomas Fisichella  3      1          1.03
Ramon Aparicio     4      1          0.35