Title
A Fog Computing Model for Implementing Motion Guide to Visually Impaired
Abstract
A guide dog robot system for the visually impaired often needs to process many kinds of information, such as images, voice, and other sensor data. Information processing methods based on deep neural networks can achieve better results, but they require expensive computing and communication resources to meet real-time requirements. Fog computing has emerged as a promising solution for applications that are data-intensive and delay-sensitive. We propose a fog computing framework named PEN (Phone + Embedded board + Neural compute stick) for the guide dog robot system. The robot's functions in PEN are wrapped as services and deployed on the appropriate devices, and the services are combined into an application in a visual programming language environment. A neural compute stick accelerates image processing at low power consumption. A simulation environment and a prototype are built on the framework. The simulated guide dog system operates in a miniature environment that includes a small robot dog, a small wheelchair, model cars, traffic lights, and traffic blockages. The prototype is a full-sized portable guide system that can be used by a visually impaired person in a real environment. Simulation and experiments show that the framework meets the functional and performance requirements for implementing guide systems for the visually impaired.
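The abstract describes wrapping each of the robot's functions as a service deployed on a fog device (phone, embedded board, or neural compute stick host). As a minimal sketch only, assuming a plain HTTP interface and a placeholder detect_obstacle function (neither the interface nor the function is taken from the paper), one such service running on the embedded board could look roughly like this:

# Illustrative sketch only: exposes a hypothetical obstacle-detection
# function as a small HTTP service that might run on a fog node such as
# an embedded board; the function body is a placeholder, not the paper's model.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def detect_obstacle(frame_bytes: bytes) -> dict:
    # Placeholder for an image-analysis step (e.g. a DNN offloaded to a
    # neural compute stick in a real deployment).
    return {"obstacle": len(frame_bytes) > 0, "confidence": 0.0}

class GuideServiceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the raw camera frame sent by a client device such as the phone.
        length = int(self.headers.get("Content-Length", 0))
        frame = self.rfile.read(length)
        result = json.dumps(detect_obstacle(frame)).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(result)))
        self.end_headers()
        self.wfile.write(result)

if __name__ == "__main__":
    # Bind on the fog node; port 8080 is an arbitrary choice for this sketch.
    HTTPServer(("0.0.0.0", 8080), GuideServiceHandler).serve_forever()

A phone-side client could then post camera frames to such an endpoint, and several services of this kind could be composed into the overall guiding application.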
Year
2020
DOI
10.1016/j.simpat.2019.102015
Venue
Simulation Modelling Practice and Theory
Keywords
Guide dog robot, Visually impaired, Fog computing, Neural compute stick, Embedded system
Field
Wheelchair, Information processing, Computer science, Image processing, Fog computing, Visual programming language, Real-time computing, Artificial neural network, Robot, Power consumption
DocType
Journal
Volume
101
ISSN
1569-190X
Citations
0
PageRank
0.34
References
0
Authors
5
Name          Order  Citations  PageRank
Jinhui Zhu    1      1          0.69
Jie Hu        2      0          0.34
Mei Zhang     3      0          0.34
Yinong Chen   4      1361       127.89
Sheng Bi      5      0          0.34