Title
Egocentric-View Fingertip Detection For Air Writing Based On Convolutional Neural Networks
Abstract
This research investigates real-time fingertip detection in frames captured by smart glasses, an increasingly popular wearable device. Egocentric-view fingertip detection combined with character recognition enables a novel way of entering text. We first employ Unity3D to build a synthetic dataset of pointing gestures from the first-person perspective. The obvious benefits of synthetic data are that it eliminates the need for time-consuming and error-prone manual labeling and provides a large, high-quality dataset for a wide range of purposes. We then propose a modified Mask Regional Convolutional Neural Network (Mask R-CNN), consisting of a region-based CNN for finger detection and a three-layer CNN for fingertip localization. The process can be completed in 25 ms per frame for 640 × 480 RGB images, with an average error of 8.3 pixels. This speed is sufficient to enable real-time "air-writing", in which users write characters in the air to input text or commands while wearing smart glasses. The characters are recognized from the fingertip trajectories by a ResNet-based CNN. Experimental results demonstrate the feasibility of this novel methodology.
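As a rough illustration of the two-stage pipeline described in the abstract (not the authors' implementation), the following PyTorch sketch shows a small three-layer CNN head that regresses a fingertip (x, y) position from a finger region cropped out by a region-based detector. The layer widths, the 64 × 64 crop resolution, and all class and variable names are illustrative assumptions.

# Hedged sketch, assuming PyTorch: a minimal three-layer CNN that regresses
# a normalized fingertip (x, y) location from a cropped finger region.
import torch
import torch.nn as nn

class FingertipHead(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 32x32 -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                # global pooling to (N, 64, 1, 1)
        )
        self.regressor = nn.Linear(64, 2)           # normalized (x, y) within the crop

    def forward(self, crop):                        # crop: (N, 3, 64, 64)
        f = self.features(crop).flatten(1)
        return torch.sigmoid(self.regressor(f))     # coordinates in [0, 1]

# Usage: predictions are relative to the crop and would be mapped back to
# frame coordinates using the finger bounding box from the detection stage.
head = FingertipHead()
xy = head(torch.rand(1, 3, 64, 64))                 # e.g. tensor([[0.52, 0.47]])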
Year
2021
DOI
10.3390/s21134382
Venue
SENSORS
Keywords
air-writing, fingertip detection, region-based convolutional neural network, smart glasses
DocType
Journal
Volume
21
Issue
13
ISSN
1424-8220
Citations
0
PageRank
0.34
References
0
Authors
5
Name               Order  Citations  PageRank
Yung-Han Chen      1      0          0.34
Chi-Hsuan Huang    2      2          2.06
Sin-Wun Syu        3      0          0.34
Tien-Ying Kuo      4      148        19.24
Po-Chyi Su         5      0          0.34