Title
Human Gaze-Driven Spatial Tasking of an Autonomous MAV
Abstract
In this letter, we address the problem of providing human-assisted quadrotor navigation using a set of eye tracking glasses. The advent of these devices (i.e., eye tracking glasses, virtual reality tools, etc.) provides the opportunity to create new, non-invasive forms of interaction between humans and robots. We show how a set of glasses equipped with a gaze tracker, a camera, and an inertial measurement unit (IMU) can be used to estimate the relative position of the human with respect to a quadrotor, and to decouple the gaze direction from the head orientation, which allows the human to spatially task (i.e., send new 3-D navigation waypoints to) the robot in an uninstrumented environment. We decouple the gaze direction from head motion by tracking the human's head orientation using a combination of camera and IMU data. To detect the flying robot, we train and use a deep neural network. We experimentally evaluate the proposed approach and show that our pipeline has the potential to enable gaze-driven autonomy for spatial tasking. The proposed approach can be employed in multiple scenarios, including inspection and first response, as well as by people with disabilities that affect their mobility.
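The pipeline the abstract describes rests on a simple geometric step: once the head orientation is known from camera and IMU fusion, a gaze ray measured in the glasses' frame can be rotated into the world frame, so the commanded direction no longer depends on how the head happens to be moving. The sketch below (Python/NumPy) illustrates that step and one plausible way to turn the resulting ray into a 3-D waypoint. It is not the authors' code: the function names, the quaternion convention, and the intersect-with-a-horizontal-plane waypoint rule are all assumptions made for illustration.

```python
# Minimal sketch (not the paper's implementation) of decoupling gaze
# direction from head motion. Assumed: quaternions in (w, x, y, z) order,
# z-up world frame, and a waypoint defined by intersecting the gaze ray
# with a horizontal plane at a chosen flight height.
import numpy as np

def quat_to_rot(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def gaze_waypoint(q_world_head, gaze_dir_head, eye_pos_world, target_height):
    """Rotate the head-frame gaze ray into the world frame (using the head
    orientation estimated from camera+IMU data) and intersect it with a
    horizontal plane at target_height to produce a 3-D waypoint."""
    R = quat_to_rot(q_world_head)
    d = R @ (gaze_dir_head / np.linalg.norm(gaze_dir_head))  # world-frame ray
    if abs(d[2]) < 1e-6:
        return None  # ray is parallel to the plane: no intersection
    t = (target_height - eye_pos_world[2]) / d[2]
    if t <= 0:
        return None  # intersection lies behind the viewer
    return eye_pos_world + t * d  # candidate world-frame waypoint for the MAV

# Example: eyes 1.7 m above the ground, looking forward and slightly down,
# tasking the robot to fly at 1.0 m height.
wp = gaze_waypoint(
    q_world_head=np.array([1.0, 0.0, 0.0, 0.0]),  # identity head orientation
    gaze_dir_head=np.array([1.0, 0.0, -0.2]),     # forward, slightly down
    eye_pos_world=np.array([0.0, 0.0, 1.7]),
    target_height=1.0,
)
print(wp)  # ~[3.5, 0.0, 1.0]
```

In the paper's actual pipeline, the human-quadrotor relative position comes from the trained neural-network drone detector rather than from an assumed ground plane; the sketch only captures the gaze/head-motion decoupling that makes spatial tasking possible in an uninstrumented environment.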
Year
2019
DOI
10.1109/LRA.2019.2895419
Venue
IEEE Robotics and Automation Letters
Field
Computer vision, Virtual reality, Task analysis, Gaze, Control engineering, Eye tracking, Artificial intelligence, Drone, Inertial measurement unit, Engineering, Robot, Artificial neural network
DocType
Journal
Volume
4
Issue
2
ISSN
2377-3766
Citations
3
PageRank
0.44
References
0
Authors
4
Name                  Order  Citations  PageRank
Liangzhe Yuan         1      19         1.96
Christopher Reardon   2      73         9.46
Garrett A. Warnell    3      78         16.40
Giuseppe Loianno      4      144        13.94