Title
Flycon: real-time environment-independent multi-view human pose estimation with aerial vehicles.
Abstract
We propose a real-time method for the infrastructure-free estimation of articulated human motion. The approach leverages a swarm of camera-equipped flying robots and jointly optimizes the swarm's state and the skeletal state, which includes the 3D joint positions and a set of bones. Our method can track the motion of human subjects, for example an athlete, over long time horizons and long distances, in challenging settings and at large scale, where fixed-infrastructure approaches are not applicable. The proposed algorithm uses active infrared markers, runs in real time, and accurately estimates robot and human pose parameters online without the need for accurately calibrated or statically mounted cameras. Our method (i) estimates a global coordinate frame for the MAV swarm, (ii) jointly optimizes the human pose and relative camera positions, and (iii) estimates the lengths of the human bones. The entire swarm is then controlled via a model predictive controller to maximize visibility of the subject from multiple viewpoints, even under fast motion such as jumping or jogging. We demonstrate our method in a number of difficult scenarios, including capture of long locomotion sequences at the scale of a triplex gym, on non-planar terrain, while climbing, and outdoors.
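The abstract describes a joint estimation problem: the poses of the flying cameras and the 3D joint positions of the subject are refined together from 2D marker observations, with bone lengths kept consistent. The snippet below is a minimal, hypothetical sketch of that kind of joint optimization using a generic nonlinear least-squares solver; the pinhole camera model, the skeleton topology, the weights, and all names are illustrative assumptions, not the authors' real-time implementation (which additionally couples the estimator with a model predictive controller for visibility maximization).

```python
# Hypothetical sketch: jointly refine camera poses and 3D joint positions from
# multi-view 2D observations, with a soft bone-length consistency term.
# All constants, names, and the solver choice are assumptions for illustration.
import numpy as np
from scipy.optimize import least_squares

N_CAMS, N_JOINTS = 3, 4
BONES = [(0, 1), (1, 2), (2, 3)]   # assumed skeleton topology
FOCAL = 600.0                      # assumed pinhole focal length (pixels)

def project(points_3d, rvec, tvec):
    """Project 3D points with a Rodrigues rotation + translation pinhole camera."""
    theta = np.linalg.norm(rvec) + 1e-12
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    cam = points_3d @ R.T + tvec
    return FOCAL * cam[:, :2] / cam[:, 2:3]

def residuals(x, obs_2d, bone_lengths, w_bone=10.0):
    """Stack reprojection residuals (all cameras) and bone-length residuals."""
    cams = x[:6 * N_CAMS].reshape(N_CAMS, 6)       # [rvec | tvec] per camera
    joints = x[6 * N_CAMS:].reshape(N_JOINTS, 3)   # 3D joint positions
    res = []
    for c in range(N_CAMS):
        res.append((project(joints, cams[c, :3], cams[c, 3:]) - obs_2d[c]).ravel())
    for (i, j), L in zip(BONES, bone_lengths):
        res.append([w_bone * (np.linalg.norm(joints[i] - joints[j]) - L)])
    return np.concatenate(res)

# Synthetic example: noisy initial guesses for cameras and joints, fake detections.
rng = np.random.default_rng(0)
true_joints = np.array([[0, 0, 0], [0, 0.3, 0], [0, 0.6, 0], [0, 0.9, 0]], float)
true_cams = np.array([[0, 0, 0, 0, 0, 4],
                      [0, 0.3, 0, -1, 0, 4],
                      [0, -0.3, 0, 1, 0, 4]], float)
obs = np.stack([project(true_joints, c[:3], c[3:]) for c in true_cams])
obs += rng.normal(scale=0.5, size=obs.shape)        # simulated pixel noise
bone_lengths = [np.linalg.norm(true_joints[i] - true_joints[j]) for i, j in BONES]

x0 = np.concatenate([true_cams.ravel() + rng.normal(scale=0.05, size=6 * N_CAMS),
                     true_joints.ravel() + rng.normal(scale=0.1, size=3 * N_JOINTS)])
sol = least_squares(residuals, x0, args=(obs, bone_lengths))
print("final cost:", sol.cost)
```

Optimizing cameras and joints in one residual vector, rather than in separate stages, is what lets such a system tolerate uncalibrated, moving cameras: errors in camera localization and in the skeleton estimate are traded off jointly by the solver.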
Year: 2018
DOI: 10.1145/3272127.3275022
Venue: ACM Trans. Graph.
Keywords: human pose estimation, robotics
Field: Computer vision, Control theory, Visibility, Swarm behaviour, Computer science, Terrain, Pose, Artificial intelligence, Robot, Climbing, Robotics
DocType: Journal
Volume: 37
Issue: 6
ISSN: 0730-0301
Citations: 2
PageRank: 0.37
References: 32
Authors: 5
Name                  Order  Citations  PageRank
Nägeli, T.            1      81         5.20
Samuel Oberholzer     2      2          0.37
Silvan Plüss          3      2          0.37
Javier Alonso-Mora    4      375        34.15
Otmar Hilliges        5      3075       140.20