Title
Fast multi-view people localization using a torso-high reference plane
Abstract
People locations provide rich information for a wide spectrum of applications in intelligent video surveillance systems. Besides localization accuracy, computational efficiency is another major concern: as an essential early stage, people localization must be accomplished in a very short time to enable further semantic analysis. However, most state-of-the-art people localization methods pay little attention to computational efficiency. Hence, in this paper we propose an effective and efficient multi-view people localization scheme with several acceleration mechanisms. First, a torso-high reference plane is introduced, since the torso (after foreground segmentation) is generally more intact and stable than the other parts of the human body and thus predicts potential people locations more reliably. Then, a novel and computationally efficient bitwise-operation scheme is proposed to predict people locations at the intersection regions of foreground line samples from multiple views. After rule-based validation, people locations are accurately obtained and visualized on a real-world plane. Experiments on multi-view surveillance videos not only validate the high accuracy of the proposed method in locating people in crowded scenes with serious occlusions, but also demonstrate outstanding computational speed.
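The multi-view intersection idea in the abstract can be illustrated with a minimal sketch: warp each view's binary foreground mask onto the torso-high reference plane and treat plane cells supported by several views as candidate people locations. This is a hypothetical reconstruction, not the authors' code: the per-view homographies, the plane grid size, the min_views vote threshold, and the connected-component grouping below are all assumptions, and the paper's actual foreground line sampling and bit-level operations are not reproduced here.

# Illustrative sketch only; see assumptions in the lead-in above.
import numpy as np
import cv2

def localize_people(fg_masks, homographies, plane_shape, min_views=2):
    """Project each view's binary foreground mask onto the torso-high
    plane and keep cells supported by at least `min_views` views.

    fg_masks:     list of uint8 binary masks, one per camera view
    homographies: assumed 3x3 image-to-plane homographies at torso height
    plane_shape:  (rows, cols) of the discretized reference plane
    """
    votes = np.zeros(plane_shape, dtype=np.uint8)
    for mask, H in zip(fg_masks, homographies):
        # Warp this view's foreground evidence onto the reference plane.
        warped = cv2.warpPerspective(mask, H, plane_shape[::-1],
                                     flags=cv2.INTER_NEAREST)
        votes += (warped > 0).astype(np.uint8)  # accumulate per-view support
    # Intersection regions of multiple views are candidate locations.
    candidates = (votes >= min_views).astype(np.uint8)
    # One (x, y) location hypothesis per connected blob on the plane.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(candidates)
    return centroids[1:]  # drop the background component

In practice the candidate blobs would still pass through the rule-based validation stage described in the abstract before being accepted as people locations.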
Year
2017
DOI
10.1109/VCIP.2017.8305032
Venue
2017 IEEE Visual Communications and Image Processing (VCIP)
Keywords
Surveillance video analysis, people localization, camera calibration, multiple cameras, real-time system
Field
Computer vision, Torso, Computer science, Segmentation, Real-time operating system, Camera resectioning, Acceleration, Artificial intelligence
DocType
Conference
ISBN
978-1-5386-0463-2
Citations
0
PageRank
0.34
References
7
Authors
4
Name              Order  Citations  PageRank
Chun-Chieh Hsu    1      16         5.22
Hua-Tsung Chen    2      289        28.72
Wen-Jiin Tsai     3      174        19.57
Suh-Yin Lee       4      1596       319.67