Title
An Adaptive Fusion Architecture for Target Tracking
Abstract
A vision system is demonstrated that adaptively allocates computational resources across multiple cues to robustly track a target in 3D. The system uses a particle filter to maintain multiple hypotheses of the target location. Bayesian probability theory provides the framework for sensor fusion, and resource scheduling is used to intelligently allocate the limited computational resources available across the suite of cues. The system is shown to track a person moving through a cluttered environment in 3D space.
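The abstract's core loop can be illustrated with a minimal sketch: a particle filter maintains hypotheses of the target state, and per-cue likelihoods are fused as a Bayesian product. This assumes conditionally independent cues and uses placeholder Gaussian cue models over a scalar state; the paper's actual visual cues, state representation, and resource scheduler are not reproduced here.

```python
import random
import math

def gaussian_likelihood(observed, predicted, sigma):
    """Likelihood of a cue observation given a hypothesized state (illustrative model)."""
    return math.exp(-0.5 * ((observed - predicted) / sigma) ** 2)

def particle_filter_step(particles, cue_observations, cue_sigmas, motion_noise=0.1):
    """One predict-weight-resample cycle over scalar particle states."""
    # Predict: diffuse each hypothesis with motion noise.
    predicted = [p + random.gauss(0.0, motion_noise) for p in particles]
    # Weight: fuse the cues as a product of independent likelihoods (Bayesian fusion).
    weights = []
    for p in predicted:
        w = 1.0
        for obs, sigma in zip(cue_observations, cue_sigmas):
            w *= gaussian_likelihood(obs, p, sigma)
        weights.append(w)
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw new hypotheses in proportion to their fused weight.
    return random.choices(predicted, weights=weights, k=len(particles))

# Usage: two noisy cues both report a target near position 2.0.
random.seed(0)
particles = [random.uniform(0.0, 5.0) for _ in range(500)]
for _ in range(20):
    particles = particle_filter_step(particles,
                                     cue_observations=[2.0, 2.1],
                                     cue_sigmas=[0.3, 0.5])
estimate = sum(particles) / len(particles)
```

After a few iterations the particle cloud concentrates near the cue-consistent position, which is the multiple-hypothesis behavior the abstract describes.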
Year
2002
DOI
10.1109/AFGR.2002.1004164
Venue
FGR
Keywords
Bayes methods, computer vision, image motion analysis, probability, resource allocation, scheduling, sensor fusion, tracking, 3D target tracking, Bayesian probability theory, adaptive fusion architecture, cluttered environment, computational resource allocation, computer vision system, multiple cues, particle filter, person tracking, resource scheduling
Field
Resource management, Computer vision, Machine vision, Suite, Scheduling (computing), Computer science, Particle filter, Sensor fusion, Resource allocation, Artificial intelligence, Bayesian probability
DocType
Conference
ISBN
0-7695-1602-5
Citations
33
PageRank
4.89
References
6
Authors
4
Name                 Order  Citations  PageRank
Gareth Loy           1      627        42.88
Luke Fletcher        2      340        32.95
Nicholas Apostoloff  3      143        13.69
Alexander Zelinsky   4      11441      24.18