Abstract |
---|
Unmanned Aerial Vehicles (UAVs) are playing an increasing role in gathering information about objects on the ground. In particular, a key problem is to detect and classify objects from a sequence of camera images. However, existing systems typically adopt an idealised model of sensor observations, by assuming they are independent, and take the form of maximum likelihood predictions of an object's class. In contrast, real vision systems produce output that can be highly correlated and corrupted by noise. Therefore, traditional approaches can lead to inaccurate or overconfident results, which in turn lead to poor decisions about what to observe next to improve these predictions. To address these issues, we develop a Gaussian Process-based observation model that characterises the correlation between classifier outputs as a function of UAV position. We then use this to fuse classifier observations from a sequence of images and to plan the UAV's movements. In both real and simulated target search scenarios, we show that this can achieve a decrease in mean squared detection error of up to 66% relative to existing state-of-the-art methods. |
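The core idea in the abstract can be illustrated with standard Gaussian Process regression: classifier outputs observed at nearby UAV positions are treated as correlated under a spatial covariance function, and a posterior estimate is obtained by GP conditioning. This is a minimal generic sketch, not the authors' implementation; the squared-exponential kernel and the `length_scale` and `noise` hyperparameters are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, variance=1.0):
    # Squared-exponential covariance between two sets of 2-D UAV positions:
    # nearby positions get highly correlated classifier outputs.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_fuse(X_obs, y_obs, X_query, noise=0.1, length_scale=1.0):
    # Fuse correlated classifier outputs y_obs observed at positions X_obs
    # into a posterior mean and covariance at X_query (plain GP regression).
    K = rbf_kernel(X_obs, X_obs, length_scale) + noise**2 * np.eye(len(X_obs))
    K_s = rbf_kernel(X_query, X_obs, length_scale)
    alpha = np.linalg.solve(K, y_obs)
    mean = K_s @ alpha
    cov = rbf_kernel(X_query, X_query, length_scale) - K_s @ np.linalg.solve(K, K_s.T)
    return mean, cov

# Three observations of a classifier score at different UAV positions.
X_obs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
y_obs = np.array([0.9, 0.8, 0.2])
mean, cov = gp_fuse(X_obs, y_obs, np.array([[0.0, 0.0]]))
```

Because the model accounts for correlation, repeated observations from nearby positions add less new information than independent ones would, which is what drives the planning of where to observe next.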
Year | DOI | Venue |
---|---|---|
2015 | 10.5555/2772879.2773356 | Autonomous Agents and Multi-Agent Systems |
Keywords | Field | DocType
---|---|---
Active Sensing, Target Search, Unmanned Aerial Vehicles, Gaussian Processes | Computer vision, Square (algebra), Computer science, Maximum likelihood, Vision based, Correlation, Artificial intelligence, Gaussian process, Classifier (linguistics), Fuse (electrical), Human–robot interaction | Conference
Citations | PageRank | References
---|---|---
0 | 0.34 | 17
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---
W.T. Luke Teacy | 1 | 588 | 28.88 |
Simon Justin Julier | 2 | 38 | 10.49 |
Renzo De Nardi | 3 | 189 | 13.87 |
Alex Rogers | 4 | 2500 | 183.76
Nicholas R. Jennings | 5 | 19348 | 1564.35 |