Abstract |
---|
A self-localization system for autonomous mobile robots is presented. The system estimates the robot's position in previously learned environments, using data provided solely by an omnidirectional visual perception subsystem composed of a camera and a special conical reflecting surface. This subsystem performs optical pre-processing of the environment, allowing a compact representation of the collected data. The data are then fed to a learning subsystem that associates the perceived image with an estimate of the actual robot position. Both neural networks and statistical methods have been tested and compared as learning subsystems. The system has been implemented and tested, and results are presented. |
Year | DOI | Venue |
---|---|---|
2001 | 10.1016/S0921-8890(00)00103-2 | Robotics and Autonomous Systems |
Keywords | Field | DocType
---|---|---
Self-localization, Omnidirectional sensor, Visual navigation, Mobile robots | Robot learning, Omnidirectional antenna, Robot control, Computer vision, Computer science, Simulation, Artificial intelligence, Mobile robot navigation, Artificial neural network, Robot, Visual perception, Mobile robot | Journal
Volume | Issue | ISSN
---|---|---
34 | 1 | 0921-8890
Citations | PageRank | References
---|---|---
11 | 0.79 | 9
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
A. Rizzi | 1 | 605 | 64.43 |
Riccardo Cassinis | 2 | 21 | 4.20 |