Abstract
---
In this paper, we propose a novel method for a mobile robot to recognize its environment based on the relationship between human actions and objects. Most previous work on environment recognition for robots has focused on generating obstacle maps for path planning. Model-based object recognition techniques have also been used to search for particular objects. In practice, however, it is difficult to prepare many models in advance for recognizing various objects in unknown environments. Humans, on the other hand, can often recognize objects not from their appearance but by watching another person act on them, because the function and/or usage of an object is closely related to human actions. In our previous work, we introduced conceptual models of human actions and objects for classifying objects by observing human activities. In this paper, we apply this key idea to a mobile robot. We also demonstrate that the arrangement of objects can be recognized by analyzing human actions.
Year | DOI | Venue
---|---|---
2006 | 10.1109/ICPR.2006.496 | ICPR (4)

Keywords | Field | DocType
---|---|---
model-based object recognition technique, human action, environment recognition, conceptual model, unknown environment, mobile robot, human activity, human actions, key idea, previous work, classifying object, object recognition, conceptual models, path planning, mobile robots, image classification | Motion planning, Obstacle, Computer vision, Robot vision, Conceptual model, Computer science, Artificial intelligence, Contextual image classification, Robot, Mobile robot, Cognitive neuroscience of visual object recognition | Conference

ISSN | ISBN | Citations
---|---|---
1051-4651 | 0-7695-2521-0 | 2

PageRank | References | Authors
---|---|---
0.42 | 9 | 4
Name | Order | Citations | PageRank
---|---|---|---
Masakatsu Mitani | 1 | 2 | 0.76 |
Mamoru Takaya | 2 | 2 | 0.76 |
Atsuhiro Kojima | 3 | 178 | 16.61 |
Kunio Fukunaga | 4 | 206 | 28.55 |