Abstract |
---|
Programming by Demonstration (PbD) lets users with little technical background program a wide variety of manipulation tasks for robots, but it must be intuitive for users while requiring as little of their time as possible. In this paper, we present a Programming by Demonstration system that synthesizes manipulation programs from a single observed demonstration, allowing users to program new tasks for a robot simply by performing the task once themselves. A human-in-the-loop interface helps users make corrections to the perceptual state as needed. We introduce Object Interaction Programs as a representation of multi-object, bimanual manipulation tasks and present algorithms for extracting programs from observed demonstrations and transferring programs to a robot to perform the task in a new scene. We demonstrate the expressivity and generalizability of our approach through an evaluation on a benchmark of complex tasks. |
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/IROS40897.2019.8968543 | 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) |
Field | DocType | ISSN |
---|---|---|
Generalizability theory, Programming by demonstration, Computer vision, Computer science, Human–computer interaction, Artificial intelligence, Robot, Perception, Expressivity | Conference | 2153-0858 |
Citations | PageRank | References |
---|---|---|
0 | 0.34 | 0 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Justin Huang | 1 | 0 | 0.34 |
Dieter Fox | 2 | 12306 | 1289.74 |
Maya Cakmak | 3 | 882 | 58.40 |