Abstract |
---|
Collaborators need to share a broad range of information in collaborative work. In physical spatial collaboration such as team dancing, theatrical performance, and team sports, one's behavior must be in accord with the partner's; the relative position, in both distance and direction, is critically important at all times, even while moving. To realize remote physical spatial collaboration, achieving a shared recognition of the positional relationship is a key issue. In this paper, we propose a setup between two remote sites in which each collaborator is immersed in the partner's site through a telepresence robot situated there. The telepresence robot is a wheeled platform with a camera mounted at human eye height. It moves in response to the remote user's position and orientation, and lets the user view first-person video through a head-mounted display. With a telepresence robot at each of the two sites, both collaborators can see the partner in front of them and can grasp the partner's distance and orientation in the remote site. In the proposed setup, users can grasp the positional relationship as accurately and as quickly as with existing fixed views, including the bird's-eye view. |
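The abstract's core mechanism is a wheeled robot that moves in response to the remote user's tracked position and orientation. The paper's abstract does not give the control law, so the following is only a minimal sketch of one plausible approach: a proportional pose-following controller for a differential-drive base, where the gains `k_lin` and `k_ang` and the function name are assumptions, not the authors' design.

```python
import math

def drive_command(user_x, user_y, user_yaw,
                  robot_x, robot_y, robot_yaw,
                  k_lin=0.8, k_ang=1.5):
    """Hypothetical controller: return (linear, angular) velocity so the
    robot mirrors the remote user's tracked pose. Gains are illustrative."""
    dx, dy = user_x - robot_x, user_y - robot_y
    distance = math.hypot(dx, dy)
    if distance > 0.05:
        # Far from the mapped position: turn toward it and drive forward.
        heading = math.atan2(dy, dx)
        ang_err = (heading - robot_yaw + math.pi) % (2 * math.pi) - math.pi
        return k_lin * distance, k_ang * ang_err
    # At the position: rotate in place to match the user's orientation.
    ang_err = (user_yaw - robot_yaw + math.pi) % (2 * math.pi) - math.pi
    return 0.0, k_ang * ang_err
```

With robot and user poses identical the command is zero, and a user standing one meter ahead of the robot's mapped pose yields a pure forward command, which matches the behavior the abstract describes (the robot follows the user's position, then orientation).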
Year | DOI | Venue |
---|---|---|
2018 | 10.1109/CSCWD.2018.8465391 | 2018 IEEE 22nd International Conference on Computer Supported Cooperative Work in Design (CSCWD) |
Keywords | Field | DocType |
---|---|---|
remote collaboration, spatial collaboration, physical collaboration, telepresence robot | Situated, GRASP, Task analysis, Computer science, Robot kinematics, Human–computer interaction, Robot, Telerobotics, Distributed computing | Conference |
ISBN | Citations | PageRank |
---|---|---|
978-1-5386-1483-9 | 0 | 0.34 |
References | Authors |
---|---|
4 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Naoki Katayama | 1 | 1 | 1.40 |
Tomoo Inoue | 2 | 1 | 7.79 |
Hiroshi Shigeno | 3 | 163 | 50.88 |