Abstract
Daily human activities, e.g., locomotion, exercise, and resting, are heavily guided by the tactile interactions between the human and the ground. In this work, leveraging such tactile interactions, we propose a 3D human pose estimation approach that uses the pressure maps recorded by a tactile carpet as input. We build a low-cost, high-density, large-scale intelligent carpet, which enables seamless, real-time recording of human-floor tactile interactions. We collect a synchronized tactile and visual dataset covering a variety of human activities. Using a state-of-the-art camera-based pose estimation model as supervision, we design and implement a deep neural network model that infers 3D human poses from the tactile information alone. Our pipeline can be further scaled up to multi-person pose estimation. We evaluate our system and demonstrate its potential applications in diverse fields.
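The abstract describes a learned mapping from carpet pressure maps to 3D joint positions. Below is a minimal illustrative sketch of that idea, not the authors' model: the sensor grid size (96x96), keypoint count (21), and the tiny MLP architecture are all assumptions chosen only to show the input/output structure of such a regressor.

```python
import numpy as np

# Assumed dimensions for illustration (not from the paper's implementation):
H, W = 96, 96        # tactile sensor grid resolution
N_KEYPOINTS = 21     # number of predicted 3D joints

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.01, size=(H * W, 128))           # hidden-layer weights
W2 = rng.normal(0.0, 0.01, size=(128, N_KEYPOINTS * 3))  # linear readout weights

def predict_pose(pressure_map: np.ndarray) -> np.ndarray:
    """Map one (H, W) pressure frame to (N_KEYPOINTS, 3) joint positions."""
    x = pressure_map.reshape(-1)       # flatten the tactile frame
    h = np.maximum(W1.T @ x, 0.0)      # ReLU hidden layer
    y = W2.T @ h                       # linear regression head
    return y.reshape(N_KEYPOINTS, 3)

frame = rng.random((H, W))             # a dummy pressure frame
pose = predict_pose(frame)
print(pose.shape)                      # (21, 3)
```

In practice, a model of this kind would be trained against poses produced by the camera-based teacher network over synchronized tactile/visual recordings; the sketch above only fixes the tensor shapes involved.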
Year | DOI | Venue
---|---|---
2021 | 10.1109/CVPR46437.2021.01110 | 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021)

DocType | ISSN | Citations
---|---|---
Conference | 1063-6919 | 1

PageRank | References | Authors
---|---|---
0.35 | 27 | 8

Name | Order | Citations | PageRank |
---|---|---|---|
Yiyue Luo | 1 | 1 | 1.70 |
Yunzhu Li | 2 | 60 | 7.93 |
Michael Foshey | 3 | 8 | 3.20 |
Wan Shou | 4 | 1 | 0.69 |
Pratyusha Sharma | 5 | 1 | 0.35 |
Tomas Palacios | 6 | 1 | 2.71 |
Antonio Torralba | 7 | 14607 | 956.27 |
Wojciech Matusik | 8 | 4771 | 254.42 |