Abstract
---
Over the last decade, a body of research has investigated enriching touch actions by using finger orientation as an additional input. Beyond new interaction techniques, we envision new user interface elements that make use of this additional input information. We define the finger's orientation by its pitch, roll, and yaw on the touch surface. Determining finger orientation is not possible with current state-of-the-art devices. As a first step, we built a system that can determine finger orientation: a working prototype with a depth camera mounted on a tablet. We conducted a study with 12 participants to record ground-truth data for the index, middle, ring, and little fingers, evaluating the accuracy of our prototype with the PointPose [3] algorithm to estimate the pitch and yaw of the finger. By applying 2D linear correction models, we further show a reduction of RMSE by 45.4% for pitch and 21.83% for yaw.
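The 2D linear correction the abstract refers to can be sketched as an ordinary least-squares fit that maps the estimated pitch and yaw onto the ground-truth angles. The data below is synthetic, and the exact model form (an intercept plus both estimated angles as predictors) is an assumption for illustration, not necessarily the paper's formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical ground-truth finger angles in degrees; the study's real
# data came from 12 participants, not from this synthetic generator.
pitch_true = rng.uniform(15, 75, n)
yaw_true = rng.uniform(-45, 45, n)

# Simulate a biased estimator (as a depth-camera pipeline might be):
# scale error, cross-axis coupling, constant offset, and noise.
pitch_est = 0.8 * pitch_true + 0.10 * yaw_true + 5 + rng.normal(0, 2, n)
yaw_est = 0.9 * yaw_true - 0.05 * pitch_true - 3 + rng.normal(0, 2, n)

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

# 2D linear correction: predict each true angle from both estimated
# angles plus an intercept, fitted by least squares.
X = np.column_stack([np.ones(n), pitch_est, yaw_est])
coef_pitch, *_ = np.linalg.lstsq(X, pitch_true, rcond=None)
coef_yaw, *_ = np.linalg.lstsq(X, yaw_true, rcond=None)

pitch_corr = X @ coef_pitch
yaw_corr = X @ coef_yaw

print("pitch RMSE:", rmse(pitch_est, pitch_true), "->", rmse(pitch_corr, pitch_true))
print("yaw RMSE:  ", rmse(yaw_est, yaw_true), "->", rmse(yaw_corr, yaw_true))
```

On this synthetic data the correction removes the systematic offset and cross-axis coupling, so the corrected RMSE drops toward the noise floor; the paper's 45.4% and 21.83% reductions come from its real study data.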
Year | DOI | Venue
---|---|---
2017 | 10.1145/3098279.3122125 | Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '17)
Keywords | Field | DocType
---|---|---
Finger orientation, yaw, pitch, modeling, depth camera, touch, mobile devices | Little finger, Computer vision, Simulation, Computer science, Finger tracking, Mean squared error, Ground truth, Mobile device, Artificial intelligence, User interface | Conference
Citations | PageRank | References
---|---|---
2 | 0.36 | 4
Authors (3)
---
Name | Order | Citations | PageRank
---|---|---|---
Sven Mayer | 1 | 188 | 27.30 |
Michael Mayer | 2 | 2 | 1.71 |
Niels Henze | 3 | 1262 | 108.47 |