Abstract |
---|
In this study, a real-time, computer-vision-based sign language recognition system was developed to aid hearing-impaired users in a hospital setting. The system guides the user through a tree of questions, allowing them to state the purpose of their visit by answering four to six questions. The deaf user communicates with the system in sign language, and the system provides a written transcript of the exchange. A database collected from six users was used for the experiments. User-independent tests without the tree-based interaction scheme yield 96.67% accuracy on 1257 sign samples belonging to 33 sign classes. The experiments evaluated the effectiveness of the system in terms of feature selection and spatio-temporal modelling. The combination of hand position and movement features, modelled by Temporal Templates and classified by Random Decision Forests, yielded the best results. The tree-based interaction scheme further increased the recognition performance to more than 97.88%. |
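The abstract names Random Decision Forests as the classifier applied to Temporal Template representations of hand position and movement. As a minimal, hypothetical sketch of that classification step (the paper's actual feature extraction, dimensions, and forest parameters are not given here; the synthetic data and all parameter values below are assumptions for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in data: each sample is a flattened Temporal Template
# (e.g. a 16x16 motion-history-style image) derived from hand position and
# movement features; labels cover a handful of sign classes.
n_samples, template_size, n_classes = 300, 16 * 16, 5
X = rng.normal(size=(n_samples, template_size))
y = rng.integers(0, n_classes, size=n_samples)
# Inject class-dependent structure so the forest has something to learn.
X[np.arange(n_samples), y] += 3.0

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Random Decision Forest classifier, as named in the abstract; the number
# of trees is an assumed default, not taken from the paper.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

The reported user-independent accuracy (96.67% over 33 classes) comes from the paper's own real-sign database, not from a toy setup like this one.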
Year | DOI | Venue |
---|---|---|
2016 | 10.1007/978-3-319-46843-3_6 | Human Behavior Understanding |
Keywords | DocType | Volume |
---|---|---|
Sign language recognition, Assistive computer vision, Human computer interaction | Conference | 9997 |
ISSN | Citations | PageRank |
---|---|---|
0302-9743 | 0 | 0.34 |
References | Authors |
---|---|
14 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Necati Cihan Camgöz | 1 | 39 | 9.23 |
Ahmet Alp Kındıroğlu | 2 | 25 | 3.48 |
Lale Akarun | 3 | 1201 | 70.68 |