Abstract |
---|
This paper presents the design and implementation of a wearable oral sensory system that recognizes human oral activities, such as chewing, drinking, speaking, and coughing. We conducted an evaluation of this oral sensory system in a laboratory experiment involving 8 participants. The results show 93.8% oral activity recognition accuracy when using a person-dependent classifier and 59.8% accuracy when using a person-independent classifier. |
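The gap the abstract reports between person-dependent (93.8%) and person-independent (59.8%) accuracy reflects two different evaluation protocols: training and testing on the same person's data versus leave-one-person-out cross-validation. The sketch below illustrates that distinction only; the synthetic 2-D features, the per-person bias, and the nearest-centroid classifier are all illustrative assumptions, not the paper's actual sensing pipeline or model.

```python
# Illustrative sketch (assumed, not the paper's method): person-dependent vs
# person-independent evaluation with a toy nearest-centroid classifier.
import random
from collections import defaultdict

random.seed(0)
ACTIVITIES = ["chewing", "drinking", "speaking", "coughing"]

def make_samples(person, n=20):
    # Synthetic 2-D features: each activity occupies its own region,
    # shifted per person to mimic inter-person variation.
    shift = person * 0.3
    data = []
    for label, base in zip(ACTIVITIES, [0.0, 2.0, 4.0, 6.0]):
        for _ in range(n):
            x = base + shift + random.gauss(0, 0.3)
            y = base + shift + random.gauss(0, 0.3)
            data.append(((x, y), label))
    return data

def centroids(train):
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (x, y), label in train:
        s = sums[label]
        s[0] += x; s[1] += y; s[2] += 1
    return {l: (s[0] / s[2], s[1] / s[2]) for l, s in sums.items()}

def accuracy(train, test):
    cents = centroids(train)
    correct = 0
    for (x, y), label in test:
        pred = min(cents,
                   key=lambda l: (x - cents[l][0]) ** 2 + (y - cents[l][1]) ** 2)
        correct += pred == label
    return correct / len(test)

people = {p: make_samples(p) for p in range(8)}  # 8 participants, as in the study

# Person-dependent: train and test on splits of the same person's data.
dep = [accuracy(people[p][::2], people[p][1::2]) for p in people]

# Person-independent: leave-one-person-out cross-validation.
indep = []
for p in people:
    train = [s for q in people if q != p for s in people[q]]
    indep.append(accuracy(train, people[p]))

print(f"person-dependent mean accuracy:   {sum(dep) / len(dep):.2f}")
print(f"person-independent mean accuracy: {sum(indep) / len(indep):.2f}")
```

Because each held-out person's feature bias is absent from the training data, the leave-one-person-out score drops below the within-person score, mirroring the direction (though not the exact numbers) of the reported results.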
Year | DOI | Venue |
---|---|---|
2013 | 10.1145/2493988.2494352 | ISWC |
Keywords | Field | DocType |
---|---|---|
laboratory experiment,wearable oral sensory system,person-independent classifier,oral activity recognition accuracy,human oral activity,person-dependent classifier,oral sensory system,sensor-embedded tooth,activity recognition | Computer vision,Activity recognition,Embedded teeth,Computer science,Wearable computer,Laboratory experiment,Speech recognition,Artificial intelligence,Sensory system,Classifier (linguistics) | Conference |
Citations | PageRank | References |
---|---|---|
10 | 0.93 | 4 |
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Cheng-Yuan Li | 1 | 45 | 7.59 |
Yen-Chang Chen | 2 | 133 | 12.35 |
Wei-Ju Chen | 3 | 44 | 3.46 |
Polly Huang | 4 | 886 | 82.35 |
Hao-Hua Chu | 5 | 1168 | 98.54 |