Abstract
---
This paper presents SilentKey, an authentication framework that identifies mobile device users through ultrasonic-based lip reading. The core idea is to emit ultrasonic signals from a mobile phone and analyze the fine-grained impact of mouth motions on the reflected signal. The framework is effective because people exhibit unique characteristics when performing mouth motions, which capture not only what they input but also how they input it. SilentKey is robust against attacks since the input can be neither recorded nor imitated. We implement a prototype and demonstrate the effectiveness of the system in a study with fifty volunteers. Such a non-intrusive identification mechanism provides a natural user interface that can also be used by people with speech or vision impairments.
Year | DOI | Venue
---|---|---
2018 | 10.1145/3191768 | Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

Keywords | DocType | Volume
---|---|---
Implicit Authentication, Mobile Computing, Ultrasonic Sensing | Journal | 2

Issue | ISSN | Citations
---|---|---
1 | 2474-9567 | 3

PageRank | References | Authors
---|---|---
0.36 | 0 | 4
Name | Order | Citations | PageRank
---|---|---|---
Jiayao Tan | 1 | 3 | 0.36 |
Xiaoliang Wang | 2 | 10 | 4.25 |
Cam-Tu Nguyen | 3 | 139 | 12.40 |
Yu Shi | 4 | 16 | 4.42 |