Abstract |
---|
Mobile devices have significantly reshaped many aspects of our lives. Yet prolonged contact with mobile devices may cause eye and/or muscle fatigue, especially for young children. In this paper, we integrate the web cameras available as image sensors on most tablets and smartphones with a tracking algorithm that continuously monitors and analyzes learners' responses through their facial orientations and eye movements, in order to build the PErsonalized Teaching And Learning (PETAL) platform for nurturing the academic development of our young learners while protecting their eyesight. Through in-depth studies of various Android programming toolkits together with the Open Source Computer Vision (OpenCV) library, we explore possible ways to detect viewers' responses to educational videos as a means of self-learning. With the capability of notifying learners of their, possibly unconscious, reactions to such educational videos, our platform aims to promote a truly personalized approach to developing next-generation e-learning systems. |
Year | DOI | Venue |
---|---|---|
2014 | 10.1007/978-3-662-44188-6_17 | Lecture Notes in Educational Technology |
Keywords | Field | DocType |
Eye Tracking Algorithms, Facial Recognition Techniques, Mobile Devices, Personalized Learning, Smart Sensors | Android (operating system), E-learning, Mobile device, Eye movement, Personalized learning, Engineering, Multimedia | Conference
ISSN | Citations | PageRank |
2196-4963 | 0 | 0.34 |
References | Authors |
---|---|
5 | 5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Kelly Liu | 1 | 0 | 0.34 |
Victoria Tam | 2 | 4 | 0.78 |
Phoebe Tse | 3 | 0 | 0.34 |
Edmund Y. Lam | 4 | 683 | 69.87
Vincent Tam | 5 | 195 | 32.75 |