| Abstract |
|---|
| This paper proposes using mobile touchscreen devices to assist students with vision loss in working with data. It presents an integrated approach that combines current sonification methods with interactive, multi-touch, gesture-based exploration of data, designed to aid students in mentally visualizing and comprehending data and function plots. The approach aims to help students with vision loss study independently of support centres, collaborate with their peers, and participate in group studies. An initial user study evaluating the approach and demonstrating its feasibility is presented, and further research pathways are discussed. |
| Year | DOI | Venue |
|---|---|---|
| 2014 | 10.1016/j.procs.2014.07.038 | Procedia Computer Science |

| Keywords | Field | DocType |
|---|---|---|
| Sonification, auditory displays, multi-touch interfaces, assistive technologies, mobile applications | Computer science, Visualization, Gesture, Touchscreen, Human–computer interaction, Sonification, Multimedia, Comprehension | Conference |

| Volume | ISSN | Citations |
|---|---|---|
| 34 | 1877-0509 | 0 |

| PageRank | References | Authors |
|---|---|---|
| 0.34 | 3 | 2 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Denis Nikitenko | 1 | 0 | 0.34 |
| Daniel Gillis | 2 | 0 | 1.35 |