Abstract
---|
Personal computing devices have evolved steadily, from desktops to mobile devices, and now to emerging trends in wearable computing. Wearables are expected to become integral to consumer electronics, with the primary mode of interaction often being a near-eye display. However, current-generation near-eye displays cannot provide fully natural focus cues for all users, which often leads to discomfort. This limitation stems from the optics of the displays themselves, which cannot change focus as natural vision requires. Furthermore, the form factor often makes it difficult for users to wear corrective eyewear. With two prototype near-eye displays, we address these issues using display modes that adapt to the user via computational optics. These prototypes combine focus-tunable lenses, mechanically actuated displays, and gaze tracking to correct common refractive errors on a per-user basis, and to provide natural focus cues by dynamically updating scene depth based on where the user looks. Recent advances in computational optics hint at a future in which some users experience better vision in the virtual world than in the real one.
Year | DOI | Venue |
---|---|---|
2017 | 10.1145/3084363.3085029 | SIGGRAPH Talks |
Keywords | Field | DocType
---|---|---|
virtual and augmented reality, vision correction, computational optics | Eyewear, Computer vision, Computer graphics (images), Gaze, Wearable computer, Computer science, Mobile device, Electronics, Lens (optics), Artificial intelligence, Multimedia | Conference
ISBN | Citations | PageRank
---|---|---|
978-1-4503-5008-2 | 0 | 0.34
References | Authors
---|---|
4 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Nitish Padmanaban | 1 | 33 | 3.56 |
Robert Konrad | 2 | 42 | 4.88 |
Emily A. Cooper | 3 | 95 | 8.07 |
Gordon Wetzstein | 4 | 945 | 72.47 |