Abstract |
---|
We examined the minimum latency required to locate and identify a visual target (visual search) in a two-alternative forced-choice paradigm in which the visual target could appear at any azimuth (0 degrees to 360 degrees) and across a broad range of elevations (from 90 degrees above to 70 degrees below the horizon) relative to a person's initial line of gaze. Seven people were tested in six conditions: unaided search, three aurally aided search conditions, and two visually aided search conditions. Aurally aided search with both actual and virtual sound localization cues proved superior to unaided and visually guided search. Applications of synthesized three-dimensional and two-dimensional sound cues in workstations are discussed. |
Year | DOI | Venue |
---|---|---|
1996 | 10.1518/001872096778827260 | HUMAN FACTORS |
Keywords | Field | DocType |
---|---|---|
signals, visual search, free field, virtual reality, workstations | Free field, Visual search, Computer vision, Virtual reality, Gaze, Active listening, Speech recognition, Sound localization, Artificial intelligence, Engineering | Journal |
Volume | Issue | ISSN |
---|---|---|
38 | 4 | 0018-7208 |
Citations | PageRank | References |
---|---|---|
17 | 4.47 | 0 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
David R. Perrott | 1 | 17 | 4.47 |
John Cisneros | 2 | 17 | 4.47 |
Richard L. McKinley | 3 | 30 | 6.79 |
William R. D'Angelo | 4 | 36 | 7.49 |