Title
Implementation and Evaluation of a 50 kHz, <inline-formula><tex-math notation="LaTeX">$28\mu\mathrm{s}$</tex-math><alternatives><graphic orientation="portrait" position="float" xlink:href="25tvcg05-blate-2899233-eqinline-1-small.tif" xmlns:xlink="http://www.w3.org/1999/xlink"/></alternatives></inline-formula> Motion-to-Pose Latency Head Tracking Instrument
Abstract
This paper presents the implementation and evaluation of a 50,000-pose-sample-per-second, 6-degree-of-freedom optical head tracking instrument with motion-to-pose latency of <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"><tex-math notation="LaTeX">$28\mu\mathrm{s}$</tex-math><alternatives><graphic orientation="portrait" position="float" xlink:href="25tvcg05-blate-2899233-eqinline-2-small.tif"/></alternatives></inline-formula> and dynamic precision of 1–2 arcminutes. The instrument uses high-intensity infrared emitters and two duo-lateral photodiode-based optical sensors to triangulate pose. This instrument serves two purposes: it is the first step towards the requisite head tracking component in sub-<inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"><tex-math notation="LaTeX">$100\mu\mathrm{s}$</tex-math><alternatives><graphic orientation="portrait" position="float" xlink:href="25tvcg05-blate-2899233-eqinline-3-small.tif"/></alternatives></inline-formula> motion-to-photon latency optical see-through augmented reality (OST AR) head-mounted display (HMD) systems; and it enables new avenues of research into human visual perception – including measuring the thresholds for perceptible real-virtual displacement during head rotation and other human research requiring high-sample-rate motion tracking. The instrument's tracking volume is limited to about <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"><tex-math notation="LaTeX">$120\times 120\times 250$</tex-math><alternatives><graphic orientation="portrait" position="float" xlink:href="25tvcg05-blate-2899233-eqinline-4-small.tif"/></alternatives></inline-formula> mm but allows for the full range of natural head rotation and is sufficient for research involving seated users.
We discuss how the instrument's tracking volume is scalable in multiple ways and some of the trade-offs involved therein. Finally, we introduce a novel laser-pointer-based measurement technique for assessing the instrument's tracking latency and repeatability. We show that the instrument's motion-to-pose latency is <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"><tex-math notation="LaTeX">$28\mu\mathrm{s}$</tex-math><alternatives><graphic orientation="portrait" position="float" xlink:href="25tvcg05-blate-2899233-eqinline-5-small.tif"/></alternatives></inline-formula> and that it is repeatable within 1–2 arcminutes at mean rotational velocities (yaw) in excess of 500°/sec.
Year
2019
DOI
10.1109/TVCG.2019.2899233
Venue
IEEE Transactions on Visualization and Computer Graphics
Keywords
Instruments, Target tracking, Optical sensors, Photodiodes, Adaptive optics
DocType
Journal
Volume
25
Issue
5
ISSN
1077-2626
Citations
0
PageRank
0.34
References
9
Authors
7
Name | Order | Citations | PageRank
Alex Blate | 1 | 0 | 0.34
Mary Whitton | 2 | 111 | 9.12
Montek Singh | 3 | 550 | 41.67
G. Welch | 4 | 490 | 30.22
A. State | 5 | 8 | 1.73
Turner Whitted | 6 | 1942 | 12.06
Henry Fuchs | 7 | 41661 | 248.53