Title
Beyond blur: real-time ventral metamers for foveated rendering
Abstract
To peripheral vision, a pair of physically different images can look the same. Such pairs are metamers relative to each other, just as physically different spectra of light are perceived as the same color. We propose a real-time method to compute such ventral metamers for foveated rendering, where, in particular for near-eye displays, the largest part of the framebuffer maps to the periphery. This improves quality over state-of-the-art foveation methods, which blur the periphery. Work in vision science has established that peripheral stimuli are ventral metamers if their statistics are similar. Existing methods, however, require a costly optimization process to find such metamers. We therefore propose a novel type of statistics particularly well-suited for practical real-time rendering: smooth moments of steerable filter responses. These can be extracted from images in time constant in the number of pixels, in parallel over all pixels, using a GPU. Further, we show that they can be compressed effectively and transmitted at low bandwidth. Finally, computing realizations of those statistics can again be performed in constant time and in parallel. This enables a new level of quality for foveated applications such as remote rendering, level-of-detail and Monte Carlo denoising. In a user study, we finally show that human task performance increases and foveation artifacts are less suspicious when using our method compared to common blurring.
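As an illustration of the kind of statistics the abstract describes, the sketch below computes per-pixel "smooth moments" of oriented filter responses. It is not the paper's implementation: a derivative-of-Gaussian filter stands in for a steerable-pyramid band, Gaussian pooling stands in for the smoothing, and all function names and parameters (`oriented_filter_response`, `smooth_moments`, `pool_sigma`) are illustrative assumptions. Because each moment map is a fixed number of separable blurs, the per-pixel cost is independent of image size, mirroring the constant-time, parallel extraction the abstract claims.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def oriented_filter_response(img, theta, sigma=2.0):
    # Derivative-of-Gaussian response at orientation theta:
    # a simple stand-in for one band of a steerable pyramid.
    gy, gx = np.gradient(gaussian_filter(img, sigma))
    return np.cos(theta) * gx + np.sin(theta) * gy

def smooth_moments(img, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4),
                   pool_sigma=8.0):
    # Per-pixel local mean and variance ("smooth moments") of each
    # oriented response, pooled with a Gaussian window. Each moment
    # is a constant number of separable filter passes per pixel.
    feats = []
    for t in thetas:
        r = oriented_filter_response(img, t)
        m1 = gaussian_filter(r, pool_sigma)               # first moment
        m2 = gaussian_filter(r * r, pool_sigma) - m1**2   # central second moment
        feats.append((m1, m2))
    return feats
```

In a real renderer these passes would run as GPU shader or compute kernels over all pixels at once; the NumPy/SciPy version here only shows the structure of the computation.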
Year
2021
DOI
10.1145/3450626.3459943
Venue
ACM Transactions on Graphics
Keywords
Foveated Rendering, Head-Mounted Displays, Texture Synthesis, Human Visual Perception
DocType
Conference
Volume
40
Issue
4
ISSN
0730-0301
Citations
1
PageRank
0.35
References
0
Authors
7
Name | Order | Citations | PageRank
David R. Walton | 1 | 4 | 2.11
Rafael Kuffner dos Anjos | 2 | 12 | 8.79
Sebastian Friston | 3 | 28 | 2.74
David Swapp | 4 | 156 | 14.24
Kaan Aksit | 5 | 60 | 6.34
Anthony Steed | 6 | 3502 | 353.97
Tobias Ritschel | 7 | 1052 | 66.60