The Shortest Route Is Not Always the Fastest: Probability-Modeled Stereoscopic Eye Movement Completion Time in VR
Budmonde Duinkharjav, Benjamin Liang, Anjul Patney, Rachel Brown, Qi Sun
ACM Transactions on Graphics (SIGGRAPH Asia 2023)
PDF Video Code
Abstract
Battery life is an increasingly urgent challenge for today’s untethered VR and AR devices. However, the power efficiency of head-mounted displays is naturally at odds with growing computational requirements driven by higher resolution, refresh rate, and dynamic range, all of which reduce sustained usage time. For instance, the Oculus Quest 2 can sustain only 2 to 3 hours of operation on a full charge. Prior display power reduction techniques mostly target smartphone displays; directly applying them to AR/VR, however, degrades visual perception with noticeable artifacts. For instance, the “power-saving mode” on smartphones uniformly lowers pixel luminance across the display and, if directly applied to VR content, presents an overall darkened visual experience to users.
Our key insight is that VR display power reduction must be cognizant of the gaze-contingent nature of wide field-of-view VR displays. To that end, we present a gaze-contingent system that, without degrading luminance, minimizes display power consumption while preserving high visual fidelity as users actively view immersive video sequences. This is enabled by constructing 1) a gaze-contingent color discrimination model through psychophysical studies, and 2) a display power model (with respect to pixel color) through real-device measurements. Critically, owing to careful design decisions in constructing the two models, our algorithm is cast as a constrained optimization problem with a closed-form solution, which can be implemented as a real-time, image-space shader. We evaluate our system using a series of psychophysical studies and large-scale analyses on natural images. Experimental results show that our system reduces display power by as much as 24% (14% on average) with little to no degradation in perceptual fidelity.
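
The paper’s actual discrimination and power models are not reproduced on this page. As a rough, illustrative sketch only: if one assumes a simplified linear per-pixel power model p(c) = w · c and a fixed spherical discrimination radius r around each original color (both simplifying assumptions, not the paper’s models), the constrained minimizer has the closed form c' = c − r · w / ‖w‖, which the Python snippet below applies per pixel. The function name, weights, and radius are hypothetical.

import numpy as np

def power_saving_shift(img, w, r):
    """Illustrative per-pixel color shift; NOT the paper's actual models.

    Assumes a linear display power model p(c) = w . c and a fixed,
    spherical discrimination radius r around each original color
    (both simplifying assumptions). Under these assumptions, the
    power-minimizing color inside the tolerance ball has the closed
    form c' = c - r * w / ||w||, applied independently per pixel.

    img : (H, W, 3) RGB values in [0, 1]
    w   : (3,) per-channel power weights (hypothetical values)
    r   : scalar perceptual tolerance radius (hypothetical value)
    """
    w = np.asarray(w, dtype=np.float64)
    direction = w / np.linalg.norm(w)   # steepest power-descent direction
    shifted = img - r * direction       # closed-form per-pixel optimum
    return np.clip(shifted, 0.0, 1.0)   # keep colors displayable

# Example with made-up OLED-like weights where blue costs the most power.
frame = np.random.rand(4, 4, 3)
out = power_saving_shift(frame, w=[0.2, 0.4, 0.8], r=0.05)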
BibTeX
@article{duinkharjav2023shortest,
title={The Shortest Route Is Not Always the Fastest: Probability-Modeled Stereoscopic Eye Movement Completion Time in VR},
author={Duinkharjav, Budmonde and Liang, Benjamin and Patney, Anjul and Brown, Rachel and Sun, Qi},
journal={ACM Transactions on Graphics (TOG)},
volume={42},
number={6},
pages={1--14},
year={2023},
publisher={ACM New York, NY, USA}
}