Image Features Influence Reaction Time: A Learned Probabilistic Perceptual Model for Saccade Latency
Budmonde Duinkharjav, Praneeth Chakravarthula, Rachel Brown, Anjul Patney, Qi Sun
★ Best Paper Award ★
ACM Transactions on Graphics (SIGGRAPH 2022)
PDF | Video | Supp Materials | NVIDIA Page | Code
Abstract
We aim to ask and answer an essential question: how quickly do we react after observing a displayed visual target? To this end, we present psychophysical studies that characterize the remarkable disconnect between human saccadic behaviors and spatial visual acuity. Building on the results of these studies, we develop a perceptual model that predicts temporal gaze behavior, particularly saccadic latency, as a function of the statistics of a displayed image. Specifically, we implement a neurologically-inspired probabilistic model that mimics the accumulation of confidence leading to a perceptual decision. We validate our model with a series of objective measurements and user studies on an eye-tracked VR display. The results demonstrate that our model's predictions are in statistical alignment with real-world human behavior. Further, we establish that many sub-threshold image modifications commonly introduced in graphics pipelines may significantly alter human reaction timing, even when the differences are visually undetectable. Finally, we show that our model can serve as a metric to predict and alter users' reaction latency in interactive computer graphics applications, and may thus improve gaze-contingent rendering, the design of virtual experiences, and player performance in e-sports. We illustrate this with two examples: estimating competition fairness in a video game with two different team colors, and tuning display viewing distance to minimize player reaction time.
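The model's central mechanism is evidence accumulation: noisy perceptual evidence about the target builds up over time, and a saccade is triggered once accumulated confidence crosses a decision threshold, so image features that weaken the evidence signal stretch the predicted latency distribution. The Python sketch below illustrates this accumulation-to-threshold idea with a simple drift-diffusion simulation; the function, parameter names, and numeric values are illustrative assumptions for exposition, not the paper's learned model.

import numpy as np

def simulate_saccade_latency(drift_rate, threshold=1.0, noise_sd=0.1,
                             dt=0.001, t_nondecision=0.05, n_trials=10000,
                             rng=None):
    """Monte Carlo drift-diffusion simulation: evidence accumulates at
    `drift_rate` per second, perturbed by Gaussian noise, and a saccade is
    triggered once it crosses `threshold`; crossing time plus a fixed
    non-decision time gives the latency of one trial (in seconds)."""
    rng = rng or np.random.default_rng(0)
    evidence = np.zeros(n_trials)
    latency = np.full(n_trials, np.nan)  # NaN = no saccade within the cap
    active = np.ones(n_trials, dtype=bool)
    t = 0.0
    while active.any() and t < 2.0:  # cap each trial at 2 s
        t += dt
        noise = noise_sd * np.sqrt(dt) * rng.standard_normal(active.sum())
        evidence[active] += drift_rate * dt + noise
        crossed = active.copy()
        crossed[active] = evidence[active] >= threshold
        latency[crossed] = t + t_nondecision
        active &= ~crossed
    return latency

# Hypothetical mapping: higher-contrast targets yield a stronger evidence
# signal (larger drift rate), hence faster and less variable latencies.
for label, mu in [("low-contrast", 3.0), ("high-contrast", 6.0)]:
    lat = simulate_saccade_latency(drift_rate=mu)
    print(f"{label} target: median latency {np.nanmedian(lat) * 1000:.0f} ms")

In the full model, the strength of the evidence signal is learned from image statistics rather than set by hand; the sketch only shows why a weaker signal both slows and widens the latency distribution instead of merely shifting it.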
Acknowledgement
The authors would like to thank Chris Wyman for his valuable suggestions. This research was partially supported by the NVIDIA Applied Research Accelerator Program and the DARPA PTG program. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of DARPA.
Bibtex
@article{Duinkharjav:2022:IFI,
  author     = {Duinkharjav, Budmonde and Chakravarthula, Praneeth and Brown, Rachel and Patney, Anjul and Sun, Qi},
  title      = {Image Features Influence Reaction Time: A Learned Probabilistic Perceptual Model for Saccade Latency},
  year       = {2022},
  issue_date = {July 2022},
  publisher  = {Association for Computing Machinery},
  address    = {New York, NY, USA},
  volume     = {41},
  number     = {4},
  issn       = {0730-0301},
  url        = {https://doi.org/10.1145/3528223.3530055},
  doi        = {10.1145/3528223.3530055},
  journal    = {ACM Trans. Graph.},
  month      = {jul},
  articleno  = {144},
  numpages   = {15},
  keywords   = {virtual reality, human performance, esports, gaze-contingent rendering, visual perception, augmented reality}
}
Media Coverage:
- The Power of Visual Influence [EurekAlert! Science News]
- Predicting How Images Influence Visual Reaction Speed [NVIDIA Technical Blog]
Related Projects:
- Larger Visual Changes Compress Time, Malpica et al., PLoS One 2022 [link]
- Perceptually-Guided Foveation For Light Field Displays, ACM SIGGRAPH Asia 2017 [link]
- Instant Reality: Gaze-Contingent Perceptual Optimization For 3D Virtual Reality Streaming, IEEE TVCG 2022 [link]
- Leveraging Human Visual Perception for an Optimized Virtual Reality Experience, IEEE CG&A 2021 [link]