Performance Analysis of Catch-Up Eye Movements in Visual Tracking

Jenna Kang, Budmonde Duinkharjav, Niall L. Williams, Qi Sun
SIGGRAPH Asia 2025
Abstract

In graphics applications featuring dynamically moving visual targets — such as film and gaming — viewers must rotate their eyes to follow objects as they move across the screen. Because target motion is often unpredictable and ever-changing, the visual system must rapidly respond to motion cues and adjust eye movements to keep the target within the fovea, a process known as catch-up. This catch-up behavior reflects how efficiently the eyes react to and compensate for sudden changes in motion, making it a critical indicator of both task performance and the overall visual experience. In this work, we study and measure catch-up eye movement performance during visual tracking. In particular, we present a behavioral analysis that predicts users' reaction latency to abrupt target motion based on target visibility. Our numerical analysis and human subject studies demonstrate the model's effectiveness and generalizability. We further show how the catch-up metric can be applied to evaluate video quality, adjust game difficulty, and optimize display configurations for enhanced user performance. We envision this research creating a computational link between human perception and behavioral performance in dynamic graphics contexts.

Acknowledgement

We would like to thank Sandra Malpica and Kenneth Chen for their fruitful discussions and valuable advice on the experiment and data analysis. This project is partially supported by National Science Foundation grants #2232817 and #2225861 and the DARPA Intrinsic Cognitive Security program. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the funding agencies.