Publications by Qi Sun (Immersive Computing Lab):

- Evaluating Visual Perception of Object Motion in Dynamic Environments (2024-09-04)
- GazeFusion: Saliency-guided Image Generation (2024-07-04)
- PEA-PODs: Perceptual Evaluation of Algorithms for Power Optimization in XR Displays (2024-06-06)
- Accelerating Saccadic Response through Spatial and Temporal Cross-Modal Misalignments (2024-06-05)
- May the Force Be with You: Dexterous Finger Force-Aware XR Interface (2024-06-01)
- Toward User-Aware Interactive Virtual Agents: Generative Multi-Modal Avatar Behaviors in VR (2024-06-01)
- Measuring and Predicting Multisensory Reaction Latency: A Probabilistic Model for Visual-Auditory Integration (2024-06-01)
- Modeling the Impact of Head-Body Rotations on Audio-Visual Spatial Perception for Virtual Reality Applications (2024-02-05)
- Exploiting Human Color Discrimination for Memory- and Energy-Efficient Image Encoding in Virtual Reality (2024-02-01)
- AutoColor: Learned Light Power Control for Multi-Color Holograms (2024-01-01)