Predicting the Light Spectrum of Virtual Reality Scenarios for Non-Image-Forming Visual Evaluation

Virtual Reality (VR) offers a highly immersive simulated environment. While developers have made great efforts to create virtual content, VR device usage still leads to an unsatisfactory experience and physical side effects. The average session time of VR headsets is reported to be 21 minutes, with visual fatigue being one of the main barriers to longer sessions. Recent studies reveal the existence of a non-image-forming (NIF) visual pathway in the human eye, in addition to the image-forming system responsible for colour perception; the NIF pathway synchronises the biological clock and influences hormonal secretion by sensing external light-dark cycles. NIF visual stimulation is strongly correlated with the onset of visual fatigue. To control stimuli precisely, it is therefore important to evaluate the effect of VR on the NIF visual system of the human eye. Quantifying the NIF stimulation of VR headsets will help to reduce digital eye strain (DES), extend usage time and mitigate sleep disorders caused by nighttime use. VR systems will thereby gain the ability to better simulate reality, benefiting not only entertainment but also medical research and training.
In this work, we propose a generic spectrum prediction algorithm for VR scenarios that requires no setup or additional optical equipment. The algorithm uses pre-acquired optical profiles of various VR headsets, allowing developers to anticipate the spectrum the human eye will receive during the virtual-scene construction phase. We also propose 'Five Photoreceptors Radiation Efficacy' (FPRE) maps, generated by converting each pixel's predicted spectrum into irradiance values for the five photoreceptor classes, to visualise how a virtual scene regionally activates the photoreceptors of the human eye. In summary, our approach significantly shortens the cycle for evaluating the impact of VR on NIF vision, eliminates the cost and complexity of optical measurement, and establishes a reference against existing illumination standards. To the best of our knowledge, this is the first algorithmic implementation of spectrum prediction for VR head-mounted systems.
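The pipeline above can be sketched in a few lines: a pixel's predicted spectrum is modelled as a linear combination of the headset's primary spectra, and the FPRE entry for that pixel is obtained by integrating the spectrum against each photoreceptor's action spectrum. The sketch below is a minimal illustration under stated assumptions, not the paper's implementation: the Gaussian primary spectra, the Gaussian action spectra, the simple gamma model, and all function names (`predict_spectrum`, `fpre_values`) are placeholders invented here, to be replaced by the measured optical profile of a real headset and standard receptor sensitivity data.

```python
import numpy as np

# Hypothetical data: every curve below is an illustrative placeholder,
# not a measurement from the paper or from any real headset.
wavelengths = np.arange(380, 781, 5.0)  # nm, 5 nm sampling

def gaussian(peak, width):
    """Toy spectral curve standing in for measured data."""
    return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

# Pre-acquired optical profile of a VR headset: spectral power
# distributions of its R, G, B primaries (placeholder Gaussians).
primaries = np.stack([gaussian(620, 15),   # red
                      gaussian(530, 20),   # green
                      gaussian(460, 12)])  # blue

# Sensitivity curves of the five photoreceptor classes (placeholders):
# S-, M-, L-cones, rods, and melanopsin-containing ipRGCs.
receptors = {
    "S-cone":     gaussian(440, 25),
    "M-cone":     gaussian(535, 35),
    "L-cone":     gaussian(565, 35),
    "rod":        gaussian(500, 30),
    "melanopsin": gaussian(490, 30),
}

def predict_spectrum(rgb):
    """Predict the spectrum reaching the eye for one pixel as a
    linear combination of the primaries' spectra, weighted by the
    linearized RGB drive values (simple gamma-2.2 display model)."""
    rgb_lin = np.asarray(rgb, dtype=float) ** 2.2
    return rgb_lin @ primaries

def fpre_values(rgb):
    """Convert one pixel's predicted spectrum into the five
    photoreceptor irradiance values of an FPRE map entry by
    integrating the spectrum against each sensitivity curve."""
    spectrum = predict_spectrum(rgb)
    dl = wavelengths[1] - wavelengths[0]
    return {name: float(np.sum(spectrum * s) * dl)
            for name, s in receptors.items()}

print(fpre_values([0.2, 0.5, 0.9]))  # one pixel's five-channel FPRE entry
```

Applied to every pixel of a rendered frame, `fpre_values` yields five per-pixel irradiance channels, which is the per-receptor regional activation the FPRE maps visualise.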
The video above shows a short demo of real-time light-stimulus identification using the FPRE map plugin in Unreal Engine 5.