Presented at IEEE VR 2023
https://doi.org/10.1109/VRW58643.2023.00238
Figure 1: An overview of our approach. The algorithm first selects a viewpoint of the virtual scenario as the image input and imports the pre-acquired optical profile of the VR headset. During the data-processing stage, the image is split into its colour channels to obtain each pixel's monochromatic values. The FOV and the lens edge-light-loss function described in the profile are extracted and applied to the three image channels. Lastly, the spectra and luminance growth curves of the single-pixel red, green and blue light-emitting units in the profile are extracted to iteratively compute the spectrum of each pixel, which is then summed to produce the total spectrum. The M-EDI value and FPRE maps are subsequently calculated from the generated spectrum.
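The pipeline in Figure 1 can be summarised in a few lines of code. The sketch below is a minimal illustration, not the paper's implementation: the profile fields (`wavelengths`, `led_spectra`, `luminance_curve`, `edge_loss`) are hypothetical names standing in for the measured data the optical profile would supply.

```python
import numpy as np

# Assumed (illustrative) profile fields:
#   wavelengths        : (N,) common wavelength grid in nm
#   led_spectra[c]     : (N,) normalised emission spectrum of the R/G/B sub-pixel
#   luminance_curve[c] : (256,) maps an 8-bit channel value to a radiance scale
#   edge_loss          : (H, W) per-pixel transmission map for lens edge light loss
def predict_total_spectrum(image, profile):
    """Sum the predicted spectrum over all pixels of a rendered viewpoint.

    image: (H, W, 3) uint8 render of the selected viewpoint.
    profile: pre-acquired optical profile of the headset (fields assumed above).
    """
    total = np.zeros_like(profile["wavelengths"], dtype=np.float64)
    for c in range(3):                                   # split channels: R, G, B
        channel = image[..., c]
        # Radiance scale of each pixel from the measured luminance growth curve.
        scale = profile["luminance_curve"][c][channel]   # (H, W)
        # Attenuate by the lens edge-light-loss map over the headset FOV.
        scale = scale * profile["edge_loss"]             # (H, W)
        # Every pixel in a channel emits the same sub-pixel spectrum, scaled
        # by its radiance, so the per-pixel sum collapses to one scalar.
        total += scale.sum() * profile["led_spectra"][c]
    return total
```

Because all pixels of one channel share the sub-pixel emission spectrum, the per-pixel iteration described in the caption reduces to a single scale factor per channel, which keeps the prediction cheap enough to run during scene construction.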
Virtual Reality (VR) offers a highly immersive simulated environment. While developers have made great efforts to create virtual content, VR device usage still leads to an unsatisfactory experience and physical side effects. The average session time of VR headsets is reported to be 21 minutes, with visual fatigue being one of the main barriers to longer sessions. Recent studies reveal the existence of a non-image-forming (NIF) visual pathway in the human eye, in addition to the image-forming system responsible for colour perception, which synchronises the biological clock and influences hormonal secretion by sensing external light-dark cycles. NIF visual stimulation is strongly correlated with the onset of visual fatigue. To control stimuli precisely, it is important to evaluate the effect of VR on the NIF visual system of the human eye. Quantifying the NIF stimulation of VR headsets will help to reduce digital eye strain (DES), increase usage time and mitigate sleep disorders caused by nighttime use. VR systems will therefore be better able to simulate reality, benefiting not only entertainment but also medical research and training.
In this work, we propose a generic spectrum prediction algorithm for VR scenarios that requires no setup or additional optical equipment. The algorithm imports pre-acquired optical profiles of various VR headsets, allowing developers to anticipate the spectrum the human eye will receive while the virtual scene is still being constructed. We also propose the 'Five Photoreceptors Radiation Efficacy' (FPRE) maps, generated by converting each pixel's predicted spectrum into irradiance values for the five photoreceptors, to visualise how the virtual scene regionally activates the photoreceptors in the human eye. In summary, our approach significantly shortens the cycle for evaluating the impact of VR on NIF vision, eliminates hardware cost and measurement complexity, and establishes a point of reference against existing illumination standards. To the best of our knowledge, this is the first algorithmic implementation of spectrum prediction for VR head-mounted systems.
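To make the spectrum-to-photoreceptor conversion behind the FPRE maps concrete, here is a minimal sketch following CIE S 026, assuming the five α-opic action spectra are available on the same wavelength grid as the predicted spectrum. The function names, data layout, and ordering are illustrative assumptions, not the paper's API.

```python
import numpy as np

ALPHA_NAMES = ["S-cone", "M-cone", "L-cone", "rod", "melanopsin"]
# CIE S 026 melanopic efficacy of luminous radiation for D65,
# used to convert melanopic irradiance to melanopic EDI (lux).
K_MEL_D65 = 1.3262e-3

def alpha_opic_irradiances(wavelengths_nm, spectrum, action_spectra):
    """Integrate a spectral irradiance against the five α-opic action spectra.

    wavelengths_nm: (N,) wavelength grid in nm.
    spectrum: (N,) spectral irradiance in W·m^-2·nm^-1 for one pixel.
    action_spectra: (5, N) CIE S 026 sensitivities, ordered as ALPHA_NAMES.
    Returns (5,) α-opic irradiances in W·m^-2.
    """
    return np.trapz(action_spectra * spectrum, wavelengths_nm, axis=1)

def melanopic_edi(wavelengths_nm, spectrum, action_spectra):
    """Melanopic equivalent daylight illuminance (M-EDI, lux)."""
    e_mel = alpha_opic_irradiances(wavelengths_nm, spectrum, action_spectra)[4]
    return e_mel / K_MEL_D65
```

Applying `alpha_opic_irradiances` to every pixel's predicted spectrum yields a five-channel FPRE map, and dividing the melanopic channel by `K_MEL_D65` gives the M-EDI value, which is what ties the prediction back to existing illumination standards.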
The video above shows a short demo of real-time light-stimulus identification using the FPRE map plugin in Unreal Engine 5.