Acta Optica Sinica, Volume 44, Issue 5, 0533001 (2024)

Dynamic Distortion Assessment in Automobile Head-Up Displays with Subjective Methods

Tao Wang and Haifeng Li*
Author Affiliations
  • State Key Laboratory of Extreme Photonics and Instrumentation, College of Optical Science and Engineering, Zhejiang University, Hangzhou 310027, Zhejiang, China

    Objective

    In an automobile augmented reality head-up display (AR-HUD) optical system, the imaging performance of the optics and the non-standard shape of the windshield, which acts as the final imaging surface, cause the image observed by the driver to be distorted. Moreover, as the viewpoint moves within the eyebox, the distortion differs from one eyebox position to another, which disturbs the driver's perception while driving. Many studies have addressed distortion correction in AR-HUDs, for example by applying algorithmic correction or by adding optimization functions during optical design. However, these methods correct distortion at a single viewpoint only and do not involve the binocular fusion process of the human eyes. Since the image perceived by the driver is essentially the fusion of differently distorted images seen by the left and right eyes at different eyebox positions, monocular correction alone cannot represent the driver's experience during binocular fusion. It is therefore necessary to conduct subjective experiments that evaluate the driver's actual perception when fusing images from different viewpoints and to provide dynamic-distortion constraints for the optical design process.

    Methods

    We adopt a homogeneity-of-variance test, one-way ANOVA, and statistical chart analysis. First, we give a basic theoretical description of dynamic distortion and of how it can be simulated, build an experimental simulation model of dynamic distortion, and synthesize a series of display images for the subsequent subjective experiments. We then carry out a subjective experiment for dynamic distortion evaluation using a subjective rating scale, collecting data from multiple subjects under different group conditions. Finally, statistical methods, namely one-way ANOVA and chart analysis, are applied to the collected data: significance results and line-bar charts are used to analyze the experimental data quantitatively and to visualize the relationship between dynamic distortion and the drivers' subjective perception.
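    The abstract does not include the simulation code itself; the following is a minimal Python sketch of how a stereo pair with dynamic distortion could be synthesized, assuming the distortion between two eyebox viewpoints is approximated as a relative shift or rotation expressed as a percentage of the image size. The function make_distorted_pair and its parameters are illustrative assumptions, not the authors' actual simulation model.

```python
# Minimal sketch (not the authors' implementation): the right-eye view is shifted
# or rotated relative to the left-eye view by a given percentage of the image size,
# emulating the distortion difference between two eyebox positions.
import numpy as np
import cv2


def make_distorted_pair(pattern, vertical_pct=0.0, horizontal_pct=0.0, rotation_deg=0.0):
    """Return (left, right) views; the right view carries the extra distortion."""
    h, w = pattern.shape[:2]
    dx = horizontal_pct / 100.0 * w          # horizontal offset in pixels
    dy = vertical_pct / 100.0 * h            # vertical offset in pixels
    # Rotation about the image center combined with the translation offsets.
    m = cv2.getRotationMatrix2D((w / 2, h / 2), rotation_deg, 1.0)
    m[0, 2] += dx
    m[1, 2] += dy
    right = cv2.warpAffine(pattern, m, (w, h), flags=cv2.INTER_LINEAR)
    return pattern.copy(), right


if __name__ == "__main__":
    # Simple grid test pattern standing in for the HUD display image.
    grid = np.full((540, 960), 255, dtype=np.uint8)
    grid[::60, :] = 0
    grid[:, ::60] = 0
    # Example: 2% vertical disparity between the two eyebox viewpoints.
    left, right = make_distorted_pair(grid, vertical_pct=2.0)
    cv2.imwrite("left_view.png", left)
    cv2.imwrite("right_view.png", right)
```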

    Results and Discussions

    Three sets of results from 12 subjects, for vertical, horizontal, and rotational distortion, are first examined with the homogeneity-of-variance test. As shown in Table 1, the Levene statistics are 2.301, 0.988, and 1.401, respectively, with corresponding significance values of 0.051, 0.435, and 0.241, all greater than 0.05, indicating that the three data sets have homogeneous variances and that the F test can be used for one-way ANOVA. The three groups of data are then subjected to one-way ANOVA with the VIMSL scores from the subjective experiments. In the ANOVA results, the significance values for vertical and rotational distortion are both reported as 0, i.e., less than 0.01, showing that the dynamic distortion level has a significant effect on the VIMSL increments. In contrast, the significance value for horizontal distortion is greater than 0.05, meaning that changes in the horizontal direction have only a small effect on drivers' perception and no obvious significant effect on the VIMSL increments. In the statistical chart analysis (Figs. 10 and 11), the VIMSL-related evaluation indicators rise as the distortion level increases. The difference between Group 2 and Group 3 is the most pronounced, showing that the drivers' discomfort increases most sharply when switching between these two groups; thus a vertical distortion of 2% and a horizontal distortion of 1% can be regarded as the values at which obvious discomfort begins to occur. The SSQ scores, however, do not change significantly before and after viewing, indicating that the influence of the experimental equipment on the subjects' discomfort is negligible and that the display condition of the equipment itself is reliable.
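    As a rough illustration of the analysis pipeline reported above (Levene's test for homogeneity of variances followed by one-way ANOVA on the VIMSL increments), the following Python sketch uses scipy.stats; the group arrays are placeholders, not the experimental data from the paper.

```python
# Sketch of the reported statistical pipeline: Levene's test, then one-way ANOVA.
from scipy import stats

# One list of VIMSL increments per distortion-level group (placeholder values).
group1 = [0.5, 0.8, 0.6, 0.7]
group2 = [1.2, 1.5, 1.1, 1.4]
group3 = [2.4, 2.8, 2.6, 2.9]

levene_stat, levene_p = stats.levene(group1, group2, group3)
if levene_p > 0.05:
    # Variances can be treated as homogeneous; the F test is applicable.
    f_stat, anova_p = stats.f_oneway(group1, group2, group3)
    print(f"Levene p = {levene_p:.3f}, F = {f_stat:.3f}, ANOVA p = {anova_p:.3f}")
    if anova_p < 0.01:
        print("Distortion level has a significant effect on VIMSL increments.")
else:
    print("Variances are not homogeneous; a standard one-way ANOVA is not appropriate.")
```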

    Conclusions

    We establish a subjective experimental procedure based on binocular 3D display observation, and the resulting data are used to analyze the subjective feelings caused by dynamic distortion in automobile AR-HUD devices. The distortion level that drivers can accept during binocular fusion, when viewing images from different eyebox positions, is also evaluated. The experimental results show that both the form and the level of dynamic distortion have a strong effect on the driver's subjective perception: as the difference in distortion between the two eyes grows, it becomes increasingly difficult for the driver to fuse the images, and the discomfort level rises rapidly. Furthermore, the dynamic distortion acceptable to the driver between two positions of the same eyebox is found to be a vertical distortion of less than 2% and a horizontal distortion of less than 1%, and the combination of different distortions also strongly affects the driver's subjective perception. These results provide a clear design constraint for dynamic distortion correction in HUD optical design.
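    As a simple illustration of how the reported thresholds could serve as a design constraint, the following sketch (our construction, not from the paper) checks whether the distortion difference between two eyebox positions stays within the acceptable limits.

```python
# Acceptability limits taken from the reported results; the helper function is hypothetical.
VERTICAL_LIMIT_PCT = 2.0
HORIZONTAL_LIMIT_PCT = 1.0


def dynamic_distortion_ok(vertical_diff_pct: float, horizontal_diff_pct: float) -> bool:
    """Return True if the distortion difference between two viewpoints is acceptable."""
    return (abs(vertical_diff_pct) < VERTICAL_LIMIT_PCT
            and abs(horizontal_diff_pct) < HORIZONTAL_LIMIT_PCT)


print(dynamic_distortion_ok(1.5, 0.8))  # True: within both limits
print(dynamic_distortion_ok(2.3, 0.5))  # False: vertical difference too large
```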

    Paper Information

    Category: Vision, Color, and Visual Optics

    Received: Nov. 24, 2023

    Accepted: Dec. 29, 2023

    Published Online: Mar. 15, 2024

    Author Email: Haifeng Li (lihaifeng@zju.edu.cn)

    DOI: 10.3788/AOS231831
