Acta Optica Sinica, Volume. 43, Issue 19, 1911004(2023)

Disparity Estimation Method Based on Focal Stack Disparity Dimension Super-Resolution

Yukai Wang, Chang Liu*, and Jun Qiu
Author Affiliations
  • Institute of Applied Mathematics, Beijing Information Science and Technology University, Beijing 100101, China

    Objective

    The focal stack is a projection-domain representation model of the four-dimensional (4D) light field and can be used in disparity estimation, light field reconstruction, extended depth-of-field imaging, and other fields. The accuracy and robustness of computational imaging based on focal stack data depend on the disparity dimension resolution of the focal stack. There are two ways to obtain focal stack images: directly capturing images on multiple disparity planes with imaging equipment, or generating images on multiple disparity layers by digital refocusing. Capturing focal stack images with imaging equipment requires the focal length and other parameters to be set beforehand, and a focal stack with high quality and high disparity resolution can only be obtained by strictly controlling the imaging plane during capture; the digital refocusing method, on the other hand, requires 4D light field data, resulting in computational redundancy. To address the insufficient disparity dimension resolution of focal stack data, a method of focal stack super-resolution in the disparity dimension was proposed. Based on the optimization of the disparity dimension spectrum of the focal stack data, we proposed a focal stack disparity dimension filter and a disparity dimension super-resolution method for focal stack data, so as to estimate disparity with high accuracy and robustness.
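The digital refocusing route mentioned above can be illustrated with a minimal shift-and-add sketch. This is not the paper's formulation: the array layout `light_field[u, v, y, x]`, the integer-pixel shifts via `np.roll`, and the slope parameter `alpha` are simplifying assumptions made here for illustration.

```python
import numpy as np

def refocus(light_field, alpha):
    """Shift-and-add digital refocusing sketch.

    light_field[u, v, y, x]: 4D light field as a stack of sub-aperture
    views indexed by angular coordinates (u, v). Each view is shifted
    in proportion to its angular offset from the center (slope alpha)
    and the shifted views are averaged, synthesizing one refocused
    image per disparity layer.
    """
    n_u, n_v, n_y, n_x = light_field.shape
    cu, cv = (n_u - 1) / 2.0, (n_v - 1) / 2.0
    out = np.zeros((n_y, n_x))
    for u in range(n_u):
        for v in range(n_v):
            dy = int(round(alpha * (u - cu)))  # integer-pixel shift (assumption)
            dx = int(round(alpha * (v - cv)))
            out += np.roll(light_field[u, v], (dy, dx), axis=(0, 1))
    return out / (n_u * n_v)
```

Sweeping `alpha` over a set of disparity values produces one refocused image per value, i.e., a focal stack; the redundancy noted above comes from needing the full 4D data for every such slice.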

    Methods

    The focal stack spectrum contains the disparity dimension spectrum, whereby focal stack data can be processed in the disparity dimension. In this paper, based on the disparity dimension spectrum optimization of focal stack data, a focal stack disparity dimension filter was introduced, and a focal stack disparity dimension super-resolution method based on disparity dimension filtering was then proposed to achieve high-precision and dense disparity estimation. Through the spectral analysis of the focal stack, the Butterworth filter was selected as the disparity dimension filter to achieve high-fidelity disparity dimension super-resolution of the focal stack data. Dense and high-precision disparity estimation was achieved based on the robust focus volume regularization (RFV) algorithm by using the dense focal stack after disparity dimension super-resolution.
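As a rough illustration of the idea (not the paper's formulation), disparity dimension super-resolution by zero-padding the disparity-dimension spectrum, with a Butterworth low-pass response applied along the disparity axis to suppress interpolation ringing, could be sketched as follows. The axis layout `stack[d, y, x]`, the `cutoff` value, and the use of the Butterworth `order` to loosely mirror the paper's filter parameter K are all assumptions made here.

```python
import numpy as np

def disparity_super_resolve(stack, factor, cutoff=0.25, order=6):
    """Sketch: upsample a focal stack along its disparity axis (axis 0).

    The disparity-dimension spectrum is filtered with a Butterworth
    low-pass response, then zero-padded, so the inverse transform has
    `factor` times as many disparity slices.
    """
    n_d = stack.shape[0]
    n_up = n_d * factor
    # Disparity-dimension spectrum, zero frequency centered.
    spec = np.fft.fftshift(np.fft.fft(stack, axis=0), axes=0)
    # Butterworth low-pass response over the original disparity band.
    f = np.fft.fftshift(np.fft.fftfreq(n_d))
    h = 1.0 / (1.0 + (np.abs(f) / cutoff) ** (2 * order))
    spec *= h.reshape((-1,) + (1,) * (stack.ndim - 1))
    # Zero-pad the spectrum: denser sampling in the disparity dimension.
    pad = n_up - n_d
    spec = np.pad(spec, [(pad // 2, pad - pad // 2)]
                  + [(0, 0)] * (stack.ndim - 1))
    up = np.fft.ifft(np.fft.ifftshift(spec, axes=0), axis=0).real
    return up * factor  # rescale for the longer transform length
```

The Butterworth response is maximally flat in the passband (it equals 1 at zero frequency), which is why it is a natural choice when high-fidelity preservation of the existing disparity-dimension content matters.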

    Results and Discussions

    In the simulated data experiment, a focal stack containing 16 images was first generated by the light field projection method, and a focal stack containing 151 images was then obtained from it by the proposed super-resolution method (Fig. 5). The RFV algorithm was applied for disparity estimation (Fig. 6). In this experiment, the Butterworth filter parameter was set to K=6. The disparity estimation results were compared with those of other data, including the focal stack before disparity-dimension super-resolution (Table 1) and the focal stacks obtained with the light field projection method and the Fourier disparity layer (FDL) generation method (Fig. 9 and Table 2). The peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) values of the proposed method were larger than those before disparity-dimension super-resolution and close to those of the focal stacks obtained with the light field projection method and the FDL generation method. Finally, Butterworth filters with different K values were used to obtain focal stacks for disparity estimation (Fig. 11), and the PSNR and SSIM values of the disparity estimation results were compared (Table 3); the values at K=0.6 and K=60 were both smaller than those at K=6. In the measured data experiment, six images were selected from a focal stack containing 31 images to form a focal stack with a sparse disparity dimension, and Butterworth filters with different K values were then used for disparity dimension super-resolution (Figs. 12 and 13). Comparison of the obtained focal stacks with the original data (Fig. 14) shows that the PSNR and SSIM values of some focal stack images in the super-resolution results at K=0.6 and K=60 were significantly smaller than those at K=6. Disparity estimation was then implemented (Fig. 15), and profiles of the disparity maps were selected for comparison (Fig. 16). The disparity profiles obtained by the proposed method were smoother than those before super-resolution and closer to the disparity obtained from the original data.
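The PSNR figure of merit used throughout these comparisons is standard and can be sketched in a few lines; the `data_range` parameter (peak value of the images, e.g. 1.0 for normalized data or 255 for 8-bit data) is an assumption of this sketch.

```python
import numpy as np

def psnr(ref, test, data_range=1.0):
    """Peak signal-to-noise ratio, in dB, between a reference image
    (e.g. a ground-truth disparity map) and a test image."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0.0:
        return np.inf  # identical images
    return 10.0 * np.log10(data_range ** 2 / mse)
```

Larger PSNR means the estimate is closer to the reference, which is the sense in which the proposed method's results above are "larger than" the pre-super-resolution results.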

    Conclusions

    The results of simulated and real data experiments show that the focal stack disparity dimension super-resolution method proposed in this paper can effectively improve the disparity resolution of focal stacks and provide data for applications such as disparity estimation. The simulated data experiments show that disparity estimation with the focal stack obtained by the proposed method is more accurate and robust than that before super-resolution, and that the method yields high-fidelity focal stack data with high disparity resolution. Dense disparity estimation is achieved with the RFV algorithm on the dense focal stack after disparity dimension super-resolution. The results on both simulated and real data show that disparity dimension filtering enables efficient disparity dimension super-resolution as well as high-precision and dense disparity estimation.


    Yukai Wang, Chang Liu, Jun Qiu. Disparity Estimation Method Based on Focal Stack Disparity Dimension Super-Resolution[J]. Acta Optica Sinica, 2023, 43(19): 1911004

    Paper Information

    Category: Imaging Systems

    Received: Mar. 29, 2023

    Accepted: Apr. 24, 2023

    Published Online: Sep. 28, 2023

    The Author Email: Liu Chang (liu.chang.cn@ieee.org)

    DOI:10.3788/AOS230727
