Optics and Precision Engineering, Vol. 27, Issue 2, 450 (2019)

Auto-focusing method of push-broom hyperspectral camera

WEI Gui-hua1,*, XIAO Liang2, and ZHENG Zhi-zhong3,4
Author Affiliations
  • 1[in Chinese]
  • 2[in Chinese]
  • 3[in Chinese]
  • 4[in Chinese]

    Auto-focusing has been widely incorporated into various optical instruments. To address the auto-focusing problem of push-broom hyperspectral cameras, a method was proposed that estimates the optimal focus of the camera using a spectral quality evaluation function based on the quaternion wavelet transform. The spectral data of a single push-broom line were rearranged into a two-dimensional matrix, which was then decomposed into four low-frequency and high-frequency sub-bands. The spectral quality evaluation function was constructed from the amplitude and phase information of the low-frequency and high-frequency sub-bands to realize auto-focusing. During focusing, the spectral quality metric was calculated at a series of focus positions as the auto-focus mechanism rotated the lens in fixed steps. A Gaussian function was then fitted to the relationship between the spectral quality metric and the lens extension, and the optimal focus position was estimated from the fit. The results indicate that the proposed spectral quality evaluation function has high sensitivity and accuracy. Furthermore, because only a single line of spectral data is required for focusing, the proposed auto-focus method achieves high focusing accuracy.
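    To make the pipeline described above concrete, the following Python sketch illustrates one possible realization under stated assumptions: the quaternion wavelet sub-bands are approximated by pairing a standard 2-D discrete wavelet transform (PyWavelets) with partial Hilbert transforms of the reshaped line data, and the weighting of amplitude and phase terms in the evaluation function is an assumption, not taken from the paper. The function names (focus_metric, estimate_best_focus) and all parameter values are hypothetical.

```python
# Hypothetical sketch of the focusing pipeline in the abstract.
# The paper's exact QWT implementation and evaluation-function weights are
# not specified here; the Hilbert-transform pairing below is only one common
# approximation of a quaternion wavelet transform.
import numpy as np
import pywt                                # PyWavelets, for the 2-D DWT
from scipy.signal import hilbert           # partial Hilbert transforms
from scipy.ndimage import gaussian_filter1d
from scipy.optimize import curve_fit


def qwt_subbands(img, wavelet="db4"):
    """Approximate quaternion wavelet sub-bands of a 2-D array.

    The four quaternion components are taken as the image and its partial
    Hilbert transforms along rows, columns, and both axes; each component is
    then decomposed with a single-level 2-D DWT.
    """
    hx = np.imag(hilbert(img, axis=1))          # Hilbert along rows
    hy = np.imag(hilbert(img, axis=0))          # Hilbert along columns
    hxy = np.imag(hilbert(hx, axis=0))          # Hilbert along both axes
    # pywt.dwt2 returns (LL, (LH, HL, HH)) for each component.
    return [pywt.dwt2(c, wavelet) for c in (img, hx, hy, hxy)]


def focus_metric(line_spectra, width):
    """Spectral-quality metric for one push-broom line (assumed weighting).

    line_spectra : 1-D array of spectral samples of a single scan line,
                   rearranged into a (rows, width) matrix before decomposition.
    """
    mat = np.asarray(line_spectra, dtype=float)
    mat = mat[: (mat.size // width) * width].reshape(-1, width)

    bands = qwt_subbands(mat)
    ll = np.stack([b[0] for b in bands])                      # 4 low-freq components
    highs = [np.stack([b[1][k] for b in bands]) for k in range(3)]

    # Quaternion amplitude = sqrt of the sum of squared components.
    amp_ll = np.sqrt(np.sum(ll ** 2, axis=0)).sum()
    amp_hi = sum(np.sqrt(np.sum(h ** 2, axis=0)).sum() for h in highs)
    # Phase information, crudely summarized by the angle of two components.
    phase_hi = sum(np.abs(np.arctan2(h[1], h[0] + 1e-12)).sum() for h in highs)

    # Assumed combination: high-frequency amplitude (sharpness) plus a small
    # phase term, normalized by low-frequency amplitude (illumination).
    return (amp_hi + 0.1 * phase_hi) / (amp_ll + 1e-12)


def estimate_best_focus(positions, metrics):
    """Fit a Gaussian to (lens position, metric) samples and return its peak."""
    positions = np.asarray(positions, dtype=float)
    metrics = np.asarray(metrics, dtype=float)

    def gauss(x, a, mu, sigma, c):
        return a * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) + c

    p0 = [metrics.max() - metrics.min(), positions[np.argmax(metrics)],
          (positions.max() - positions.min()) / 4.0, metrics.min()]
    popt, _ = curve_fit(gauss, positions, metrics, p0=p0, maxfev=10000)
    return popt[1]                                             # mu = optimal focus


if __name__ == "__main__":
    # Toy usage: one synthetic scan line, blurred more as the lens position
    # moves away from an assumed true focus at position 5.
    rng = np.random.default_rng(0)
    base = rng.normal(0.0, 1.0, 4096)                          # fixed "scene" line
    positions = np.arange(0.0, 10.0, 1.0)                      # lens step positions
    metrics = [focus_metric(gaussian_filter1d(base, 0.3 + abs(p - 5.0)), 64)
               for p in positions]
    print("estimated optimal focus:", estimate_best_focus(positions, metrics))
```

    The Gaussian-fit step mirrors the abstract: the metric is sampled at a sequence of lens positions and the fitted mean is taken as the optimal focus, which allows the focus to be estimated between the discrete step positions.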


    WEI Gui-hua, XIAO Liang, ZHENG Zhi-zhong. Auto-focusing method of push-broom hyperspectral camera[J]. Optics and Precision Engineering, 2019, 27(2): 450

    Paper Information

    Received: Jul. 16, 2018

    Accepted: --

    Published Online: Apr. 2, 2019

    The Author Email: Gui-hua WEI (984323595@qq.com)

    DOI: 10.3788/ope.20192702.0450
