Optics and Precision Engineering, Volume 21, Issue 3, 767 (2013)

Application of depth from defocusing based on logarithmic power spectrum to multi-spectral imager

ZHANG Yan-chao1,2,*, SUN Qiang1, and ZHAO Jian1
Author Affiliations
  • 1[in Chinese]
  • 2[in Chinese]

    According to the characteristics of a multi-spectral imager, an auto-focusing method was proposed based on depth from defocus using the logarithmic power spectrum. With a CCD as the image sensor and a host computer performing focus control and data processing, auto-focusing of the multi-spectral imager was implemented quickly. Firstly, the sensor was placed at three equally spaced positions in turn, and an image was acquired at each position. Then, according to the three-point position judgment method, the positional relationship between the second image and the accurate focus position was determined. Taking the second image as the reference, a correlation calculation on the logarithmic power spectra of the images was carried out to obtain the accurate focus position. Finally, the sensor was moved to the calculated position, completing the auto-focusing process. Experimental results indicate that the standard deviation of the focus position is 0.159 9 mm and the maximum deviation is 0.4 mm or less, so the method meets real-time auto-focusing requirements well. Requiring only three images, the method achieves both fast focusing and high accuracy.
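
    To make the procedure concrete, the following Python sketch illustrates the two ingredients the abstract describes: a focus measure built from the logarithmic power spectrum of an image, and an estimate of the in-focus position from three images taken at equally spaced sensor positions. The paper's own estimator correlates the log power spectra against the second (reference) image; as a simplified stand-in, this sketch locates the peak of a parabola fitted through the three focus-measure values. All names and the parabolic step are illustrative assumptions, not the authors' code.

    import numpy as np

    def log_power_spectrum_measure(image):
        """Summed logarithmic power spectrum of a grayscale image.
        Sharper (better focused) images carry more high-frequency energy,
        so this measure grows as the sensor approaches the focus position."""
        spectrum = np.fft.fft2(image.astype(np.float64))
        power = np.abs(spectrum) ** 2
        # log(1 + P) avoids log(0) and compresses the dynamic range.
        return float(np.sum(np.log1p(power)))

    def estimate_focus_position(z1, z2, z3, images):
        """Estimate the in-focus sensor position from three images taken at
        equally spaced positions z1 < z2 < z3 (parabolic interpolation used
        here as a stand-in for the paper's correlation-based calculation)."""
        f1, f2, f3 = (log_power_spectrum_measure(im) for im in images)
        d = z2 - z1                      # equal spacing assumed
        denom = f1 - 2.0 * f2 + f3
        if denom == 0.0:
            return z2                    # flat measure: keep middle position
        # Vertex of the parabola through (z1, f1), (z2, f2), (z3, f3).
        return z2 + 0.5 * d * (f1 - f3) / denom

    The three-point judgment in the abstract presumably corresponds to comparing the three measures: if the middle image gives the largest value, the accurate focus position lies between the outer positions and the interpolation above refines it; otherwise the sensor would first be stepped toward the side with the larger measure.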

    Citation: ZHANG Yan-chao, SUN Qiang, ZHAO Jian. Application of depth from defocusing based on logarithmic power spectrum to multi-spectral imager[J]. Optics and Precision Engineering, 2013, 21(3): 767
    Paper Information

    Received: Oct. 19, 2012

    Accepted: --

    Published Online: Apr. 8, 2013

    Author Email: Yan-chao ZHANG (zhangyanchaomn@126.com)

    DOI: 10.3788/ope.20132103.0767
