Laser & Optoelectronics Progress, Vol. 55, Issue 2, 021001 (2018)

Method of Vegetation Extraction Based on Deep Belief Network and Optimal Scale

Zujin Liu1, Ling Yang1,*, Zuhan Liu1,2, Linlin Duan1, Xianxian Qiao1, and Jiaojiao Gong1
Author Affiliations
  • 1 College of Environment and Planning, Henan University, Kaifeng, Henan 475004, China
  • 1 Key Laboratory of Poyang Lake Wetland and Watershed Research, Ministry of Education, Jiangxi Normal University, Nanchang, Jiangxi 330022, China
  • 2 Jiangxi Province Key Laboratory of Water Information Cooperative Sensing and Intelligent Processing, Nanchang Institute of Technology, Nanchang, Jiangxi 330099, China

    When existing deep learning methods are applied to vegetation extraction, several problems arise: adjacent objects fall within the same window, useless fragmented patches appear, and the salt-and-pepper phenomenon occurs. We propose a vegetation extraction method that combines the optimal segmentation scale with a deep belief network, and comparison experiments are carried out with spectral-texture features and other information. Experimental results show that the overall accuracy of the proposed method is 91.92% and the Kappa coefficient is 0.8677; compared with existing deep learning methods, the proposed method effectively improves the classification accuracy. The classification results show that the proposed method can effectively reduce the salt-and-pepper phenomenon and clearly express the boundaries of objects.
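    The paper itself gives the method's details; purely as a rough illustration of the deep-belief-network side, the sketch below implements a single restricted Boltzmann machine layer trained with 1-step contrastive divergence (the standard building block of a DBN, which is pretrained layer by layer). All names, hyperparameters, and the toy data are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    class RBM:
        """One restricted Boltzmann machine layer, trained with CD-1.

        A deep belief network stacks several of these: each layer is
        pretrained on the hidden activations of the layer below it.
        """

        def __init__(self, n_visible, n_hidden, seed=0):
            self.rng = np.random.default_rng(seed)
            self.W = self.rng.normal(0.0, 0.01, (n_visible, n_hidden))
            self.b_v = np.zeros(n_visible)   # visible biases
            self.b_h = np.zeros(n_hidden)    # hidden biases

        @staticmethod
        def _sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        def hidden_probs(self, v):
            """P(h=1 | v) — also the features fed to the next DBN layer."""
            return self._sigmoid(v @ self.W + self.b_h)

        def visible_probs(self, h):
            """P(v=1 | h), used for reconstruction."""
            return self._sigmoid(h @ self.W.T + self.b_v)

        def cd1_step(self, v0, lr=0.1):
            """One full-batch update of 1-step contrastive divergence."""
            # Positive phase: sample hidden units from the data.
            ph0 = self.hidden_probs(v0)
            h0 = (self.rng.random(ph0.shape) < ph0).astype(float)
            # Negative phase: one Gibbs step back to the visible layer.
            v1 = self.visible_probs(h0)
            ph1 = self.hidden_probs(v1)
            # Approximate gradient of the log-likelihood.
            n = len(v0)
            self.W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
            self.b_v += lr * (v0 - v1).mean(axis=0)
            self.b_h += lr * (ph0 - ph1).mean(axis=0)
            return float(np.mean((v0 - v1) ** 2))  # reconstruction error
    ```

    In an object-based pipeline of the kind the abstract describes, the input vectors `v0` would be per-segment spectral-texture feature vectors (segments produced at the optimal scale) rather than raw pixels, which is what suppresses the salt-and-pepper effect of per-pixel classification.
    
    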


    Zujin Liu, Ling Yang, Zuhan Liu, Linlin Duan, Xianxian Qiao, Jiaojiao Gong. Method of Vegetation Extraction Based on Deep Belief Network and Optimal Scale[J]. Laser & Optoelectronics Progress, 2018, 55(2): 021001

    Paper Information

    Category: Image processing

    Received: Jun. 26, 2017

    Accepted: --

    Published Online: Sep. 10, 2018

    The Author Email: Yang Ling (yangling0606@163.com)

    DOI: 10.3788/LOP55.021001
