Laser & Optoelectronics Progress, Vol. 57, Issue 18, 182802 (2020)
Hyperspectral Image Classification Combined with Convolutional Neural Network and Sparse Coding
Conventional hyperspectral image classification considers only the spectral information of ground objects and ignores spatial information, and existing spatial-spectral joint classification methods struggle to extract spatial neighborhood information effectively. Likewise, most existing sparse coding methods consider only spectral information and discard spatial information. To address these problems, this paper proposes a method that combines a convolutional neural network with a sparse dictionary. The proposed method leverages the convolutional neural network's ability to extract deep data features: it simultaneously extracts the spatial-spectral features of hyperspectral images to obtain high-dimensional deep features, then applies sparse coding to these deep features through dictionary learning to obtain discriminative features, which a classifier uses to produce the final classification. In the experiments, three open datasets are classified with the proposed method and five existing algorithms; the proposed method outperforms the others in overall classification accuracy, average classification accuracy, and the Kappa coefficient. The experimental results demonstrate that the proposed method simultaneously extracts the spatial-spectral features of hyperspectral data, is robust and discriminative, effectively improves classification accuracy, and performs well on datasets with few samples.
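The pipeline described in the abstract, deep spatial-spectral features followed by sparse coding over a learned dictionary, can be sketched as follows. This is a minimal illustration rather than the authors' implementation: the feature matrix is a random stand-in for CNN outputs, the dictionary is random instead of learned, and the codes are computed with ISTA (iterative soft thresholding), one standard solver for the l1-regularized coding problem that dictionary-based methods typically pose.

```python
import numpy as np

def ista_sparse_code(X, D, lam=0.2, n_iter=200):
    """Sparse-code each row of X over dictionary D (rows = atoms) by
    minimizing 0.5*||A D - X||_F^2 + lam*||A||_1 with ISTA."""
    L = np.linalg.norm(D @ D.T, 2)          # Lipschitz constant of the gradient
    A = np.zeros((X.shape[0], D.shape[0]))  # sparse codes, one row per sample
    for _ in range(n_iter):
        A = A - (A @ D - X) @ D.T / L       # gradient step on the data term
        A = np.sign(A) * np.maximum(np.abs(A) - lam / L, 0.0)  # soft threshold
    return A

# Hypothetical stand-ins for the CNN's deep spatial-spectral features:
rng = np.random.default_rng(0)
n_atoms, feat_dim = 32, 64
D = rng.standard_normal((n_atoms, feat_dim))
D /= np.linalg.norm(D, axis=1, keepdims=True)    # unit-norm dictionary atoms
codes_true = np.zeros((20, n_atoms))
for i in range(20):                              # 3 active atoms per sample
    codes_true[i, rng.choice(n_atoms, 3, replace=False)] = rng.standard_normal(3)
X = codes_true @ D                               # synthetic "deep features"

A = ista_sparse_code(X, D)                       # discriminative sparse codes
```

In the paper's setting the dictionary would be learned from the training features (e.g., via K-SVD or online dictionary learning) rather than drawn at random, and the resulting sparse codes would then be passed to the classifier.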
Jinguang Sun, Yanbei Li, Xian Wei, Wanli Wang. Hyperspectral Image Classification Combined with Convolutional Neural Network and Sparse Coding[J]. Laser & Optoelectronics Progress, 2020, 57(18): 182802
Category: Remote Sensing and Sensors
Received: Dec. 30, 2019
Accepted: Feb. 10, 2020
Published Online: Sep. 2, 2020
The Author Email: Li Yanbei (13147887613@163.com)