Optics and Precision Engineering, Volume 32, Issue 4, 622 (2024)
Multi-spectral image compression by fusing multi-scale feature convolutional neural networks
Unlike ordinary image compression, multispectral image compression must remove inter-spectral redundancy as well as spatial redundancy. Recent studies show that end-to-end convolutional neural network models perform very well in image compression; for multispectral images, however, their codecs cannot efficiently extract spatial and inter-spectral features at the same time, and they neglect the localized feature information of the image. To address these problems, this paper proposes a multispectral image compression method that incorporates a convolutional neural network with multiscale features. The proposed network embeds a module that can extract spatial and inter-spectral feature information at different scales, together with an inter-spectral spatial asymmetric convolution module that captures local spatial and spectral information. Experiments show that the Peak Signal-to-Noise Ratio (PSNR) of the proposed model is 1–2 dB higher than that of traditional algorithms such as JPEG2000 and 3D-SPIHT, as well as that of deep learning methods, on the 7-band Landsat-8 and 8-band Sentinel-2 datasets. In terms of the Mean Spectral Angle (MSA), the proposed model is more effective on the Landsat-8 dataset, outperforming the traditional algorithms by about 8×10⁻³ rad; on the Sentinel-2 dataset it outperforms them by about 2×10⁻³ rad. The method thus satisfies the requirements of multispectral image compression for spatial and inter-spectral feature extraction as well as localized feature extraction.
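The abstract does not give implementation details, but the idea of an inter-spectral spatial asymmetric convolution module can be illustrated. The following is a purely illustrative NumPy sketch, not the authors' code: per-band asymmetric k×1 and 1×k spatial convolutions stand in for the local spatial branch, and a 1×1 (band-mixing) convolution stands in for the inter-spectral branch; all kernel shapes, the fusion by addition, and the function names are assumptions.

```python
import numpy as np

def conv2d_same(x, kernel):
    """Naive 'same'-padded 2-D convolution of a single band (H, W)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    H, W = x.shape
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * kernel)
    return out

def asymmetric_spectral_spatial_block(cube, k=3, seed=0):
    """Hypothetical sketch of the module described in the abstract:
    asymmetric (k x 1 plus 1 x k) convolutions capture local spatial
    structure in each band, and a 1 x 1 convolution across bands
    (a band-mixing matrix) captures inter-spectral information."""
    B, H, W = cube.shape
    rng = np.random.default_rng(seed)
    k_vert = rng.standard_normal((k, 1))   # k x 1 vertical kernel
    k_horz = rng.standard_normal((1, k))   # 1 x k horizontal kernel
    # Spatial branch: sum of the two asymmetric responses, per band.
    spatial = np.stack([conv2d_same(cube[b], k_vert) +
                        conv2d_same(cube[b], k_horz)
                        for b in range(B)])
    # Spectral branch: 1x1 convolution = linear mixing across bands.
    mix = rng.standard_normal((B, B)) / np.sqrt(B)
    return np.tensordot(mix, spatial, axes=(1, 0))  # shape (B, H, W)

# A 7-band patch, matching the Landsat-8 setting in the abstract.
cube = np.random.default_rng(1).standard_normal((7, 16, 16))
out = asymmetric_spectral_spatial_block(cube)
print(out.shape)  # (7, 16, 16)
```

The asymmetric pair of 1-D kernels gives a 3×3-like receptive field at lower cost, while the band-mixing step is where inter-spectral redundancy would be modeled; in the actual network these would be learned layers rather than random weights.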
Lili ZHANG, Zikun CHEN, Tianpeng PAN, Lele QU. Multi-spectral image compression by fusing multi-scale feature convolutional neural networks[J]. Optics and Precision Engineering, 2024, 32(4): 622
Received: Jul. 20, 2023
Accepted: --
Published Online: Apr. 2, 2024
The Author Email: CHEN Zikun (chenzikun1@stu.sau.edu.com)