Laser & Optoelectronics Progress, Volume 59, Issue 22, 2230001 (2022)
Three dimensional-CNN Classification Method of Mural Multispectral Image Pigments Based on Multiscale Feature Fusion
The classification and recognition of pigments is the basis of the protection and restoration of ancient murals. Multispectral imaging can quickly and nondestructively acquire and analyze the spectral image data of mural pigments. In traditional convolutional neural network feature extraction, repeated convolution and pooling operations lose part of the feature information of mural multispectral images, so image details cannot be reconstructed and the boundaries in the classified image are not smooth. To solve this problem, a three-dimensional dilated ("hole") convolution residual neural network based on multiscale feature fusion is proposed to classify multispectral mural images. First, a dilated structure is introduced into the convolution kernel to enlarge the receptive field and extract information at different scales, avoiding the feature loss caused by pooling. Second, feature fusion is used to combine feature maps of different scales. Finally, a multilevel gradient of the feature map is introduced to prevent edges from disappearing. Experimental results on a multispectral image dataset of simulated murals show that the proposed method achieves an overall accuracy of 98.87% and an average accuracy of 96.89%. The proposed method not only outperforms the control groups in classification accuracy but also produces classification images with clearer boundaries.
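The abstract's key point is that dilation enlarges the receptive field without the information loss of pooling. As a minimal sketch (not the authors' code; layer configurations are illustrative assumptions), the standard receptive-field recursion shows how stacking dilated convolutions grows coverage much faster than plain ones:

```python
def receptive_field(layers):
    """Receptive field of a stack of convolutions.

    layers: list of (kernel_size, dilation, stride) tuples, in order.
    Uses the standard recursion: a dilated kernel of size k acts like an
    effective kernel of size d*(k-1)+1, and each layer adds
    (k_eff - 1) * cumulative_stride to the receptive field.
    """
    rf, jump = 1, 1
    for k, d, s in layers:
        k_eff = d * (k - 1) + 1      # effective kernel size with dilation d
        rf += (k_eff - 1) * jump     # growth contributed by this layer
        jump *= s                    # cumulative stride so far
    return rf

# Three stacked 3x3x3 convolutions, stride 1 (hypothetical configuration):
plain   = receptive_field([(3, 1, 1)] * 3)                       # no dilation
dilated = receptive_field([(3, 1, 1), (3, 2, 1), (3, 4, 1)])     # dilations 1,2,4
print(plain, dilated)  # → 7 15
```

With exponentially increasing dilations the receptive field more than doubles at the same parameter count and full spatial resolution, which is why no pooling (and hence no detail loss) is needed.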
Yunle Ding, Huiqin Wang, Ke Wang, Zhan Wang, Gang Zhen. Three dimensional-CNN Classification Method of Mural Multispectral Image Pigments Based on Multiscale Feature Fusion[J]. Laser & Optoelectronics Progress, 2022, 59(22): 2230001
Category: Spectroscopy
Received: Mar. 9, 2022
Accepted: May 9, 2022
Published Online: Oct. 26, 2022
The Author Email: Wang Huiqin (hqwang@xauat.edu.cn)