Laser & Optoelectronics Progress, Volume 60, Issue 16, 1610005 (2023)
Fine-Grained Fish Disease Image Recognition Algorithm Model
Identification of fish epidemics by the naked eye depends on the experience of diagnostic personnel, and the epidemic data exhibit fine-grained problems such as small inter-class differences and low recognition efficiency. The Transformer lacks the inductive bias of convolutional neural networks (CNNs) and therefore usually requires a large amount of training data, while the classification accuracy of CNNs is limited by insufficient global feature extraction and weak generalization performance. In this study, an algorithm model based on the global interaction of all pixels in the feature map is developed, and a fish epidemic recognition model (CViT-FDRM) combining a CNN with a Vision Transformer is proposed. First, FishData01, a database of fish epidemic images, is established. Second, a CNN is used to extract the fine-grained features of fish images, and the self-attention mechanism of the Transformer is used to capture global image information for parallel training. Then, a group normalization layer groups the sample channels to compute the mean and standard deviation. Finally, 404 fish epidemic images were used for testing, and CViT-FDRM achieved 97.02% recognition accuracy. Experimental results on Oxford Flowers, an open-source database of fine-grained images, show that CViT-FDRM reaches 95.42% classification accuracy, which is 4.84 percentage points higher than that of the standard fine-grained image classification algorithm. Therefore, CViT-FDRM performs well in fine-grained image recognition.
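The two mechanisms named in the abstract, self-attention over all pixels of a feature map and group normalization that computes per-group mean and standard deviation, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names, the absence of learned Q/K/V projections and affine parameters, and the NCHW layout are all simplifying assumptions for illustration.

```python
import numpy as np

def self_attention(x):
    """Scaled dot-product self-attention over all pixels of a feature map.
    x: (num_pixels, dim). Illustrative sketch: no learned Q/K/V projections,
    so queries, keys, and values are all the raw pixel features."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                  # pairwise pixel interactions
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over all pixels
    return weights @ x                             # globally mixed features

def group_norm(x, num_groups, eps=1e-5):
    """Group normalization over an NCHW tensor: channels are split into
    groups, and mean/std are computed per sample, per group (no affine)."""
    n, c, h, w = x.shape
    assert c % num_groups == 0, "channels must divide evenly into groups"
    g = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    std = g.std(axis=(2, 3, 4), keepdims=True)
    return ((g - mean) / (std + eps)).reshape(n, c, h, w)

# Example: a 4x4 feature map with 8 channels, flattened to 16 pixels
fmap = np.random.randn(1, 8, 4, 4)
pixels = fmap.reshape(8, 16).T          # (num_pixels, dim) = (16, 8)
mixed = self_attention(pixels)          # every pixel attends to every pixel
normed = group_norm(fmap, num_groups=4) # statistics shared within each group
```

Because the softmax weights couple every pixel with every other pixel, each output row already carries global context, which is the "global interaction of all pixels" the model relies on; group normalization then keeps statistics stable per sample, independent of batch size.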
Liming Wei, Kui Zhao, Ning Wang, Zhongyan Zhang, Haipeng Cui. Fine-Grained Fish Disease Image Recognition Algorithm Model[J]. Laser & Optoelectronics Progress, 2023, 60(16): 1610005
Category: Image Processing
Received: Sep. 26, 2022
Accepted: Nov. 23, 2022
Published Online: Aug. 15, 2023
The Author Email: Wei Liming (15865569879@163.com)