Journal of Optoelectronics · Laser, Vol. 35, Issue 8, 851 (2024)
Yarn-dyed fabric defect detection based on U-shaped attention gate auto-encoder
Existing unsupervised deep learning algorithms based on auto-encoders and generative adversarial networks suffer from poor generalizability and high missed-detection and false-detection rates in yarn-dyed fabric defect detection. To address these issues, a yarn-dyed fabric defect detection algorithm based on a U-shaped attention gate auto-encoder (UAGAE) is proposed. First, the lightweight network EfficientNet-B6 is employed as the feature extraction module to capture more representative features from input images. An attention gate (AG) mechanism is introduced to suppress feature responses in non-target regions, using decoder features as a reference to remove redundant information from the skip connections and thereby aid image reconstruction. Then, during the training phase, a combined loss function is used to preserve both the structure and the details of the reconstructed images. Finally, during the detection phase, the final detection results are obtained through adaptive threshold segmentation and mathematical morphology operations. The proposed algorithm achieves a precision (P) of 53.45%, recall (R) of 61.58%, F1-measure (F1) of 53.63%, and mean intersection over union (IoU) of 40.83% on the public dataset YDFID-1, attaining the highest F1 and IoU across 14 different fabric patterns. Comparative experiments indicate that, relative to several other defect detection algorithms, UAGAE performs yarn-dyed fabric defect detection and localization more effectively.
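To illustrate the skip-connection gating described in the abstract, below is a minimal PyTorch sketch of a standard additive attention gate of the kind used in Attention U-Net, where decoder features re-weight encoder skip features before they are fused. The abstract does not specify the exact gate design, channel sizes, or feature resolutions used in UAGAE, so the module name, parameters, and the assumption that the two inputs share the same spatial size are illustrative only.

```python
import torch
import torch.nn as nn


class AttentionGate(nn.Module):
    """Additive attention gate sketch (Attention U-Net style).

    Re-weights encoder skip-connection features x using decoder features g,
    suppressing responses in non-target regions before the features are
    passed along the skip connection for reconstruction.
    """

    def __init__(self, x_channels: int, g_channels: int, inter_channels: int):
        super().__init__()
        # 1x1 convolutions project both inputs to a common intermediate space
        self.theta_x = nn.Conv2d(x_channels, inter_channels, kernel_size=1)
        self.phi_g = nn.Conv2d(g_channels, inter_channels, kernel_size=1)
        # psi collapses the joint features to a single-channel attention map
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor, g: torch.Tensor) -> torch.Tensor:
        # Assumption: x (encoder skip) and g (decoder/gating signal) have the
        # same spatial resolution; otherwise g would need to be resized first.
        att = self.relu(self.theta_x(x) + self.phi_g(g))
        att = self.sigmoid(self.psi(att))  # per-pixel coefficients in [0, 1]
        return x * att                     # gated skip features


if __name__ == "__main__":
    # Toy shapes for a quick shape check; real channel counts depend on the
    # EfficientNet-B6 encoder stages, which the abstract does not enumerate.
    x = torch.randn(1, 64, 32, 32)   # encoder skip features
    g = torch.randn(1, 128, 32, 32)  # decoder features at the same scale
    gate = AttentionGate(x_channels=64, g_channels=128, inter_channels=32)
    print(gate(x, g).shape)  # torch.Size([1, 64, 32, 32])
```

The gated output would then be concatenated with the decoder features in the U-shaped architecture, which matches the abstract's description of using decoder features as a reference to eliminate redundant information in the skip connections.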
ZHANG Yue, WANG Shihao, LI Yingjian, LIU Shuaibo, ZHANG Hongwei. Yarn-dyed fabric defect detection based on U-shaped attention gate auto-encoder[J]. Journal of Optoelectronics · Laser, 2024, 35(8): 851
Received: Jun. 19, 2023
Accepted: Dec. 13, 2024
Published Online: Dec. 13, 2024
The Author Email: ZHANG Yue (zhangyue@xpu.edu.cn)