Acta Optica Sinica, Volume 44, Issue 13, 1312002 (2024)
Digital Photoelastic Stress Field Analysis Method Based on SANet
The digital photoelasticity method combines optics with digital image processing. Digital image processing and numerical computation enable accurate analysis of optical interference patterns, from which accurate stress distribution information can be obtained. This is significant for stress analysis problems in scientific research, engineering design, and material testing. However, the current digital photoelasticity method adopts a divide-and-conquer strategy, splitting the full analysis into several substeps such as phase shifting, phase unwrapping, and stress separation. Each substep imposes strict requirements on the experimental environment (for example, low noise), and the accuracy of each stage is limited by the results of the preceding stage, so intermediate errors generated at each stage propagate into the final stress components. With the development of artificial intelligence, deep learning has gradually been applied to digital photoelasticity. However, existing deep learning models have only addressed the calculation of the stress difference, and traditional stress separation methods are still needed to obtain the normal and shear stresses. Therefore, we propose SANet, a multi-branch deep learning model based on an encoder-decoder architecture, together with a simulation dataset construction method for the stress analysis task. The model improves the efficiency and robustness of stress component calculation while maintaining accuracy.
The proposed method mainly exploits the feature extraction ability of convolutional neural networks. Building on UNet, residual blocks replace the convolutional modules of the encoder and decoder, which accelerates convergence and improves the feature representation ability of the model. Multiple output layers are added in the output part to adapt the network to the stress component calculation task. Meanwhile, a simulation dataset is generated from the theoretical stress formula for a disc under diametral compression, and the dataset is expanded by operations such as rotation, translation, and cropping to provide data-driven support for SANet. Finally, the L2 loss is adopted as the loss function of each branch of the network, and the total loss is the weighted sum of the loss functions of the three branches.
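The weighted multi-branch loss described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the branch weights, array shapes, and function names are assumptions introduced here.

```python
import numpy as np

def l2_loss(pred, target):
    """Mean squared (L2) loss over one output branch."""
    return float(np.mean((pred - target) ** 2))

def sanet_total_loss(preds, targets, weights=(1.0, 1.0, 1.0)):
    """Weighted sum of the per-branch L2 losses for the three outputs
    (sigma_x, sigma_y, tau_xy); the weights here are illustrative."""
    return sum(w * l2_loss(p, t)
               for w, p, t in zip(weights, preds, targets))

# Toy example: three 4x4 "stress maps" for one sample.
rng = np.random.default_rng(0)
targets = [rng.normal(size=(4, 4)) for _ in range(3)]
preds = [t + 0.1 for t in targets]   # constant error of 0.1 per pixel
total = sanet_total_loss(preds, targets)
# each branch loss = 0.1**2 = 0.01, so total = 0.03
```

With equal unit weights the total reduces to the sum of the three branch MSEs; tuning the weights lets the training balance the normal-stress branches against the shear-stress branch.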
The experimental results on the simulation test set indicate that SANet can calculate the normal stresses and the shear stress (Fig. 5). In the noise experiment, our model achieves the lowest MSE and the highest PSNR and SSIM (Table 3). The model is further tested on a noise test set with zero-mean noise of increasing standard deviation (Fig. 7), which indicates that our model has strong noise tolerance. Finally, tests are conducted on real data (Fig. 8). Compared with traditional phased processing methods, the proposed method avoids the error-prone phase unwrapping and stress separation stages and computes the stress components in one step.
We propose a deep learning method for calculating stress components. The method introduces residual connections into UNet and changes the output part to a multi-branch structure to adapt it to the stress component calculation task. To train the model, we construct a simulation training set using the radial compression disc formula and data augmentation. In addition, we compare the proposed method with two traditional phased methods on the simulation test set, the noise test sets, and real test data. The results show that, compared with traditional phased processing methods, SANet achieves the highest accuracy and better robustness in calculating stress components.
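The simulated training targets rest on the classical closed-form stress field of a disc under diametral compression. A minimal sketch of that field follows, using the standard textbook solution; the paper's exact sign convention, symbols, and load parameters are assumptions here.

```python
import numpy as np

def disc_stress(x, y, P=1000.0, R=0.025, h=0.005):
    """Stress components (sigma_x, sigma_y, tau_xy) at a point (x, y)
    inside a disc of radius R and thickness h, compressed by a pair of
    diametral forces P applied along the y-axis. Classical closed-form
    solution for the diametrally loaded disc (SI units assumed)."""
    r1sq = x**2 + (R - y)**2          # squared distance to top load point
    r2sq = x**2 + (R + y)**2          # squared distance to bottom load point
    c = 2.0 * P / (np.pi * h)
    sx = -c * (x**2 * (R - y) / r1sq**2
               + x**2 * (R + y) / r2sq**2 - 1.0 / (2.0 * R))
    sy = -c * ((R - y)**3 / r1sq**2
               + (R + y)**3 / r2sq**2 - 1.0 / (2.0 * R))
    txy = c * (x * (R - y)**2 / r1sq**2
               - x * (R + y)**2 / r2sq**2)
    return sx, sy, txy

# At the disc centre the classic Brazilian-disc results hold:
# sigma_x = +2P/(pi*D*h) (tension), sigma_y = -6P/(pi*D*h), tau_xy = 0.
sx0, sy0, t0 = disc_stress(0.0, 0.0)
```

Evaluating such expressions on a pixel grid gives exact stress-component maps, which (after rotation, translation, and cropping) serve as ground truth for the three SANet output branches.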
Haoxing He, Niannian Chen, Ling Wu, Yong Fan, Xuejiao Zhang, Chuan Qiu. Digital Photoelastic Stress Field Analysis Method Based on SANet[J]. Acta Optica Sinica, 2024, 44(13): 1312002
Category: Instrumentation, Measurement and Metrology
Received: Jan. 9, 2024
Accepted: Mar. 21, 2024
Published Online: Jul. 4, 2024
Corresponding author: Niannian Chen (chenniannian@swust.edu.cn)