Laser & Optoelectronics Progress, Vol. 60, Issue 23, 2306003 (2023)
Orbital Angular Momentum Multi-Superposition Identification Method Based on Attention Mechanism
In free-space optical communication, superimposed vortex beams with different radius combinations can transmit more information for the same channel overhead. However, a vortex beam undergoes phase distortions caused by atmospheric turbulence, which degrades the ability to identify its orbital angular momentum (OAM) modes. Existing models cannot precisely identify the modes of OAM superimposed beams perturbed by random atmospheric turbulence. Therefore, a deep learning recognition method based on an attention mechanism is proposed. An attention module is embedded in VGG-16 to improve the model's ability to perceive superimposed beam modes in different states. In addition, atmospheric turbulence is modeled using the power spectrum inversion method to approximate the actual turbulence state, and subharmonics are used to compensate for the missing low-frequency content of the random phase screens. An OAM superimposed beam dataset affected by random turbulence is established, and the proposed model is trained on it. The experimental results show that, under unknown atmospheric turbulence intensity, the proposed method improves recognition accuracy by up to 4.46% over traditional methods, demonstrating its effectiveness for identifying OAM superimposed beams. The proposed model also exhibits good robustness and generalization ability. This study provides a new method for identifying OAM modes.
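The abstract does not specify which attention module is embedded in VGG-16, so the following is only an illustrative sketch of one common choice, squeeze-and-excitation (SE) channel attention, in plain NumPy. All shapes, the reduction ratio `r`, and the weight initialization are assumptions for demonstration, not the paper's actual configuration.

```python
import numpy as np

def se_channel_attention(feat, w1, b1, w2, b2):
    """SE-style channel attention on a (C, H, W) feature map:
    squeeze (global average pool) -> excitation (bottleneck MLP) -> reweight."""
    z = feat.mean(axis=(1, 2))                    # squeeze: (C, H, W) -> (C,)
    h = np.maximum(0.0, w1 @ z + b1)              # excitation: FC to C/r units + ReLU
    s = 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))      # FC back to C units + sigmoid, weights in (0, 1)
    return feat * s[:, None, None], s             # rescale each channel by its weight

# Toy usage: a 64-channel feature map, as might come out of an early VGG-16 block
rng = np.random.default_rng(0)
C, r = 64, 16                                     # r: bottleneck reduction ratio (assumed)
feat = rng.standard_normal((C, 8, 8))
w1, b1 = rng.standard_normal((C // r, C)) * 0.1, np.zeros(C // r)
w2, b2 = rng.standard_normal((C, C // r)) * 0.1, np.zeros(C)
out, weights = se_channel_attention(feat, w1, b1, w2, b2)
```

In a real model, such a module would typically be inserted after a convolutional block so the learned channel weights emphasize features that discriminate between superimposed OAM modes.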
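The turbulence simulation the abstract describes, power spectrum inversion plus subharmonic low-frequency compensation, can be sketched as follows. This is a minimal NumPy version assuming a Kolmogorov phase spectrum and a Lane-style 3x3 subharmonic grid; the grid size, sampling interval, Fried parameter `r0`, and number of subharmonic levels are illustrative choices, not values from the paper.

```python
import numpy as np

def kolmogorov_psd(kr, r0):
    # Kolmogorov phase power spectral density: 0.023 * r0^(-5/3) * kappa^(-11/3)
    return 0.023 * r0 ** (-5.0 / 3.0) * kr ** (-11.0 / 3.0)

def phase_screen_fft(N, dx, r0, seed=None):
    """Random phase screen via the power-spectrum inversion (FFT filtering) method."""
    rng = np.random.default_rng(seed)
    dk = 2.0 * np.pi / (N * dx)                      # spatial-frequency grid spacing
    k = np.fft.fftfreq(N, d=dx) * 2.0 * np.pi
    kx, ky = np.meshgrid(k, k)
    kr = np.hypot(kx, ky)
    kr[0, 0] = 1.0                                   # dummy value to avoid divide-by-zero at DC
    amp = np.sqrt(kolmogorov_psd(kr, r0)) * dk       # filter amplitude per frequency bin
    amp[0, 0] = 0.0                                  # remove the piston (DC) term
    cn = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) * amp
    return np.real(np.fft.ifft2(cn)) * N * N         # undo NumPy's 1/N^2 ifft2 normalization

def subharmonic_correction(N, dx, r0, n_levels=3, seed=None):
    """Compensate the low frequencies the coarse FFT grid misses (3x3 sub-grids per level)."""
    rng = np.random.default_rng(seed)
    x = (np.arange(N) - N / 2.0) * dx
    X, Y = np.meshgrid(x, x)
    lo = np.zeros((N, N))
    for p in range(1, n_levels + 1):
        dkp = 2.0 * np.pi / (3.0 ** p * N * dx)      # ever-finer sub-grid frequency spacing
        for i in (-1.0, 0.0, 1.0):
            for j in (-1.0, 0.0, 1.0):
                kr = np.hypot(i * dkp, j * dkp)
                if kr == 0.0:
                    continue                          # skip the DC component
                amp = np.sqrt(kolmogorov_psd(kr, r0)) * dkp
                c = (rng.standard_normal() + 1j * rng.standard_normal()) * amp
                lo += np.real(c * np.exp(1j * (i * dkp * X + j * dkp * Y)))
    return lo

# Example: 256x256 screen, 1 cm sampling, Fried parameter r0 = 5 cm (assumed values)
screen = (phase_screen_fft(256, 0.01, 0.05, seed=0)
          + subharmonic_correction(256, 0.01, 0.05, seed=1))
```

A dataset like the one described in the abstract could then be built by multiplying superimposed-vortex-beam fields by `exp(1j * screen)` for many random screens of varying strength.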
Xu Zhou, Chunyi Chen, Haiyang Yu, Xiaolong Ni, Xiaojuan Hu. Orbital Angular Momentum Multi-Superposition Identification Method Based on Attention Mechanism[J]. Laser & Optoelectronics Progress, 2023, 60(23): 2306003
Category: Fiber Optics and Optical Communications
Received: Nov. 18, 2022
Accepted: Dec. 30, 2022
Published Online: Dec. 4, 2023
Author email: Chunyi Chen (chenchunyi@hotmail.com)