Chinese Journal of Liquid Crystals and Displays, Vol. 38, Issue 9, 1248 (2023)
Background defocus method of image perception guided CycleGAN network
Existing image-to-image translation algorithms based on generative adversarial networks often extract features from the whole input image indiscriminately during background defocus, which makes it difficult for the network to distinguish the foreground from the background and easily leads to image distortion. We propose a background defocus method based on an image-perception-guided CycleGAN network, in which image perception information is introduced to improve the performance of the model. The perception information comprises attention information and depth-of-field information: the former guides the network to attend to the foreground and background regions separately, so that the two can be distinguished, while the latter enhances the perception of foreground targets, enabling effective intelligent focusing, reducing image distortion, and producing a better defocus effect. Experimental results show that the proposed method can effectively separate the foreground from the background during background defocus, reduce image distortion, and generate more realistic results. In addition, a questionnaire survey is used to compare the generated images with those of existing methods; against state-of-the-art approaches, the proposed method achieves the best image quality, and its model size (56.10 MB) and per-image generation time (47 ms) also offer clear advantages.
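To make the guidance mechanism concrete, the following PyTorch sketch shows one way attention and depth-of-field cues could be injected into a CycleGAN-style generator. The abstract does not specify the authors' implementation, so the module structure, channel layout, and attention-based blending rule below are illustrative assumptions, not the paper's actual architecture.

import torch
import torch.nn as nn

class PerceptionGuidedGenerator(nn.Module):
    """Sketch of a generator conditioned on image-perception cues.

    Inputs:
        image:     (B, 3, H, W) all-in-focus photo
        attention: (B, 1, H, W) soft foreground/background mask
        depth:     (B, 1, H, W) normalized depth-of-field map
    """
    def __init__(self, base_channels: int = 64):
        super().__init__()
        # 3 RGB channels + 1 attention channel + 1 depth channel = 5
        self.encoder = nn.Sequential(
            nn.Conv2d(5, base_channels, kernel_size=7, padding=3),
            nn.InstanceNorm2d(base_channels),
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.Conv2d(base_channels, 3, kernel_size=7, padding=3),
            nn.Tanh(),
        )

    def forward(self, image, attention, depth):
        # Concatenating the perception cues lets the network tell the
        # sharp foreground apart from the region to be defocused.
        x = torch.cat([image, attention, depth], dim=1)
        residual = self.decoder(self.encoder(x))
        # Attention-weighted blend: the foreground stays close to the
        # input, while changes are confined mostly to the background.
        return image * attention + (image + residual) * (1.0 - attention)

if __name__ == "__main__":
    g = PerceptionGuidedGenerator()
    img = torch.randn(1, 3, 256, 256)
    attn = torch.rand(1, 1, 256, 256)
    dep = torch.rand(1, 1, 256, 256)
    print(g(img, attn, dep).shape)  # torch.Size([1, 3, 256, 256])

In a full CycleGAN setup, two such generators and two discriminators would be trained with adversarial and cycle-consistency losses; the blend at the output is one simple way to realize the paper's goal of keeping foreground targets undistorted while defocusing the background.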
Wu-jian YE, Zhen-yi LIN, Yi-jun LIU, Cheng-min LIU. Background defocus method of image perception guided CycleGAN network[J]. Chinese Journal of Liquid Crystals and Displays, 2023, 38(9): 1248
Category: Research Articles
Received: Dec. 3, 2022
Accepted: --
Published Online: Sep. 19, 2023
The Author Email: Wu-jian YE (yewjian@126.com)