Laser & Optoelectronics Progress, Volume 59, Issue 18, 1810007 (2022)
Real-Time Indoor Scene Layout Estimation Based on Improved Lightweight Network
This study proposes a real-time layout estimation method based on an improved lightweight network that simplifies the network structure of layout estimation and makes better use of output features. A lightweight encoder-decoder network was used to obtain the main plane segmentation of indoor scenes directly, end to end, enabling real-time layout estimation. To address the low feature utilization of previous joint learning methods, a simplified joint learning module was introduced: the gradient of the output segmentation map was taken as the predicted edge map, and the edge loss was integrated directly into the overall network loss, improving feature utilization while simplifying the joint learning network. To counter the imbalance between positive and negative labels in the dataset and the uneven distribution of layout types, and to improve the stability of network training, semantic transfer was used to initialize the network parameters with those of a semantic segmentation network trained on the LSUN dataset. The performance of the proposed method was evaluated on two benchmark datasets. The results show that the average pixel error of the proposed method is 7.35% on the LSUN dataset and 8.32% on the Hedau dataset. Ablation experiments confirm the effectiveness of hierarchical supervision, simplified joint learning, and the semantic transfer mechanism in improving accuracy. Finally, the experimental results show that the proposed method can estimate accurate indoor scene layouts in real time.
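The abstract's core idea of deriving edges from the segmentation output and folding an edge loss into the total loss can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the use of plain NumPy instead of a deep learning framework, and the binary cross-entropy form of the edge loss are all illustrative assumptions; only the principle (edge map = spatial gradient of the segmentation map, edge loss added to the output loss) comes from the abstract.

```python
import numpy as np

def edges_from_segmentation(seg):
    """Illustrative: derive a binary edge map from a plane-segmentation
    label map via its spatial gradient — a pixel is marked as an edge
    wherever the label changes horizontally or vertically."""
    dy = np.zeros(seg.shape, dtype=bool)
    dx = np.zeros(seg.shape, dtype=bool)
    dy[1:, :] = seg[1:, :] != seg[:-1, :]   # vertical label change
    dx[:, 1:] = seg[:, 1:] != seg[:, :-1]   # horizontal label change
    return (dy | dx).astype(np.float32)

def edge_loss(pred_edges, gt_edges, eps=1e-7):
    """Illustrative edge-loss term (binary cross-entropy between predicted
    and ground-truth edge maps); in the method described above, such a
    term is integrated into the network's overall output loss."""
    p = np.clip(pred_edges, eps, 1.0 - eps)
    return float(-np.mean(gt_edges * np.log(p) + (1.0 - gt_edges) * np.log(1.0 - p)))

# Toy 4x4 layout segmentation with two planes (labels 0 and 1):
seg = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 1, 1]])
edges = edges_from_segmentation(seg)   # edges along the 0/1 boundary column
loss = edge_loss(edges, edges)         # loss is small when prediction matches
```

Because the edge map is a deterministic function of the segmentation output, no separate edge-prediction branch is needed, which is what allows the joint learning network to be simplified.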
Youjun Yue, Jie Zhang, Hui Zhao, Hongjun Wang. Real-Time Indoor Scene Layout Estimation Based on Improved Lightweight Network[J]. Laser & Optoelectronics Progress, 2022, 59(18): 1810007
Category: Image Processing
Received: Jul. 8, 2021
Accepted: Jul. 28, 2021
Published Online: Aug. 31, 2022
The Author Email: Zhang Jie (2580690058@qq.com)