Laser & Optoelectronics Progress, Vol. 57, Issue 2, 21012 (2020)
Unsupervised Monocular Depth Estimation for Autonomous Flight of Drones
Fig. 4. Loss function of each part of training process. (a) Structural similarity loss of reconstructed image and original image; (b) absolute value loss of difference between reconstructed image and original image; (c) total image reconstruction loss; (d) loss of disparity smoothness; (e) loss of consistency in left and right disparity maps; (f) total loss of our model
Fig. 5. Drone experiment platform. (a) Drone; (b) connection of NVIDIA Jetson TX2 and Pixhawk
Fig. 6. Examples of depth maps predicted on the KITTI dataset. (a) Input image; (b) ground-truth depth map; (c) depth map predicted by Ref. [15]; (d) depth map predicted by Ref. [20]; (e) depth map predicted by our model based on VGG-16; (f) depth map predicted by our model based on ResNet-50
Fig. 7. Examples of depth maps predicted in real outdoor scenes. (a) Input images; (b) ground-truth depth maps
Zhao Shuanfeng, Huang Tao, Xu Qian, Geng Longlong. Unsupervised Monocular Depth Estimation for Autonomous Flight of Drones[J]. Laser & Optoelectronics Progress, 2020, 57(2): 21012
Category: Image Processing
Received: May 27, 2019
Accepted: --
Published Online: Jan. 3, 2020
The Author Email: Huang Tao (775628393@qq.com)