Infrared and Laser Engineering, Volume 50, Issue 11, 20210021 (2021)
Light field depth estimation of fusing consistency and difference constraints
[1] Ng R, Levoy M, Brédif M, et al. Light field photography with a hand-held plenoptic camera[R]. US: Stanford University, 2005.
[2] Lytro. The Lytro camera[EB/OL]. [2011-10-21]. https://www.lytro.com.
[3] Raytrix. 3D light field camera technology[EB/OL]. [2013-05-05]. https://www.raytrix.de.
[4] Wang Jiahua, Du Shaojun, Zhang Xuanzhe, et al. Design of focused light field computational imaging system with four-types focal lengths[J]. Infrared and Laser Engineering, 48, 0218003(2019).
[5] Zhang Xuanzhe, Wang Yan, Wang Jiahua, et al. Image clarification and point cloud calculation under turbulence by light field camera[J]. Infrared and Laser Engineering, 49, 20200053(2020).
[6] Bolles R C, Baker H H, Marimont D H. Epipolar-plane image analysis: An approach to determining structure from motion[J]. International Journal of Computer Vision, 1, 7-55(1987).
[7] Jeon H G, Park J, Choe G, et al. Accurate depth map estimation from a lenslet light field camera[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2015: 1547-1555.
[8] Chen C, Lin H, Yu Z, et al. Light field stereo matching using bilateral statistics of surface cameras[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2014: 1518-1525.
[9] Zhang S, Sheng H, Yang D, et al. Micro-lens-based matching for scene recovery in lenslet cameras[J]. IEEE Transactions on Image Processing, 27, 1060-1075(2017).
[10] Fan Xiaoting, Li Yi, Luo Xiaowei, et al. Depth estimation based on light field structure characteristic and multiview matching[J]. Infrared and Laser Engineering, 48, 0524001(2019).
[11] Tao M W, Hadap S, Malik J, et al. Depth from combining defocus and correspondence using light-field cameras[C]//Proceedings of the IEEE International Conference on Computer Vision, 2013: 673-680.
[12] Tao M W, Srinivasan P P, Malik J, et al. Depth from shading, defocus, and correspondence using light-field angular coherence[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2015: 1940-1948.
[13] Wang T C, Efros A A, Ramamoorthi R. Occlusion-aware depth estimation using light-field cameras[C]//Proceedings of the IEEE International Conference on Computer Vision, 2015: 3487-3495.
[14] Williem W, Park I K. Robust light field depth estimation for noisy scene with occlusion[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016: 4396-4404.
[15] Park I K, Lee K M. Robust light field depth estimation using occlusion-noise aware data costs[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 40, 2484-2497(2017).
[16] Guo Z, Wu J, Chen X, et al. Accurate light field depth estimation using multi-orientation partial angular coherence[J]. IEEE Access, 7, 169123(2019).
[17] Sheng H, Zhang S, Cao X, et al. Geometric occlusion analysis in depth estimation using integral guided filter for light-field image[J]. IEEE Transactions on Image Processing, 26, 5758-5771(2017).
[18] Chen J, Hou J, Ni Y, et al. Accurate light field depth estimation with superpixel regularization over partially occluded regions[J]. IEEE Transactions on Image Processing, 27, 4889-4900(2018).
[19] Wanner S, Goldluecke B. Globally consistent depth labeling of 4D light fields[C]//2012 IEEE Conference on Computer Vision and Pattern Recognition, 2012: 41-48.
[20] Wanner S, Goldluecke B. Variational light field analysis for disparity estimation and super-resolution[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 36, 606-619(2013).
[21] Kim C, Zimmer H, Pritch Y, et al. Scene reconstruction from high spatio-angular resolution light fields[J]. ACM Transactions on Graphics, 32, 1-73(2013).
[22] Huang Z, Lin C W, Shao H C, et al. Consistency constrained reconstruction of depth maps from epipolar plane images[C]//ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019: 2292-2296.
[23] Zhang S, Sheng H, Li C, et al. Robust depth estimation for light field via spinning parallelogram operator[J]. Computer Vision and Image Understanding, 145, 148-159(2016).
[24] Sheng H, Zhao P, Zhang S, et al. Occlusion-aware depth estimation for light field using multi-orientation EPIs[J]. Pattern Recognition, 74, 587-599(2018).
[25] Li J, Jin X. EPI-neighborhood distribution based light field depth estimation[C]//ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020: 2003-2007.
[26] Schilling H, Diebold M, Rother C, et al. Trust your model: Light field depth estimation with inline occlusion handling[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018: 4530-4538.
[27] Martin D R, Fowlkes C C, Malik J. Learning to detect natural image boundaries using local brightness, color, and texture cues[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 26, 530-549(2004).
[28] He K, Sun J, Tang X. Guided image filtering[C]//European Conference on Computer Vision, 2010: 1-14.
[29] Boykov Y, Veksler O, Zabih R. Fast approximate energy minimization via graph cuts[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23, 1222-1239(2001).
[30] Honauer K, Johannsen O, Kondermann D, et al. A dataset and evaluation methodology for depth estimation on 4D light fields[C]//Asian Conference on Computer Vision, 2016: 19-34.
Zeyang He, Huiping Deng, Sen Xiang, Jin Wu. Light field depth estimation of fusing consistency and difference constraints[J]. Infrared and Laser Engineering, 2021, 50(11): 20210021
Category: Image processing
Received: May 25, 2021
Accepted: --
Published Online: Dec. 7, 2021