Optics and Precision Engineering, Volume 30, Issue 20, 2538 (2022)
Overview of visual pose estimation methods for space missions
[1] WANG Y D. Study on binocular visual inertial odometry of multi-pose information fusion[D]. Beijing: University of Chinese Academy of Sciences, 2019: 3-11. (in Chinese)
[2] HAO Y M, FU S F, FAN X P, et al. Vision perception technology for space manipulator on-orbit service operations[J]. Unmanned Systems Technology, 2018, 1(1): 54-65. (in Chinese)
[3] LIU D Y, LIU H, HE Y, et al. Visual servoing control strategy for on-orbit servicing of space manipulator system[J]. Robot, 2018, 40(5): 742-749. (in Chinese)
[4] FENG X, LU S, HOU Y Y, et al. Visual servoing and coordination control of multi-arm space robot[J]. Journal of Astronautics, 2018, 39(2): 206-215. (in Chinese). doi: 10.3873/j.issn.1000-1328.2018.02.011
[5] YAN K. Research on pose measurement of space non-cooperative target based on binocular vision[D]. Chengdu: University of Chinese Academy of Sciences, 2018: 2-14. (in Chinese)
[6] XIE X N, LIU X, CHANG Z W, et al. A vision assisted positioning method with autonomous navigation of electric robot[J]. Sichuan Electric Power Technology, 2020, 43(3): 48-52. (in Chinese)
[7] ZHANG H Y, LIU X. Autonomous navigation system of quadrotor UAV based on vision processing[J]. Wireless Internet Technology, 2020, 17(9): 35-40. (in Chinese). doi: 10.3969/j.issn.1672-6944.2020.09.016
[8] HOU Y H, LIU Y, LÜ H L, et al. An autonomous navigation system for UAVs based on binocular vision[J]. Journal of Tianjin University (Science and Technology), 2019, 52(12): 1262-1269. (in Chinese)
[9] HOU B W, WANG J Q, ZHOU H Y, et al. Autonomous navigation method of Mars orbit based on landmarks[J]. Control Theory & Applications, 2019, 36(12): 1988-1996. (in Chinese)
[10] XU C, WANG D Y, HUANG X Y. Autonomous navigation for Mars pin-point landing based on landmark image[J]. Journal of Deep Space Exploration, 2016, 3(2): 150-155. (in Chinese)
[11] WANG D Y, XU C, HUANG X Y. Overview of autonomous navigation based on sequential images for planetary landing[J]. Journal of Harbin Institute of Technology, 2016, 48(4): 1-12. (in Chinese). doi: 10.11918/j.issn.0367-6234.2016.04.001
[12] ZHU Z S. Research on vision-based navigation methods for aircrafts using 3D terrain reconstruction and matching[D]. Changsha: National University of Defense Technology, 2014: 5-16. (in Chinese)
[13] WANG L, ZHANG S S, PAN Y F, et al. Research on vision inspection system of industrial robots[J]. Electronic Technology & Software Engineering, 2019(22): 86-88. (in Chinese)
[14] ZHANG W, HAN Z W, CHENG X, et al. Research on straightness error measurement of part axis based on machine vision[J]. Opt. Precision Eng., 2021, 29(9): 2168-2177. (in Chinese). doi: 10.37188/OPE.20212909.2168
[15] LI Y, CHENG Z, ZHOU W H, et al. A fast and robust method for detecting elliptical character of cooperative targets in industrial complex background[J]. Opt. Precision Eng., 2021, 29(8): 1910-1920. (in Chinese). doi: 10.37188/OPE.20212908.1910
[16] WU H B, XU K Y, YU S, et al. Review of projection display technology in augmented reality surgical navigation system[J]. Opt. Precision Eng., 2021, 29(9): 2019-2038. (in Chinese). doi: 10.37188/OPE.20212909.2019
[17] QU X, WU N P. Observation of the effect of computer vision in the medical field[J]. China Health Industry, 2020, 17(13): 168-170. (in Chinese)
[18] LUO X Y, ZHANG L J, HE X B, et al. Fast recognition of characteristic targets in spatial rendezvous and docking posture measurement[J]. Application of Electronic Technique, 2019, 45(10): 83-87. (in Chinese)
[19] E W. Vision measurement method of noncooperative spacecraft and flexible tether in automatic rendezvous[D]. Harbin: Harbin Institute of Technology, 2019: 4-8. (in Chinese)
[20] WANG M M, LUO J J, YUAN J P, et al. In-orbit assembly technology: review[J]. Acta Aeronautica et Astronautica Sinica, 2021, 42(1): 523913. (in Chinese). doi: 10.7527/S1000-6893.2020.23913
[21] JIA Q X, DUAN J Q, CHEN G. Uncalibrated visual servo of space robots performing on-orbit assembly alignment task[J]. Acta Aeronautica et Astronautica Sinica, 2021, 42(6): 424063. (in Chinese)
[22] CHEN M, CHE S Y. Research on tracking control strategy of uncalibrated robot visual servo system[J]. Control Engineering of China, 2019, 26(6): 1055-1059. (in Chinese)
[23] LIU D Y, LIU H, LI Z Q. Calibration strategy of space manipulator system for on-orbit servicing fine operation[J]. Journal of Astronautics, 2017, 38(6): 630-637. (in Chinese). doi: 10.3873/j.issn.1000-1328.2017.06.010
[24] MA L. On-orbit servicing mission planning and modeling visualization for space intelligent robot[D]. Xi'an: Xidian University, 2019: 2-8. (in Chinese)
[25] WU G Q. Research on rendezvous trajectory optimization and close-range safe approach control of spacecraft for on-orbit servicing[D]. Harbin: Harbin Institute of Technology, 2019: 1-9. (in Chinese)
[26] YU D W. Vision simulation and detection of circular target for on-orbit service[D]. Harbin: Harbin Institute of Technology, 2019: 17-22. (in Chinese)
[27] LU Y, MA G L, ZENG G Y, et al. Research on multi-sensor combination method for estimating relative pose[J]. Journal of Applied Optics, 2020, 41(3): 565-570. (in Chinese). doi: 10.5768/jao202041.0303004
[28] DU X P, ZHAO S Y, SONG Y S. Comparative studies on the relative-pose measurements in foreign space operations for targets[J]. Journal of Academy of Equipment, 2013, 24(5): 58-62. (in Chinese). doi: 10.3783/j.issn.1673-0127.2013.05.013
[29] ZHOU Z P. The development of microwave and millimeter-wave semiconductor devices[J]. Journal of Microwaves, 2020, 36(1): 74-77. (in Chinese)
[30] WANG S J. Pose measurement of the target based on planar array 3D imaging LIDAR[D]. Chengdu: University of Electronic Science and Technology of China, 2020: 105-128. (in Chinese)
[31] LI X Y. Design and implementation of the visual measurement algorithm based on the docking ring of space objects[D]. Harbin: Harbin Institute of Technology, 2018: 2-5. (in Chinese)
[32] WU X Q, YIN S B, REN Y J, et al. Pose measurement of directional antenna based on multi-vision[J]. Automation & Instrumentation, 2019(5): 1-6. (in Chinese)
[33] WANG C, CHEN F, WU J J, et al. Progress in mechanism and data processing of visual sensing[J]. Journal of Image and Graphics, 2020, 25(1): 19-30. (in Chinese). doi: 10.11834/jig.190404
[34] HUANG X L. Research on relative navigation algorithm of non-cooperative spacecraft based on vision measurement[D]. Harbin: Harbin Institute of Technology, 2020: 3-7. (in Chinese)
[35] TIAN F Q. Guidance control design of contact attitude-track measuring device for non-cooperative target[D]. Harbin: Harbin Institute of Technology, 2018: 3-5. (in Chinese)
[36] LV Y Y. Research on mono-vision pose measurement for space cooperative target[D]. Beijing: University of Chinese Academy of Sciences (Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences), 2018: 1-8. (in Chinese)
[37] LIU C C. Research on visual recognition technology of pose and motion parameters of tumbling and noncooperative target in space[D]. Harbin: Harbin Institute of Technology, 2017: 2-8. (in Chinese)
[38] SUN Y J, WANG Q, LIU Y W, et al. A survey of non-cooperative target capturing methods[J]. Journal of National University of Defense Technology, 2020, 42(3): 74-90. (in Chinese). doi: 10.11887/j.cn.202003010
[39] BELAND S, DUPUIS E, DUNLOP J, et al. Canada's space robots: from dexterous robots on the International Space Station to planetary exploration robots[J]. Control Engineering, 2001, 27(2): 22-29. (in Chinese)
[40] ZHOU J Y, ZHANG B, JIANG Z C, et al. Researches on teleoperation rendezvous and docking system[J]. Journal of National University of Defense Technology, 2012, 34(3): 24-28. (in Chinese). doi: 10.3969/j.issn.1001-2486.2012.03.005
[41] C H DELEE, P BARFKNECHT, S BREON et al. Techniques for on-orbit cryogenic servicing. Cryogenics, 64, 289-294(2014).
[42] WU W R, ZHOU J L, WANG B F, et al. Key technologies in the teleoperation of Chang'E-3 "Jade Rabbit" rover[J]. Scientia Sinica (Informationis), 2014, 44(4): 425-440. (in Chinese)
[43] ZHU R Z, WANG H F, XU Y J, et al. From ETS-Ⅶ to HTV: study of Japanese rendezvous and docking/berthing technologies[J]. Spacecraft Engineering, 2011, 20(4): 6-31. (in Chinese)
[44] S STAMM, P MOTAGHEDI. Orbital express capture system: concept to reality, 5419, 78-91(2004).
[45] ZHOU J P. A review of Tiangong-1/Shenzhou-8 rendezvous and docking mission[J]. Manned Spaceflight, 2012, 18(1): 1-5. (in Chinese). doi: 10.3969/j.issn.1674-5825.2012.01.001
[46] T DEBUS, S DOUGHERTY. Overview and performance of the front-end robotics enabling near-term demonstration (FREND) robotic arm, 1870(2009).
[47] CHEN L J, HAO J H, YUAN C Z, et al. Key technology analysis and enlightenment of Phoenix program[J]. Spacecraft Engineering, 2013, 22(5): 119-128. (in Chinese). doi: 10.3969/j.issn.1673-8748.2013.05.019
[48] B BISCHOF. Roger - robotic geostationary orbit restorer(2003).
[49] S BERND. Automation and robotics in the German space program-unmanned on-orbit servicing(OOS)& the TECSAS mission(2004).
[50] C KAISER, F SJÖBERG, J M DELCURA et al. SMART-OLEV-An orbital life extension vehicle for servicing commercial spacecrafts in GEO. Acta Astronautica, 63, 400-410(2008).
[51] S I NISHIDA, S KAWAMOTO, Y OKAWA et al. Space debris removal system using a small satellite. Acta Astronautica, 65, 95-102(2009).
[52] WANG H. Research on calibration technology of multi-spectral camera for Mars exploration[D]. Xi'an: University of Chinese Academy of Sciences, 2020: 2-18. (in Chinese)
[53] S ESTABLE, C PRUVOST, E FERREIRA et al. Capturing and deorbiting Envisat with an Airbus Spacetug. Results from the ESA e.Deorbit consolidation phase study. Journal of Space Safety Engineering, 7, 52-66(2020).
[54] HONG Y Z. Research on pose estimation for space non-cooperative targets based on monocular vision[D]. Chengdu: Institute of Optics and Electronics, Chinese Academy of Sciences, 2017: 2-7. (in Chinese)
[55] XU Y F, ZHANG D Z, WANG L, et al. A non-cooperative target attitude measurement method based on convolutional neural network[J]. Journal of Astronautics, 2020, 41(5): 560-568. (in Chinese). doi: 10.3873/j.issn.1000-1328.2020.05.006
[56] CHEN Y T. Research on satellite image on-board target recognition based on local invariant feature[D]. Beijing: Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, 2017: 15-35. (in Chinese)
[57] DONG J. Study on fast and reliable pattern match[D]. Changsha: National University of Defense Technology, 2015: 3-15. (in Chinese)
[58] ZHANG Z J. Research and practice of pose estimation in the vision-based navigation for UAV[D]. Zhengzhou: PLA Information Engineering University, 2017: 41-67. (in Chinese)
[59] WANG L, GU Y Y. An intelligent pose measurement method for spatial non-cooperative targets based on character modeling[J]. Aerospace Control and Application, 2019, 45(4): 19-24, 63. (in Chinese). doi: 10.3969/j.issn.1674-1579.2019.04.003
[60] K U SHARMA, N V THAKUR. A review and an approach for object detection in images. International Journal of Computational Vision and Robotics, 7, 196(2017).
[61] ZHANG D, LI G Z, WANG H G, et al. Overview of target pose estimation methods based on deep learning[J]. Aerodynamic Missile Journal, 2019(9): 72-76. (in Chinese)
[62] YU Z X. Review of target recognition[J]. China Plant Engineering, 2019(1): 94-97. (in Chinese). doi: 10.3969/j.issn.1671-0711.2019.01.046
[63] LIU Z M, MU J Z, ZHANG S, et al. Visual feature tracking and pose measurement for slow rotating failure satellites[J]. Acta Aeronautica et Astronautica Sinica, 2021, 42(1): 524163. (in Chinese)
[64] ZHONG K. Research on visual measurement technology of space target pose[D]. Hefei: University of Science and Technology of China, 2020: 2-16. (in Chinese)
[65] ZHOU Y Q. Research on vehicle compaction line detection method based on convolutional neural network YOLO[D]. Hangzhou: Zhejiang University of Science & Technology, 2020: 14-22. (in Chinese)
[66] OU P, LU K, ZHANG Z, et al. Target recognition and spatial location based on Mask RCNN[J]. Computer Measurement & Control, 2019, 27(6): 172-176. (in Chinese)
[67] WANG C Z. Research on key technologies of object detection based on region proposals[D]. Beijing: University of Chinese Academy of Sciences, 2020: 49-61. (in Chinese)
[68] L PASQUALETTO CASSINIS, R FONOD, E GILL. Review of the robustness and applicability of monocular pose estimation systems for relative navigation with an uncooperative spacecraft. Progress in Aerospace Sciences, 110, 100548(2019).
[69] J REDMON, S DIVVALA, R GIRSHICK et al. You only look once: unified, real-time object detection, 779-788(2016).
[70] W LIU, D ANGUELOV, D ERHAN et al. SSD: single shot multiBox detector, 21-27(2016).
[71] C Y FU, W LIU, A RANGA et al. DSSD: deconvolutional single shot detector(2017).
[72] J JEONG, H PARK, N KWAK. Enhancement of SSD by concatenating feature maps for object detection. arXiv preprint(2017).
[73] J REDMON, A FARHADI. YOLO9000: better, faster, stronger, 6517-6525(2017).
[74] J REDMON, A FARHADI. YOLOv3: an incremental improvement(2018).
[75] A BOCHKOVSKIY, C Y WANG, H LIAO. YOLOv4: Optimal speed and accuracy of object detection(2020).
[76] R GIRSHICK, J DONAHUE, T DARRELL et al. Rich feature hierarchies for accurate object detection and semantic segmentation, 580-587(2014).
[77] R GIRSHICK. Fast R-CNN, 1440-1448(2015).
[78] S Q REN, K M HE, R GIRSHICK et al. Faster R-CNN: towards real-time object detection with region proposal networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39, 1137-1149(2017).
[79] K M HE, G GKIOXARI, P DOLLÁR et al. Mask R-CNN, 2980-2988(2017).
[80] Z J HUANG, L C HUANG, Y C GONG et al. Mask scoring R-CNN, 6402-6411(2019).
[81] Y XIANG, T SCHMIDT, V NARAYANAN et al. PoseCNN: a convolutional neural network for 6D object pose estimation in cluttered scenes(2018).
[82] V LEPETIT. BB8: a scalable, accurate, robust to partial occlusion method for predicting the 3D poses of challenging objects without using depth, 3848-3856(2017).
[83] B TEKIN, S N SINHA. Real-time seamless single shot 6D object pose prediction, 292-301(2018).
[84] W KEHL, F MANHARDT, F TOMBARI et al. SSD-6D: making RGB-based 3D detection and 6D pose estimation great again, 1530-1538(2017).
[85] M CAI, T PHAM, I REID. Deep-6DPose: Recovering 6D object pose from a single RGB image(2018).
[86] C LI, J BAI, G D HAGER. A unified framework for multi-view multi-class object pose estimation(2018).
[87] K GUPTA, L PETERSSON, R HARTLEY. CullNet: calibrated and pose aware confidence scores for object pose estimation, 2758-2766(2019).
[88] C WANG, D F XU, Y K ZHU et al. DenseFusion: 6D object pose estimation by iterative dense fusion, 3338-3347(2019).
[89] S D PENG, Y LIU, Q X HUANG et al. PVNet: pixel-wise voting network for 6DoF pose estimation, 4556-4565(2019).
[90] Y S HE, W SUN, H B HUANG et al. PVN3D: a deep point-wise 3D keypoints voting network for 6DoF pose estimation, 11629-11638(2020).
[91] J N SONG, D RONDAO, N AOUF. Deep learning-based spacecraft relative navigation methods: a survey. Acta Astronautica, 191, 22-40(2022).
[92] S SHARMA, S D’AMICO. Neural network-based pose estimation for noncooperative spacecraft rendezvous. IEEE Transactions on Aerospace and Electronic Systems, 56, 4638-4658(2020).
[93] P F PROENÇA, Y GAO. Deep learning for spacecraft pose estimation from photorealistic rendering, 6007-6013(2020).
[94] O KECHAGIAS-STAMATIS, N AOUF, V DUBANCHET et al. DeepLO: Multi-projection deep LIDAR odometry for space orbital robotics rendezvous relative navigation. Acta Astronautica, 177, 270-285(2020).
[95] Y R HUO, Z LI, F ZHANG. Fast and accurate spacecraft pose estimation from single shot space imagery using box reliability and keypoints existence judgments. IEEE Access, 8, 216283-216297(2020).
Rui ZHOU, Yanfang LIU, Naiming QI, Jiayu SHE. Overview of visual pose estimation methods for space missions[J]. Optics and Precision Engineering, 2022, 30(20): 2538
Category: Information Sciences
Received: Jan. 17, 2022
Accepted: --
Published Online: Oct. 27, 2022
The Author Email: LIU Yanfang (yanfangliu@hit.edu.cn)