Chinese Journal of Lasers, Volume 49, Issue 17, 1704003 (2022)
Binocular Vision Position and Attitude Measurement for Key Features of Non-Cooperative Spacecraft
Owing to intensive space launch activity, the operation of spacecraft in orbit is seriously threatened, and on-orbit servicing has gradually become a hot spot in space research; major aerospace nations currently have an increasing demand for such services. On-orbit services include tasks such as rendezvous and docking, equipment maintenance, and refueling. Non-cooperative spacecraft are artificial satellites that were not designed with features to support servicing in the space environment. The key to tackling these technical difficulties is to continuously obtain the relative position and attitude of the spacecraft so as to track and capture the non-cooperative target. Because the target cannot provide any measurement information, optical measurement, which obtains relative position and attitude information without contacting the target, is the main method for measuring non-cooperative spacecraft. To address technical difficulties such as poor real-time performance in the measurement of non-cooperative space targets, this paper proposes a binocular position and attitude measurement method based on key features of the non-cooperative target.
This paper proposes a binocular position and attitude measurement method based on key features of a non-cooperative spacecraft. First, two visible-light cameras are used to obtain high-resolution target images, and the satellite-rocket docking ring on the target surface is extracted with an improved arc-support line segment method. Optical-flow-assisted tracking combined with dual thresholds is used to determine whether the docking-ring ellipse has been detected incorrectly. Second, a line detection method based on short line features is proposed to extract the reticles of the heat shield. Finally, the target coordinate system is established from the related features of the docking ring and the reticles, and the relative position and attitude are calculated in the resulting coordinate system. To keep the target coordinate system consistent while tracking the spacecraft, a method of switching the intersection point between the reticle line and the docking ring is proposed. The ultimate goal of this paper is to realize the measurement of non-cooperative spacecraft on an embedded hardware platform.
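As an illustration of the frame-construction and pose-extraction step described above (a minimal sketch, not the authors' implementation), the following Python/NumPy code assumes that the binocular system has already triangulated a set of 3D docking-ring points and one reticle-ring intersection point in the camera frame; it fits the ring plane to obtain the Z axis, aims the X axis at the intersection point, and reads the relative position and attitude from the resulting target frame. The input names (`ring_points_cam`, `reticle_pt_cam`) and the Z-Y-X Euler convention are assumptions made for the sketch.

```python
import numpy as np

def build_target_frame(ring_points_cam, reticle_pt_cam):
    """Construct the target coordinate system from triangulated 3D points
    (camera frame): sampled docking-ring points and one reticle/ring
    intersection point.  Returns (R_cam_target, t_cam_target)."""
    P = np.asarray(ring_points_cam, dtype=float)   # N x 3 ring samples
    center = P.mean(axis=0)                        # ring centre = frame origin

    # Plane fit by SVD: the ring normal is the direction of least variance.
    _, _, vt = np.linalg.svd(P - center)
    z_axis = vt[-1]
    if z_axis[2] > 0:                              # keep the normal facing the camera
        z_axis = -z_axis

    # X axis: direction to the reticle intersection, projected into the ring plane.
    v = np.asarray(reticle_pt_cam, dtype=float) - center
    x_axis = v - np.dot(v, z_axis) * z_axis
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(z_axis, x_axis)

    R = np.column_stack((x_axis, y_axis, z_axis))  # camera <- target rotation
    return R, center

def rotation_to_euler_zyx(R):
    """Rotation matrix -> (yaw, pitch, roll) in degrees, Z-Y-X convention."""
    pitch = np.arcsin(-R[2, 0])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return np.degrees([yaw, pitch, roll])

# Synthetic example: a ring of radius 0.5 m located 2 m in front of the camera.
theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
ring = np.column_stack((0.5 * np.cos(theta), 0.5 * np.sin(theta), np.full_like(theta, 2.0)))
R_ct, t_ct = build_target_frame(ring, reticle_pt_cam=[0.5, 0.0, 2.0])
print("relative position (m):", t_ct)
print("relative attitude (deg):", rotation_to_euler_zyx(R_ct))
```

The improved arc-support line-segment ellipse detector and the dual-threshold optical-flow check themselves are not reproduced here; the sketch only shows how a target frame and relative pose can be read off once the key features have been triangulated.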
The ground verification system in this paper mainly includes a binocular vision camera, a binocular vision processing platform, a space robot controller, a non-cooperative target satellite model, and a solar simulation light source. The result images of candidate ellipses formed from arc-segment combinations show that the number of candidate ellipses generated by the improved docking-ring method is significantly reduced (Fig. 2), which shortens the ellipse verification step and increases the calculation speed by 74.5% (Table 1). The method for extracting the reticles of the heat shield improves the calculation speed by 42% compared with the Hough transform, shortening the running time from 0.558 s to 0.3237 s. Because the shared memory between DSP6678 cores is limited, a complete memory management and allocation mechanism is established when porting the binocular vision algorithm. For 2048×2048 pixel input images, the data update rate of the algorithm reaches 1 Hz. The approaching experiment of the binocular camera system at ultra-close range and the rotation experiment of the non-cooperative spacecraft model are conducted in a darkroom environment. In the approaching experiment, the standard deviation of the three-axis relative position between neighbouring image frames is 3σ = (0.058 cm, 0.015 cm, 0.017 cm), and that of the three-axis relative attitude is 3σ = (0.680°, 0.116°, 0.101°) (Fig. 7). The average measured moving distance is 2.003 cm, which differs by only 0.03 mm from the distance commanded to the robotic arm (Fig. 7). In the experiments in which the target model rotates around the X, Y, and Z axes, the position precision is better than 0.7 cm and the attitude precision is better than 0.9° (Tables 2-4). The rates of change of position and angle about the three axes are consistent with the actual motion speeds, which meets the requirements of space missions.
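As a minimal sketch of how inter-frame 3σ statistics like those reported for the approaching experiment could be computed from a logged pose sequence (the file name, column layout, and the reading of "relative" as frame-to-frame differences are assumptions, not taken from the paper):

```python
import numpy as np

# Hypothetical pose log: one row per frame,
# columns = x, y, z (cm) and roll, pitch, yaw (deg).
poses = np.loadtxt("pose_log.txt")

diffs = np.diff(poses, axis=0)                # frame-to-frame changes
sigma3 = 3.0 * np.std(diffs, axis=0, ddof=1)  # 3-sigma of the inter-frame variation

print("3-sigma position jitter (cm): ", sigma3[:3])
print("3-sigma attitude jitter (deg):", sigma3[3:])
print("mean moving step (cm):        ", np.linalg.norm(diffs[:, :3], axis=1).mean())
```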
Focusing on the difficult problem of measuring the position and attitude of an unstable non-cooperative spacecraft, this paper proposes a binocular vision position and attitude measurement method based on key features of the non-cooperative spacecraft. According to the characteristics of the non-cooperative spacecraft, the improved arc-support line segment method is used to extract the ellipse of the satellite-rocket docking ring, and the reticle features of the heat shield are extracted through statistical analysis of the main direction and similarity. The method offers high measurement accuracy and good real-time performance. In the approaching experiment of the binocular camera system, the relative error of the three axes is less than 0.6 mm and the relative angle error is less than 0.7°. In the target model rotation experiment, the relative error of the three axes is less than 6.2 mm and the relative angle error is less than 0.9°. The experimental results demonstrate that this method can output the position and attitude of the target spacecraft in real time during ultra-close-range servicing. Further work is needed to optimize the binocular vision algorithm, carry out comprehensive ground verification, and improve the adaptability and stability of the system.
Liang Hu, Huixian Duan, Haodong Pei, Dianqi Sun, An Shu. Binocular Vision Position and Attitude Measurement for Key Features of Non-Cooperative Spacecraft[J]. Chinese Journal of Lasers, 2022, 49(17): 1704003
Category: Measurement and metrology
Received: Dec. 1, 2021
Accepted: Dec. 27, 2021
Published Online: Aug. 9, 2022
Author Emails: Hu Liang (ncuhubery@163.com), Pei Haodong (peihaodong@sina.com)