Objective
With the rapid development of complex space missions such as deep-space exploration, on-orbit servicing, and space debris removal, pose measurement technology for space targets has become a core enabler of autonomous spacecraft operations and mission safety. Accurate pose measurement is directly critical to the successful execution of such missions. As a non-contact method, visual measurement has been widely adopted for its low cost, simple installation, and high precision. However, a single visual sensor is significantly limited by factors such as illumination and measurement-range constraints. To raise mission success rates, multi-sensor joint measurement has become the mainstream solution. Combining stereo vision with a laser rangefinder lets the two sensors complement each other, effectively addressing issues such as low initial ranging accuracy, high algorithmic redundancy, and limited image information, while retaining the benefits of non-contact operation, high speed, and easy automation; the combination therefore shows broad application prospects in the aerospace field. Calibration between sensors is a prerequisite for high-precision system measurement: to guarantee the accuracy of the entire system, the relative installation matrix between the sensors must be determined accurately, which necessitates high-precision joint calibration of the measurement system.
Methods
A joint calibration method is proposed for a multi-modal measurement system composed of stereo vision cameras and a laser rangefinder. The method takes the optical center of the left camera as the origin of the system coordinate frame and uses ZHANG's calibration method to determine the intrinsic and extrinsic parameters of the stereo cameras. Line constraints formed by sequential laser spots are then employed to compute the extrinsic parameters of the laser emission port in the left-camera coordinate frame. By jointly enforcing the reprojection error of the centers of a high-precision circular target and the reprojection error constraints of the laser spots, the extrinsic parameters are optimized to achieve high-precision system calibration (Fig.6). To address issues in existing methods, such as inaccurate localization of the geometric center of laser spots and unstable calibration results caused by poor imaging quality (Fig.3), a sub-pixel edge positioning method based on local effects is proposed. The algorithm offers significant advantages in time and space complexity and in robustness while maintaining extraction accuracy.
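The line constraint on sequential laser spots can be illustrated with a minimal numpy sketch (not the paper's implementation; the function name and data are hypothetical): laser spots triangulated by the stereo pair at different target distances should all lie on the beam axis, so fitting a 3D line to them recovers an origin and direction for the laser emission port in the left-camera frame.

```python
import numpy as np

def fit_laser_axis(points):
    """Fit a 3D line (point + unit direction) to triangulated laser-spot
    positions via principal component analysis. Spots recorded while the
    target moves along the beam should all lie on the laser axis."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The dominant right-singular vector of the centered cloud is the
    # least-squares direction of the line.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    if direction[2] < 0:          # orient the axis away from the camera
        direction = -direction
    return centroid, direction
```

In practice the returned point and direction would then serve as the initial estimate of the laser extrinsics before the joint refinement.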
Results and Discussions
The proposed sub-pixel edge positioning method based on local effects analyzes the gray-level gradient distribution along the local edge of the laser spot and computes an edge vector, which effectively suppresses noise and reduces the computational load while preserving accuracy (Tab.2). The joint calibration employs multiple geometric constraints, avoiding the error accumulation of traditional global optimization methods, reducing the impact of single-point outliers on the overall result, and improving the system's robustness (Tab.3, Tab.4).
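One way such edge-gradient geometry can yield a sub-pixel spot center is sketched below (an illustrative stand-in for the paper's algorithm, not a reproduction of it): for a roughly circular spot, the gray-level gradient at each edge pixel points radially, so the center is the least-squares intersection of the gradient lines.

```python
import numpy as np

def spot_center_from_gradients(img):
    """Estimate a laser-spot center at sub-pixel accuracy from edge
    gradients: each strong-gradient pixel defines a line through itself
    along its gradient direction; for a circular spot those lines meet
    at the center, found here by linear least squares."""
    gy, gx = np.gradient(img.astype(float))   # d/dy (rows), d/dx (cols)
    w = gx**2 + gy**2
    mask = w > 0.1 * w.max()                  # keep strong edge pixels only
    ys, xs = np.nonzero(mask)
    n = np.stack([gx[mask], gy[mask]], axis=1)
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    p = np.stack([xs, ys], axis=1).astype(float)
    # Minimize sum_i || (I - n_i n_i^T)(c - p_i) ||^2 over the center c.
    A = np.zeros((2, 2)); b = np.zeros(2)
    for ni, pi in zip(n, p):
        P = np.eye(2) - np.outer(ni, ni)
        A += P; b += P @ pi
    return np.linalg.solve(A, b)              # (x, y) sub-pixel center
```

Because only strong-gradient edge pixels enter the solve, weak interior and background pixels (and much of the noise) are excluded, which mirrors the noise-suppression and low-cost properties claimed for the local-effect method.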
Conclusions
A sub-pixel edge positioning method based on local effects was proposed; it analyzes the gray-level gradient distribution in the edge region of the laser spot to reduce the computational load of center extraction while maintaining accuracy. By exploiting the multiple geometric constraints formed by the reprojection error of the circular-target centers and the distance error between sequential laser spots, the method minimizes the impact of outliers on the overall result and achieves high-precision joint calibration of the system. Compared with traditional methods, the laser-spot center extraction accuracy of this approach is improved by 30%. The calibration results are robust, stable, and reliable, providing solid technical support for applications such as pose measurement of space targets.
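The reprojection-error constraint that drives the extrinsic refinement can be sketched as a minimal pinhole-model residual in numpy (illustrative only; the paper's cost additionally includes the inter-spot distance term, and all names here are hypothetical). Such a residual vector would typically be handed to a nonlinear least-squares solver.

```python
import numpy as np

def rodrigues(rvec):
    """Axis-angle vector -> 3x3 rotation matrix (Rodrigues formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def reprojection_residuals(params, K_cam, pts3d, pts2d):
    """Residual vector for extrinsic refinement. params = (rvec, tvec)
    of the target in the left-camera frame; residuals are the pixel
    differences between observed and projected feature centers."""
    R = rodrigues(params[:3])
    t = params[3:6]
    Xc = pts3d @ R.T + t                 # target points in camera frame
    uv = Xc @ K_cam.T
    uv = uv[:, :2] / uv[:, 2:3]          # perspective division
    return (uv - pts2d).ravel()
```

The residual is zero at the true pose and grows with any perturbation, which is exactly the property a least-squares refinement of the installation matrix relies on.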