Significance Optical target simulators, also known as optical scene simulators, generate simulated targets and backgrounds that approximate the optical characteristics of real targets and backgrounds in a laboratory environment. They offer flexible and controllable simulation scenes and support the construction of extreme and edge-case test scenarios. These simulators are widely used in hardware-in-the-loop (HIL) simulations of optical guidance systems and in the performance testing of various optical imaging systems. In recent years, optical target simulators have also attracted attention in the field of autonomous driving, where they are used for performance testing of vehicle-mounted optical sensors and in HIL simulations of autonomous vehicles. As optical guidance technology develops, more dimensions of target optical characteristics are being exploited, driving optical target simulation technology toward the simulation of multi-dimensional optical scenes. This paper reviews the research progress of image-based optical target simulators, multispectral target simulators, and lidar target simulators, analyzing their working principles, technical specifications, core components, major research institutions, and the current state of research both domestically and internationally. The aim is to help readers quickly understand this field and grasp its technological development trends.
Progress First, this paper introduces image-based optical target simulators, which include infrared scene projectors, ultraviolet scene projectors, and visible light scene projectors. These simulators are evaluated against technical specifications such as spectral range, image resolution, temperature range, temperature resolution, non-uniformity, and frame rate, all of which directly affect their performance and application outcomes. Taking the infrared scene projector as an example, the paper discusses three mainstream infrared image generation devices: resistor arrays, digital micromirror devices (DMDs), and visible-to-infrared image conversion chips. Resistor arrays offer excellent performance and are the mainstream choice internationally, but they are expensive. DMDs are more affordable and widely used, but they suffer from diffraction effects that degrade image quality in the long-wave infrared band. Visible-to-infrared image conversion chips, developed by Beijing Institute of Technology, are a class of infrared image generation devices that have undergone three generations of upgrades, with the current versions supporting larger array sizes.
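To make the non-uniformity specification concrete, the following is a minimal sketch in Python, assuming non-uniformity is quantified as the ratio of the standard deviation to the mean of the measured pixel output while the projector displays a nominally uniform scene; this is a common convention, and the exact definitions adopted by specific simulators may differ.

```python
import numpy as np

def non_uniformity(frame: np.ndarray) -> float:
    """Estimate projector non-uniformity as std/mean of pixel output.

    `frame` is a 2-D array of measured apparent radiance (or apparent
    temperature) captured while the projector displays a uniform scene.
    """
    data = frame.astype(np.float64)
    return float(data.std() / data.mean())

# Illustrative example: a nominally uniform 512x512 frame at an apparent
# temperature of 300 K with ~2% fixed-pattern spread (synthetic data).
rng = np.random.default_rng(0)
frame = 300.0 + rng.normal(0.0, 6.0, size=(512, 512))
print(f"non-uniformity ≈ {non_uniformity(frame):.2%}")
```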
Next, the paper introduces ultraviolet and visible light scene projectors. Their basic architecture is similar to that of infrared scene projectors, with the primary differences lying in the choice of light sources and image generation devices. Ultraviolet light sources mainly include halogen lamps, xenon lamps, and deuterium lamps, whereas visible light sources predominantly use xenon lamps. Image generation devices for ultraviolet scene projectors primarily include DMDs, liquid-crystal-on-silicon (LCOS) devices, and liquid crystal spatial light modulators, whereas visible light scene projectors primarily use DMDs, LCOS devices, and TFT-LCDs.
Following this, the paper discusses multispectral scene projectors. Compared to image-based optical target simulators, multispectral optical scenes incorporate an additional spectral dimension. Consequently, the technical specifications expand to include spectral range and spectral resolution. A comparative analysis of three multispectral scene projector solutions—developed by Kent Optronics, the National Institute of Standards and Technology (NIST), and Beijing Institute of Technology—is presented, highlighting the differences among these multispectral projectors.
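As a rough illustration of how the added spectral dimension drives system complexity, the sketch below estimates the number of independent spectral channels a projector would need to cover a band at a given spectral resolution, assuming contiguous channels of equal width; actual multispectral projector designs, including those compared in the paper, may partition the spectrum differently.

```python
def spectral_channels(band_um: tuple[float, float], resolution_um: float) -> int:
    """Rough number of independent spectral channels needed to cover a
    band at a given spectral resolution (assumes contiguous,
    equal-width channels)."""
    lo, hi = band_um
    return round((hi - lo) / resolution_um)

# Example: covering the 3-5 µm mid-wave infrared band at 0.05 µm resolution
print(spectral_channels((3.0, 5.0), 0.05))  # -> 40 channels
```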
Finally, the paper introduces lidar scene projectors. Compared to image-based optical target simulators, laser 3D scenes incorporate an additional temporal dimension, so the technical specifications are extended to include distance simulation range, distance simulation resolution, distance simulation accuracy, spatial resolution, and depth of field, among others. Three representative technical solutions are described: flash lidar scene projectors, lidar scene projectors based on temporal downscaling and integral imaging technology, and scanning lidar scene projectors. The first solution faces challenges in achieving large array scales owing to limitations in circuit board design. The second addresses this bottleneck by using a small number of delay channels to generate large-array laser delay signals, overcoming the insufficient array sizes of traditional lidar scene projectors. The third, an emerging solution, offers a wider field of view than traditional approaches and enables in-situ testing; however, its technical challenges lie in tracking the emission beam of the lidar under test in real time and in designing wide-field optics for transmitting and receiving the laser echo signals.
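The temporal dimension of a lidar scene projector ultimately rests on the round-trip timing relation R = c·t/2. The following is a minimal sketch of that relation only, not a description of any of the three solutions above: it converts a simulated target range into the echo delay the projector must impose, and a delay-step size into the corresponding distance simulation resolution.

```python
C = 299_792_458.0  # speed of light, m/s

def echo_delay(range_m: float) -> float:
    """Round-trip delay a lidar scene projector must impose to simulate
    a target at `range_m` metres: t = 2R / c."""
    return 2.0 * range_m / C

def range_resolution(delay_step_s: float) -> float:
    """Smallest simulated-range increment for a given delay-step size:
    dR = c * dt / 2."""
    return C * delay_step_s / 2.0

# Example: a target at 1.5 km requires ~10 µs of echo delay; a 1 ns
# delay step corresponds to ~0.15 m of distance simulation resolution.
print(f"delay for 1500 m: {echo_delay(1500.0) * 1e6:.2f} µs")
print(f"range step for 1 ns: {range_resolution(1e-9):.3f} m")
```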
Conclusions and Prospects Optical scene projectors play a vital role in hardware-in-the-loop (HIL) simulations for optical guidance and autonomous driving, with growing applications in military, civilian, and research fields. This review highlights the principles, features, and applications of conventional, multispectral, and lidar scene projectors. Conventional projectors simulate scenes across multiple wavebands, while multispectral and lidar projectors add spectral and depth dimensions for advanced system testing. Future advancements will focus on integrating AI and multidimensional simulation to improve realism and adaptability.