NUCLEAR TECHNIQUES, Vol. 47, Issue 10, 100403 (2024)
Research on distributed data acquisition software for high frame rate area detectors
The high frame rate area detector is the core detector for the major imaging-based experimental stations at the Shanghai HIgh repetitioN rate XFEL and Extreme light facility (SHINE), and its data throughput is expected to exceed 20 GB·s⁻¹. Traditional single-machine systems struggle to receive and process tens of GB·s⁻¹ of raw data in real time.
This study aims to propose a multi-node distributed data acquisition and processing software architecture for the high frame rate area detector at the imaging-based experimental stations of SHINE.
First, the performance of different network libraries was investigated, and synchronous transmission combined with CPU thread binding was found to give the best single-thread data receiving performance. Then, a parallel event building method was introduced that dispatches and combines data from different detector modules across multiple nodes based on Bunch ID. Furthermore, data calibration and the bitshuffle/LZ4 compression algorithm were implemented and tested.
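The abstract does not give implementation details, so the following is only a minimal Python sketch of Bunch ID-based event building under assumed data structures; the fragment format, module count, and class/function names are hypothetical and not the authors' implementation. The idea is that a building node buffers per-module fragments keyed by Bunch ID and emits a complete event once fragments from all modules have arrived.

```python
from collections import defaultdict

N_MODULES = 4  # assumed number of detector modules feeding one building node


class EventBuilder:
    """Combine per-module data fragments into full events keyed by Bunch ID."""

    def __init__(self, n_modules=N_MODULES):
        self.n_modules = n_modules
        self.pending = defaultdict(dict)  # bunch_id -> {module_id: payload}

    def add_fragment(self, bunch_id, module_id, payload):
        """Buffer one module fragment; return the complete event once all
        modules for this Bunch ID have arrived, otherwise None."""
        self.pending[bunch_id][module_id] = payload
        if len(self.pending[bunch_id]) == self.n_modules:
            return self.pending.pop(bunch_id)  # complete event for this bunch
        return None


# Usage example with dummy fragments (payloads would be raw frame data).
builder = EventBuilder()
for module_id in range(N_MODULES):
    event = builder.add_fragment(bunch_id=1001, module_id=module_id,
                                 payload=b"\x00" * 16)
print(sorted(event.keys()))  # [0, 1, 2, 3]: all modules combined for bunch 1001
```

In a distributed setup, each node would run such a builder for the subset of Bunch IDs dispatched to it, which is what allows the event building rate to scale with the number of nodes.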
Test results show that a single-thread data receiving rate of nearly 3 GB·s⁻¹ is achieved, a parallel event building rate of approximately 23.5 GB·s⁻¹ is reached with 4 server nodes, and the realized compression ratio is about 5.7.
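As an illustration of how such a compression ratio can be measured, the sketch below applies the bitshuffle/LZ4 codec from the open-source bitshuffle Python package to a synthetic 16-bit frame; the frame shape and contents are made up here, so the resulting ratio will differ from the roughly 5.7 reported for real detector data.

```python
import numpy as np
import bitshuffle  # pip install bitshuffle

# Synthetic 16-bit frame standing in for one area-detector image (assumed shape).
rng = np.random.default_rng(0)
frame = rng.poisson(lam=3.0, size=(1024, 1024)).astype(np.uint16)

# Bitshuffle rearranges bits before LZ4 so that mostly-zero high bits compress well.
compressed = bitshuffle.compress_lz4(frame)
ratio = frame.nbytes / compressed.nbytes
print(f"compression ratio: {ratio:.2f}")

# Lossless round trip back to the original frame.
restored = bitshuffle.decompress_lz4(compressed, frame.shape, frame.dtype)
assert np.array_equal(frame, restored)
```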
The feasibility of the multi-node distributed parallel data acquisition method for the high frame rate area detector is verified in this study, providing a foundation for the subsequent development of high-throughput data acquisition software for area detectors.
Kunlin SHANG, Zhengheng LI, Xudong JU, Yue ZHOU, Ping HUAI, Zhi LIU. Research on distributed data acquisition software for high frame rate area detectors[J]. NUCLEAR TECHNIQUES, 2024, 47(10): 100403
Category: NUCLEAR ELECTRONICS AND INSTRUMENTATION
Received: Dec. 30, 2023
Accepted: --
Published Online: Dec. 13, 2024
The Author Email: LI Zhengheng