Opto-Electronic Engineering, Volume. 51, Issue 11, 240208-1(2024)

Dual view fusion detection method for event camera detection of unmanned aerial vehicles

Miao Li, Nuo Chen, Wei An*, Boyang Li, Qiang Ling and Weixing Li
Author Affiliations
  • College of Electronic Science and Technology, National University of Defense Technology, Changsha, Hunan 410073, China

    With the widespread deployment of low-altitude drones, real-time detection of these slow, small targets is crucial for public safety. Traditional cameras capture image frames with a fixed exposure time, which makes it difficult to adapt to changing lighting conditions and leaves detection blind spots in scenes with intense light. Event cameras, a new type of neuromorphic sensor, sense brightness changes pixel by pixel and can still generate high-frequency sparse event data under complex lighting conditions. To address the difficulty of adapting image-based detection methods to the sparse, irregular data produced by event cameras, this paper models the two-dimensional object detection task as a semantic segmentation task on a three-dimensional spatiotemporal point cloud and proposes a drone segmentation model based on dual-view fusion. On a drone detection dataset collected with an event camera, experimental results show that the proposed method achieves the best detection performance while maintaining real-time operation, enabling stable detection of drone targets.
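The abstract describes casting 2D detection as segmentation of a 3D spatiotemporal point cloud built from event data. The paper's actual preprocessing is not given on this page; the sketch below only illustrates the general idea under stated assumptions — events arrive as (x, y, t, polarity) tuples, and the function name, sensor resolution, and window length are all hypothetical choices for illustration.

```python
import numpy as np

def events_to_point_cloud(events, width, height, t_window):
    """Illustrative conversion of raw events into a normalized 3D
    spatiotemporal point cloud for point-based segmentation.

    events: (N, 4) array of [x, y, t, p] — pixel coordinates, a
    timestamp, and a polarity flag. Not the paper's exact pipeline.
    """
    events = np.asarray(events, dtype=np.float64)
    x, y, t, p = events[:, 0], events[:, 1], events[:, 2], events[:, 3]
    # Normalize spatial coordinates to [0, 1] using the sensor resolution.
    xn = x / (width - 1)
    yn = y / (height - 1)
    # Normalize time within the current window so the temporal axis is
    # commensurate with the spatial axes.
    tn = (t - t.min()) / t_window
    # Stack into an (N, 3) point cloud; keep polarity as a per-point feature.
    points = np.stack([xn, yn, tn], axis=1)
    return points, p
```

A segmentation network operating on such a cloud would then label each point as drone or background, and a 2D box can be recovered from the spatial extent of the drone-labeled points.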


    Citation

    Miao Li, Nuo Chen, Wei An, Boyang Li, Qiang Ling, Weixing Li. Dual view fusion detection method for event camera detection of unmanned aerial vehicles[J]. Opto-Electronic Engineering, 2024, 51(11): 240208-1

    Paper Information

    Category: Article

    Received: Sep. 2, 2024

    Accepted: Oct. 12, 2024

    Published Online: Jan. 24, 2025

    The Author Email: An Wei (安玮)

    DOI:10.12086/oee.2024.240208
