Chinese Optics, Volume 16, Issue 3, 645 (2023)

Lane detection based on dual attention mechanism

Feng-lei REN1,2, Hai-bo ZHOU1,2,*, Lu YANG1,2, and Xin HE3
Author Affiliations
  • 1Tianjin Key Laboratory for Advanced Mechatronic System Design and Intelligent Control, School of Mechanical Engineering, Tianjin University of Technology, Tianjin 300384, China
  • 2National Demonstration Center for Experimental Mechanical and Electrical Engineering Education, Tianjin University of Technology, Tianjin 300384, China
  • 3Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China

    To improve the performance of lane detection in complex scenes such as occlusion by obstacles, we propose a multi-lane detection method based on a dual attention mechanism. First, we design a lane segmentation network with spatial and channel attention, which outputs a binary image separating lane pixels from the background. Then, we introduce HNet, which predicts a perspective transformation matrix used to warp the image into a bird's-eye view. Next, we fit curves to the lane pixels and project the fitted results back onto the original image. Finally, the region between the two lane lines closest to the image center is defined as the ego lane. The algorithm achieves 96.63% accuracy at a real-time speed of 134 FPS on the TuSimple dataset and 77.32% precision on the CULane dataset. The experiments show that the proposed method detects multiple lane lines under various scenarios, including occlusion by obstacles, and outperforms traditional lane detection algorithms.
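    As a rough illustration of the dual attention idea, the sketch below shows a minimal PyTorch channel-plus-spatial attention block of the kind the abstract describes. The layer structure, reduction ratio, and the channel-then-spatial ordering are assumptions made for illustration, not the authors' exact network design.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed structure)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # re-weight feature channels

class SpatialAttention(nn.Module):
    """Spatial attention over pooled channel statistics (assumed structure)."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)    # average over channels
        mx, _ = x.max(dim=1, keepdim=True)   # max over channels
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w  # re-weight spatial positions

class DualAttention(nn.Module):
    """Channel attention followed by spatial attention (ordering is an assumption)."""
    def __init__(self, channels):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))

if __name__ == "__main__":
    feat = torch.randn(1, 64, 32, 64)          # dummy encoder feature map
    print(DualAttention(64)(feat).shape)       # torch.Size([1, 64, 32, 64])
```

    In a segmentation network of this kind, such a block would typically be inserted after an encoder stage so that lane-relevant channels and spatial positions are emphasized before decoding to the binary lane mask.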

    Feng-lei REN, Hai-bo ZHOU, Lu YANG, Xin HE. Lane detection based on dual attention mechanism[J]. Chinese Optics, 2023, 16(3): 645

    Paper Information

    Category: Original Article

    Received: Mar. 4, 2022

    Accepted: --

    Published Online: May 31, 2023

    DOI: 10.37188/CO.2022-0033
