Laser & Optoelectronics Progress, Volume 57, Issue 22, 221001 (2020)

Tracking of Human Respiratory Motion Based on Kanade-Lucas-Tomasi Algorithm

Xinyu Liu1, Zheng Yan1, Fang Duan1,*, and Zhongying Dai2,*
Author Affiliations
  • 1School of Information Science and Engineering, Huaqiao University, Xiamen, Fujian 361021, China
  • 2Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou, Gansu 730000, China

    In this study, an object detection and tracking method based on the Kanade-Lucas-Tomasi (KLT) algorithm is proposed and applied to tracking human respiratory motion during radiotherapy. In the experiment, a human body model with adjustable motion parameters is used to simulate different breathing states, and the image information of the motion process is collected by a camera. After a series of image preprocessing steps, such as edge detection and edge enhancement, is performed on the collected images, the region of interest is manually marked in the first frame, and the tracking algorithm then automatically tracks this region in the remaining frames. Experimental results verify that the proposed algorithm accurately tracks the human body surface in real time under different breathing states, with a normalized displacement error of less than 0.03. The algorithm can be applied to clinical respiratory motion detection, and the obtained image information and parameters can be used to guide precise radiotherapy.
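    The abstract describes a pipeline of marking a region of interest in the first frame and tracking it with the KLT algorithm in subsequent frames. The following is a minimal sketch of that idea using OpenCV's Shi-Tomasi corner detector and pyramidal Lucas-Kanade optical flow; the video file name, ROI coordinates, and all parameter values are illustrative assumptions, not the authors' settings or code.

    ```python
    # Minimal KLT-style ROI tracking sketch (hypothetical file and parameters).
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("breathing.avi")          # hypothetical input video
    ok, first = cap.read()
    prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)

    # Manually marked region of interest in the first frame (x, y, w, h).
    x, y, w, h = 200, 150, 120, 80                   # placeholder coordinates
    roi_mask = np.zeros_like(prev_gray)
    roi_mask[y:y + h, x:x + w] = 255

    # Shi-Tomasi ("good features to track") corners inside the ROI.
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                       qualityLevel=0.01, minDistance=5,
                                       mask=roi_mask)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Pyramidal Lucas-Kanade optical flow tracks the ROI points frame to frame.
        next_pts, status, err = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, prev_pts, None,
            winSize=(21, 21), maxLevel=3,
            criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

        good = status.ravel() == 1
        displacement = np.mean(next_pts[good] - prev_pts[good], axis=0)
        print("mean ROI displacement (px):", displacement.ravel())

        prev_gray, prev_pts = gray, next_pts[good].reshape(-1, 1, 2)

    cap.release()
    ```

    The mean displacement of the tracked points serves here as a stand-in for the body-surface motion signal; the paper additionally applies edge detection and edge enhancement before tracking, which are omitted in this sketch.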

    Citation:
    Xinyu Liu, Zheng Yan, Fang Duan, Zhongying Dai. Tracking of Human Respiratory Motion Based on Kanade-Lucas-Tomasi Algorithm[J]. Laser & Optoelectronics Progress, 2020, 57(22): 221001

    Paper Information

    Category: Image Processing

    Received: Feb. 5, 2020

    Accepted: Mar. 27, 2020

    Published Online: Nov. 3, 2020

    The Author Email: Duan Fang (nkfetsh@gmail.com), Dai Zhongying (nkfetsh@gmail.com)

    DOI:10.3788/LOP57.221001
