Laser & Optoelectronics Progress, Volume 56, Issue 2, 023301 (2019)

Two-Eye Gaze Tracking Based on Pupil Shape in Space

Xiangjun Wang1,2, Haoyue Bai1,2,*, and Yubo Ni1,2
Author Affiliations
  • 1 State Key Laboratory of Precision Measuring Technology and Instruments, Tianjin University, Tianjin 300072, China
  • 2 MOEMS Education Ministry Key Laboratory, Tianjin University, Tianjin 300072, China

    Abstract

    Aiming at the immaturity of eye gaze tracking as a human-machine interaction technique, a tabletop two-eye gaze tracking method based on the pupil shape in the space of stereo vision is proposed. The pupil center is located preliminarily from the low-gray-value distribution. The radial-derivative polar diagram of the pupil area is used to extract the pupil edge point coordinates, and random sample consensus (RANSAC) is used to fit the pupil edge with a suitable ellipse. The two-eye pupil edge points are matched with the ORB (oriented FAST and rotated BRIEF) algorithm, and their coordinates in space are obtained from the two-eye stereo vision model. The least squares method is finally adopted to calculate the pupil shape in space, from which the gaze direction is obtained. The experimental results show that the pupil center positioning speed is 300 frame/s, the two-eye gaze tracking speed is 15 frame/s, and the maximum gaze tracking error is 2.6°. It is verified that the proposed method has good accuracy, robustness, and real-time performance, and it can be used in the field of human-machine interaction.
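    As an illustration of the RANSAC ellipse-fitting step mentioned in the abstract, the following is a minimal sketch (not the authors' implementation) that fits a general conic to candidate pupil edge points with a RANSAC loop, assuming only NumPy. Function names, the sample size, and the residual threshold are illustrative choices, and the demo uses synthetic edge points rather than real pupil data.

    ```python
    import numpy as np

    def fit_conic(pts):
        # Least-squares fit of a general conic a*x^2 + b*xy + c*y^2 + d*x + e*y + f = 0:
        # the coefficient vector is the null-space direction of the design matrix.
        x, y = pts[:, 0], pts[:, 1]
        D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
        _, _, Vt = np.linalg.svd(D)
        return Vt[-1]  # unit-norm coefficient vector (a, b, c, d, e, f)

    def algebraic_residual(coeffs, pts):
        # Absolute algebraic distance of each point from the conic.
        x, y = pts[:, 0], pts[:, 1]
        D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
        return np.abs(D @ coeffs)

    def ransac_ellipse(pts, n_iter=200, thresh=1e-2, rng=None):
        # RANSAC loop: fit conics to random 6-point samples, keep the fit
        # with the most inliers, then refit on all inliers.
        rng = np.random.default_rng(rng)
        best_inliers = np.zeros(len(pts), dtype=bool)
        for _ in range(n_iter):
            sample = pts[rng.choice(len(pts), 6, replace=False)]
            coeffs = fit_conic(sample)
            coeffs = coeffs / np.linalg.norm(coeffs)  # defensive; SVD rows are unit-norm
            inliers = algebraic_residual(coeffs, pts) < thresh
            if inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        return fit_conic(pts[best_inliers]), best_inliers

    # Demo on synthetic data: 100 points on an ellipse plus 5 gross outliers,
    # standing in for pupil edge candidates contaminated by eyelid/glint points.
    t = np.linspace(0.0, 2 * np.pi, 100, endpoint=False)
    edge_pts = np.column_stack([5 + 2 * np.cos(t), 3 + np.sin(t)])
    outliers = np.array([[20.0, 20.0], [-8.0, 15.0], [12.0, -9.0], [30.0, 1.0], [0.0, 25.0]])
    pts = np.vstack([edge_pts, outliers])
    coeffs, inliers = ransac_ellipse(pts, rng=0)
    ```

    In the method described above, the same idea would be applied independently to each eye's edge candidates before ORB matching and stereo reconstruction.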


    Xiangjun Wang, Haoyue Bai, Yubo Ni. Two-Eye Gaze Tracking Based on Pupil Shape in Space[J]. Laser & Optoelectronics Progress, 2019, 56(2): 023301

    Paper Information

    Category: Vision, Color, and Visual Optics

    Received: Jun. 29, 2018

    Accepted: Aug. 8, 2018

    Published Online: Aug. 1, 2019

    The Author Email: Bai Haoyue (shun344@qq.com)

    DOI: 10.3788/LOP56.023301
