Laser & Optoelectronics Progress, Volume 56, Issue 2, 023301 (2019)

Two-Eye Gaze Tracking Based on Pupil Shape in Space

Xiangjun Wang1,2, Haoyue Bai1,2,*, and Yubo Ni1,2
Author Affiliations
  • 1 State Key Laboratory of Precision Measuring Technology and Instruments, Tianjin University, Tianjin 300072, China
  • 2 MOEMS Education Ministry Key Laboratory, Tianjin University, Tianjin 300072, China
    Figures & Tables (8)
    Top view of two-eye stereo vision system for gaze tracking
    Positioning of pupil center. (a) (b) Without occlusion; (c) (d) with occlusion
    Detection of pupil edge. (a) Eye area; (b) pupil area; (c) polar diagram of pupil; (d) radial derivative of pupil; (e) detection result (a code sketch of this step follows the caption list)
    Detection of pupil edge. (a)(b) With near-infrared lighting; (c)(d) without near-infrared lighting
    Feature matching of pupil edge. (a) Left eye; (b) right eye
    Test of gaze points
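
    The pupil-edge detection captioned above (a polar diagram of the pupil followed by a radial derivative) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name, file name, threshold value, sampling resolution, and the nearest-neighbour polar sampling are all choices made for the example.

      import numpy as np
      import cv2  # used here only to read the image and estimate a rough pupil center

      def pupil_edge_polar(eye_gray, center, max_radius=60, n_angles=360, n_radii=60):
          """Find pupil edge points as the largest radial intensity jump per angle."""
          cx, cy = center
          angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
          radii = np.linspace(1.0, max_radius, n_radii)

          # Sample gray values along each radial ray (nearest neighbour), giving a
          # polar image of shape (n_angles, n_radii), like the "polar diagram" panel.
          xs = np.clip((cx + np.outer(np.cos(angles), radii)).astype(int), 0, eye_gray.shape[1] - 1)
          ys = np.clip((cy + np.outer(np.sin(angles), radii)).astype(int), 0, eye_gray.shape[0] - 1)
          polar = eye_gray[ys, xs].astype(np.float32)

          # Radial derivative: the dark pupil meets the brighter iris where the
          # intensity rises fastest along each ray.
          d = np.diff(polar, axis=1)
          edge_idx = np.argmax(d, axis=1) + 1
          edge_r = radii[edge_idx]

          edge_x = cx + edge_r * np.cos(angles)
          edge_y = cy + edge_r * np.sin(angles)
          return np.stack([edge_x, edge_y], axis=1)  # one (x, y) edge point per angle

      # Hypothetical usage: rough pupil center from the centroid of the darkest blob.
      eye = cv2.imread("eye_left.png", cv2.IMREAD_GRAYSCALE)
      _, dark = cv2.threshold(eye, 50, 255, cv2.THRESH_BINARY_INV)  # threshold assumed
      m = cv2.moments(dark)
      center = (m["m10"] / m["m00"], m["m01"] / m["m00"])
      edge_points = pupil_edge_polar(eye, center)

    Edge points recovered this way for the left and right cameras of the stereo system could then be matched and triangulated, which is what the feature-matching caption above refers to.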
    • Table 1. Detection errors of gaze points for left eye (unit: pixel)

      Coordinate    Col 1    Col 2    Col 3    Col 4    Col 5    Col 6
      Row 1           4.3      5.0      6.2      9.4     11.2     13.1
      Row 2           2.9      4.5      5.8      8.5     10.8     11.7
      Row 3           3.2      4.3      5.5      8.7     11.5     12.2
      Row 4           4.7      5.7      5.6     10.0     12.4     14.1
    • Table 2. Detection errors of gaze points for right eye (unit: pixel)

      Coordinate    Col 1    Col 2    Col 3    Col 4    Col 5    Col 6
      Row 1          12.1     13.6      9.4      8.3      6.0      5.0
      Row 2          11.7     11.8      8.5      6.8      5.5      4.3
      Row 3          11.2     12.5      8.7      6.7      4.3      4.1
      Row 4          15.1     14.4     10.2      9.9      6.8      6.7
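
    Reading the two tables, the left-eye errors grow toward the right-most columns of gaze points while the right-eye errors grow toward the left-most ones, with overall means of roughly 8.0 pixel and 8.9 pixel. These summary figures are plain arithmetic over the tabulated values, as in the short script below; the array layout simply mirrors the 4 x 6 grid of tested gaze points.

      import numpy as np

      # Detection errors in pixel, copied from Table 1 (left eye) and Table 2 (right eye).
      left = np.array([
          [4.3, 5.0, 6.2,  9.4, 11.2, 13.1],
          [2.9, 4.5, 5.8,  8.5, 10.8, 11.7],
          [3.2, 4.3, 5.5,  8.7, 11.5, 12.2],
          [4.7, 5.7, 5.6, 10.0, 12.4, 14.1],
      ])
      right = np.array([
          [12.1, 13.6,  9.4, 8.3, 6.0, 5.0],
          [11.7, 11.8,  8.5, 6.8, 5.5, 4.3],
          [11.2, 12.5,  8.7, 6.7, 4.3, 4.1],
          [15.1, 14.4, 10.2, 9.9, 6.8, 6.7],
      ])

      for name, errs in (("left eye", left), ("right eye", right)):
          print(f"{name}: mean {errs.mean():.1f} px, max {errs.max():.1f} px, "
                f"column means {np.round(errs.mean(axis=0), 1)}")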
    Citation

    Xiangjun Wang, Haoyue Bai, Yubo Ni. Two-Eye Gaze Tracking Based on Pupil Shape in Space[J]. Laser & Optoelectronics Progress, 2019, 56(2): 023301
    Paper Information

    Category: Vision, Color, and Visual Optics

    Received: Jun. 29, 2018

    Accepted: Aug. 8, 2018

    Published Online: Aug. 1, 2019

    Author Email: Haoyue Bai (shun344@qq.com)

    DOI: 10.3788/LOP56.023301
