Laser & Optoelectronics Progress, Volume 60, Issue 16, 1615010 (2023)

Betel Nut Pose Recognition and Localization System Based on Structured Light 3D Vision

Jinmiao Yu1,2 and Jingjing Wu1,2,*
Author Affiliations
  • 1School of Mechanical Engineering, Jiangnan University, Wuxi 214122, Jiangsu, China
  • 2Jiangsu Key Laboratory of Advanced Food Manufacturing Equipment and Technology, Wuxi 214122, Jiangsu, China
    Figures & Tables (19)
    Structured light 3D vision sensing system and coordinate system relationship
    Flow chart of betel nut pose recognition and localization
    The process of PSP 3D reconstruction
    Definition of attitude angles of betel nut and four types of betel nut. (a) Definition of attitude angles; (b) normal; (c) over-rolling; (d) upturned; (e) brine-stained
    Flow chart of attitude recognition and positioning algorithm
    Process of pose estimation
    Demonstration of region extraction for the betel nut brine zone
    Cross section of betel nut point cloud
    Example of the 3D reconstruction system and robot calibration
    Confusion matrices
    Experimental platform
    Three groups of pose estimation experiments
    Three typical postures in the betel nut handling process. (a) Over-rolling; (b) upturned; (c) brine-stained
    Confusion matrix obtained by the proposed pose estimation algorithm
    Experimental process. (a) 3D reconstruction process; (b) acquisition process of actual coordinates of feeding points; (c) betel nuts to be fed
    Analysis of localization experiment results. (a) Locations of feeding points; (b) errors in the X, Y, and Z coordinates
    • Table 1. Detection results obtained by the proposed 3D pose estimation algorithm

      Index | Width /mm | Coordinate of the center | Height of brine /mm | Pitch /(°) | Yaw /(°) | Roll /(°)
      1     | 6.32      | (12.87, 96.32, 13.68)    | 13.68               | -22.6      | 2.7      | 16.4
      2     | 6.87      | (39.03, 98.17, 14.18)    | 14.18               | -12.7      | 0        | 1.8
      3     | 7.29      | (66.97, 97.83, 13.17)    | 13.17               | 2.7        | 1.2      | 7.7
      4     | 8.78      | (92.73, 98.79, 14.82)    | 14.82               | -20.1      | 0        | 0
      5     | 7.63      | (117.29, 96.03, 13.17)   | 12.67               | -18.6      | 0        | 47.8
      6     | 6.48      | (13.19, 33.14, 13.74)    | 13.74               | 10.2       | -2.7     | 11.9
      7     | 10.17     | (38.79, 36.87, 13.19)    | 13.19               | 27.6       | 3.6      | 14.7
      8     | 7.83      | (67.23, 34.76, 14.10)    | 14.10               | -31.8      | 2.9      | 0
      9     | 8.69      | (92.19, 38.78, 14.36)    | 14.36               | 52.5       | -1.8     | 0
      10    | 7.36      | (116.89, 36.43, 13.67)   | 13.67               | 48.7       | 0        | -11.2
    • Table 2. Pose estimation accuracy and work efficiency in 10 groups

      Index | Accuracy (normal) | Accuracy (over-rolling) | Accuracy (upturned) | Accuracy (stain) | Accuracy (overall) | Time /s | Efficiency /(number/min)
      1     | 0.91 | 0.85 | 1.00 | 0.93 | 0.92 | 0.55 | 103
      2     | 0.88 | 0.93 | 1.00 | 0.97 | 0.96 | 0.59 | 95
      3     | 0.97 | 0.96 | 1.00 | 0.99 | 0.98 | 0.61 | 89
      4     | 0.93 | 0.94 | 1.00 | 0.98 | 0.95 | 0.39 | 136
      5     | 0.92 | 0.88 | 1.00 | 0.95 | 0.90 | 0.57 | 102
      6     | 0.90 | 0.91 | 1.00 | 0.95 | 0.91 | 0.55 | 108
      7     | 1.00 | 0.89 | 1.00 | 0.97 | 0.94 | 0.41 | 121
      8     | 0.98 | 0.89 | 1.00 | 0.98 | 0.96 | 0.58 | 96
      9     | 0.95 | 0.96 | 1.00 | 0.95 | 0.98 | 0.60 | 87
      10    | 0.94 | 0.93 | 1.00 | 0.96 | 0.95 | 0.56 | 108
    • Table 3. Experimental results of the location accuracy evaluation

      Index | Coordinates calculated by the proposed algorithm (XT, YT, ZT) | Actual coordinates from pneumatic nozzle (XA, YA, ZA) | Error X | Error Y | Error Z
      1     | (-86.947, 676.534, -196.345) | (-87.13, 676.32, -196.32) | 0.183  | 0.214  | -0.025
      2     | (-60.735, 678.341, -195.694) | (-60.97, 678.17, -195.82) | 0.235  | 0.171  | 0.126
      3     | (-33.182, 677.875, -196.737) | (-33.03, 677.83, -196.83) | -0.152 | 0.045  | 0.093
      4     | (-7.496, 678.686, -195.323)  | (-7.27, 678.79, -195.18)  | -0.226 | -0.104 | -0.143
      5     | (17.418, 676.243, -196.764)  | (17.29, 676.03, -196.83)  | 0.128  | 0.213  | 0.066
      6     | (-86.626, 613.071, -196.163) | (-86.81, 613.14, -196.26) | 0.184  | -0.069 | 0.097
      7     | (-61.267, 616.75, -196.982)  | (-61.21, 616.87, -196.81) | -0.057 | -0.120 | -0.172
      8     | (-32.863, 615.005, -195.769) | (-32.77, 614.76, -195.9)  | -0.093 | 0.245  | 0.131
      9     | (-7.746, 618.639, -195.835)  | (-7.81, 618.78, -195.64)  | 0.064  | -0.141 | -0.195
      10    | (17.083, 616.357, -196.359)  | (16.89, 616.43, -196.33)  | 0.193  | -0.073 | -0.029
      Standard deviation of errors |  |  | 0.157 | 0.149 | 0.120

    Citation: Jinmiao Yu, Jingjing Wu. Betel Nut Pose Recognition and Localization System Based on Structured Light 3D Vision[J]. Laser & Optoelectronics Progress, 2023, 60(16): 1615010
    Paper Information

    Category: Machine Vision

    Received: Sep. 30, 2022

    Accepted: Nov. 24, 2022

    Published Online: Aug. 18, 2023

    Corresponding author: Jingjing Wu (wjjlady720@jiangnan.edu.cn)

    DOI: 10.3788/LOP222667
