Chinese Journal of Lasers, Vol. 51, Issue 15, 1507301 (2024)

Applied Research of fNIRS‐BCI for Motion Decision Recognition

Zhuanping Qin1,2, Xinlin Liu1,2, Guangda Lu1,2, Wei Zhang3, Dongyuan Liu4,*, and Feng Gao4
Author Affiliations
  • 1School of Automation and Electrical Engineering, Tianjin University of Technology and Education, Tianjin 300222, China
  • 2Tianjin Key Laboratory of Information Sensing & Intelligent Control, Tianjin 300222, China
  • 3School of Transportation Science and Engineering, Civil Aviation University of China, Tianjin University, Tianjin 300300, China
  • 4College of Precision Instruments and Optoelectronics Engineering, Tianjin University, Tianjin 300072, China
    Figures & Tables (10)
    • Wearable fNIRS system. (a) Diagram of experimental scene; (b) diagram of channel layout
    • Experimental paradigm
    • Process diagram of data preprocessing
    • Distribution of distinctive features in motion execution
    • Schematic diagram of optimized feature selection. (a) Curves of the number of features selected; (b) results of feature selection
    • Analysis of model performance. (a) Interactive online brain-computer interface; (b) ROC; (c) accuracy rate test of BCI system
    • Distribution of significant features of LHG and RHG in different channels. (a) Channel 4; (b)(c) channel 6
    • Table 1. Performance of the classifier before and after feature selection in binary motion-classification optimization. A: accuracy; P: precision; R: recall; F1: F1-score; values are given as fractions. The metrics are illustrated in the code sketch after the table list.

      Task                 | Statistical preference features | All time-domain features
                           |  A     P     R     F1           |  A     P     R     F1
      R-LHG (subject 1)    | 0.89  0.91  0.87  0.89          | 0.88  0.88  0.87  0.87
      R-RHG (subject 1)    | 0.87  0.94  0.79  0.86          | 0.87  0.94  0.79  0.86
      R-LFT (subject 1)    | 0.93  0.98  0.88  0.93          | 0.90  0.93  0.86  0.89
      R-RFT (subject 1)    | 0.97  0.98  0.96  0.97          | 0.88  0.93  0.81  0.87
      HG-FT (subject 1)    | 0.83  0.81  0.86  0.83          | 0.82  0.81  0.82  0.81
      LHG-RHG (subject 1)  | 0.63  0.66  0.65  0.65          | 0.54  0.75  0.12  0.21
      LFT-RFT (subject 1)  | 0.55  0.55  0.55  0.55          | 0.49  0.49  0.47  0.48
      R-LHG (subject 2)    | 0.91  0.91  0.90  0.90          | 0.83  0.87  0.76  0.81
      R-RHG (subject 2)    | 0.86  0.85  0.88  0.86          | 0.82  0.81  0.82  0.81
      R-LFT (subject 2)    | 0.84  0.82  0.87  0.84          | 0.73  0.73  0.74  0.73
      R-RFT (subject 2)    | 0.86  0.86  0.85  0.85          | 0.76  0.76  0.76  0.76
      HG-FT (subject 2)    | 0.96  0.96  0.96  0.96          | 0.92  0.93  0.90  0.91
      LHG-RHG (subject 2)  | 0.64  0.64  0.64  0.64          | 0.59  0.59  0.60  0.59
      LFT-RFT (subject 2)  | 0.60  0.58  0.62  0.60          | 0.54  0.55  0.51  0.53
      R-LHG (subject 3)    | 0.86  0.91  0.79  0.85          | 0.82  0.86  0.75  0.80
      R-RHG (subject 3)    | 0.86  0.87  0.81  0.84          | 0.86  0.87  0.83  0.85
      R-LFT (subject 3)    | 0.93  0.91  0.95  0.93          | 0.88  0.90  0.84  0.87
      R-RFT (subject 3)    | 0.93  0.96  0.89  0.92          | 0.86  0.86  0.86  0.86
      HG-FT (subject 3)    | 0.95  0.94  0.96  0.95          | 0.90  0.93  0.85  0.89
      LHG-RHG (subject 3)  | 0.63  0.65  0.56  0.60          | 0.56  0.57  0.52  0.54
      LFT-RFT (subject 3)  | 0.64  0.75  0.57  0.65          | 0.62  0.61  0.63  0.62
      R-LHG (all)          | 0.71  0.72  0.70  0.71          | 0.69  0.70  0.69  0.698
      R-RHG (all)          | 0.74  0.74  0.73  0.73          | 0.71  0.73  0.66  0.69
      R-LFT (all)          | 0.79  0.81  0.76  0.78          | 0.70  0.70  0.70  0.70
      R-RFT (all)          | 0.76  0.80  0.70  0.75          | 0.74  0.74  0.74  0.74
      HG-FT (all)          | 0.73  0.73  0.74  0.73          | 0.79  0.79  0.79  0.79
      LHG-RHG (all)        | 0.55  0.55  0.56  0.55          | 0.51  0.51  0.71  0.59
      LFT-RFT (all)        | 0.55  0.55  0.52  0.53          | 0.56  0.55  0.57  0.56
    • Table 2. Performance of the classifier before and after feature selection in three-class motion-classification (R-HG-FT) optimization. A is reported once per subject, while P, R, and F1 are reported per class (rest, task1, task2).

      Subject      Class  | Statistical preference features | All time-domain features
                          |  A     P     R     F1           |  A     P     R     F1
      Subject 1    rest   | 0.78  0.80  0.85  0.82          | 0.75  0.84  0.79  0.81
                   task1  |       0.87  0.76  0.81          |       0.80  0.74  0.77
                   task2  |       0.68  0.73  0.70          |       0.65  0.73  0.69
      Subject 2    rest   | 0.77  0.80  0.87  0.83          | 0.73  0.82  0.84  0.83
                   task1  |       0.84  0.77  0.80          |       0.74  0.67  0.70
                   task2  |       0.69  0.68  0.68          |       0.64  0.69  0.66
      Subject 3    rest   | 0.88  0.93  0.85  0.89          | 0.80  0.75  0.79  0.77
                   task1  |       0.91  0.89  0.90          |       0.89  0.81  0.85
                   task2  |       0.82  0.91  0.86          |       0.77  0.79  0.78
    • Table 3. Improvement in model accuracy achieved by the T-test method and the statistical optimization method. A minimal illustration of T-test-based feature screening follows the table list.

      Method                    | Subject 1       | Subject 2       | Subject 3
      T-test                    | 0.0255±0.0025   | 0.0193±0.0019   | -0.0384±0.0032
      Statistical optimization  | 0.0669±0.0020   | 0.0730±0.0097   | 0.0483±0.0089
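
    Tables 1 and 2 report accuracy (A), precision (P), recall (R), and F1-score. The following is a minimal, self-contained Python sketch, not the authors' code, showing how such metrics are obtained from true and predicted labels; the label arrays below are hypothetical placeholders, and the per-class loop mirrors the per-class P/R/F1 reporting of the three-class case.

    ```python
    # Minimal sketch (not the authors' code): computing the A/P/R/F1 values of the
    # kind reported in Tables 1 and 2 from true vs. predicted labels.
    import numpy as np

    def binary_metrics(y_true, y_pred, positive=1):
        """Overall accuracy plus precision, recall, and F1 for one 'positive' class."""
        y_true = np.asarray(y_true)
        y_pred = np.asarray(y_pred)
        tp = np.sum((y_pred == positive) & (y_true == positive))
        fp = np.sum((y_pred == positive) & (y_true != positive))
        fn = np.sum((y_pred != positive) & (y_true == positive))
        accuracy = np.mean(y_pred == y_true)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        return accuracy, precision, recall, f1

    # Binary case (e.g., one R-LHG row of Table 1): hypothetical rest(0)/task(1) labels.
    y_true = [0, 0, 1, 1, 1, 0, 1, 0]
    y_pred = [0, 1, 1, 1, 0, 0, 1, 0]
    print(binary_metrics(y_true, y_pred))        # A, P, R, F1 for the task class

    # Three-class case (R-HG-FT in Table 2): A is computed once per subject,
    # while P/R/F1 are reported separately for each class (rest, task1, task2).
    y_true3 = [0, 1, 2, 0, 1, 2, 0, 1, 2]        # hypothetical class labels
    y_pred3 = [0, 1, 2, 0, 2, 2, 0, 1, 1]
    acc, _, _, _ = binary_metrics(y_true3, y_pred3)
    print("A:", round(float(acc), 2))
    for cls, name in enumerate(["rest", "task1", "task2"]):
        _, p, r, f1 = binary_metrics(y_true3, y_pred3, positive=cls)
        print(name, "P:", round(float(p), 2), "R:", round(float(r), 2), "F1:", round(float(f1), 2))
    ```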
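
    Table 3 compares how much the T-test method and the statistical optimization method improve model accuracy through feature selection. The sketch below only illustrates the general idea of T-test-based feature screening followed by an accuracy comparison; it is not the authors' pipeline, and the synthetic data, the p < 0.05 threshold, and the linear-SVM classifier are assumptions made for illustration.

    ```python
    # Minimal sketch (not the authors' implementation): screening time-domain fNIRS
    # features with a two-sample t-test and comparing classifier accuracy before
    # and after selection. All data and settings here are hypothetical.
    import numpy as np
    from scipy import stats
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Hypothetical data: 60 trials x 40 features (e.g., mean/slope/peak of HbO per channel).
    X = rng.normal(size=(60, 40))
    y = rng.integers(0, 2, size=60)              # hypothetical rest(0)/task(1) labels
    X[y == 1, :5] += 1.0                         # make a few features task-related

    # Two-sample t-test per feature; keep features passing the assumed threshold.
    _, p_values = stats.ttest_ind(X[y == 0], X[y == 1], axis=0)
    selected = p_values < 0.05

    clf = SVC(kernel="linear")
    acc_all = cross_val_score(clf, X, y, cv=5).mean()
    acc_sel = cross_val_score(clf, X[:, selected], y, cv=5).mean()
    print(f"selected {selected.sum()} of {X.shape[1]} features")
    # Accuracy gain from feature selection (analogous to the improvements compared in Table 3).
    print(f"accuracy improvement: {acc_sel - acc_all:+.4f}")
    ```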
    Citation
    Zhuanping Qin, Xinlin Liu, Guangda Lu, Wei Zhang, Dongyuan Liu, Feng Gao. Applied Research of fNIRS‐BCI for Motion Decision Recognition[J]. Chinese Journal of Lasers, 2024, 51(15): 1507301

    Paper Information

    Category: Neurophotonics and Optical Regulation

    Received: Mar. 4, 2024

    Accepted: Apr. 23, 2024

    Published Online: Jul. 24, 2024

    Corresponding author: Dongyuan Liu (liudongyuan@tju.edu.cn)

    DOI: 10.3788/CJL240649

    CSTR: 32183.14.CJL240649
